# User Prompts

## Saving prompts to Trubrics
Analysing user prompts is essential to building AI models that align with your users. Upon creating an account with Trubrics, you can start logging prompts & model generations to the default project. Create different projects to organise your prompts (we recommend one project per use case).
Install Trubrics with:
Set your Trubrics email and password as environment variables:
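For example, in a POSIX shell (the values here are placeholders for your own credentials; the variable names match those read by the Python snippet below):

```shell
# Placeholders: substitute your own Trubrics account credentials.
export TRUBRICS_EMAIL="you@example.com"
export TRUBRICS_PASSWORD="your_password"
```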
Then push some user prompts to the default project:
```python
import os

from trubrics import Trubrics

trubrics = Trubrics(
    project="default",
    email=os.environ["TRUBRICS_EMAIL"],
    password=os.environ["TRUBRICS_PASSWORD"],
)

prompt = trubrics.log_prompt(
    config_model={
        "model": "gpt-3.5-turbo",
        "prompt_template": "Tell me a joke about {animal}",
        "temperature": 0.7,
    },
    prompt="Tell me a joke about sharks",
    generation="Why did the shark cross the ocean? To get to the other side.",
)
```
### `trubrics.log_prompt()` arguments

Log user prompts to Trubrics.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `config_model` | `dict` | model configuration with fields "model", "prompt_template", "temperature" | *required* |
| `prompt` | `str` | user prompt to the model | *required* |
| `generation` | `str` | model generation | *required* |
| `user_id` | `Optional[str]` | user id | `None` |
| `session_id` | `Optional[str]` | session id, for example for a chatbot conversation | `None` |
| `tags` | `list` | feedback tags | `[]` |
| `metadata` | `dict` | any feedback metadata | `{}` |
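To illustrate how the optional parameters and their defaults fit together, here is a minimal stand-in sketch — not the real Trubrics client, just a local function with the same signature as the table above; the `user-123`, `chat-42` values are illustrative:

```python
from typing import Optional


# Stand-in for trubrics.log_prompt(), mirroring the documented
# signature and defaults. It simply builds the payload dict that
# a client would send to Trubrics.
def log_prompt(
    config_model: dict,
    prompt: str,
    generation: str,
    user_id: Optional[str] = None,
    session_id: Optional[str] = None,
    tags: Optional[list] = None,
    metadata: Optional[dict] = None,
) -> dict:
    # `None` sentinels avoid mutable default arguments; they are
    # replaced by the documented defaults [] and {} here.
    return {
        "config_model": config_model,
        "prompt": prompt,
        "generation": generation,
        "user_id": user_id,
        "session_id": session_id,
        "tags": tags or [],
        "metadata": metadata or {},
    }


payload = log_prompt(
    config_model={
        "model": "gpt-3.5-turbo",
        "prompt_template": "Tell me a joke about {animal}",
        "temperature": 0.7,
    },
    prompt="Tell me a joke about sharks",
    generation="Why did the shark cross the ocean? To get to the other side.",
    session_id="chat-42",  # group prompts from one chatbot conversation
)
```

Note that omitted optional arguments fall back to the defaults in the table: `payload["tags"]` is `[]` and `payload["metadata"]` is `{}`.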
Source code in `trubrics/platform/__init__.py`
## Saving prompts from Streamlit apps

The `FeedbackCollector` Streamlit integration inherits from the `Trubrics` object, meaning that you can log prompts in the same way directly from the `FeedbackCollector`. For more information, see the Streamlit integration docs.
## Analyse prompts in Trubrics

Various filters allow AI teams to explore user prompts in Trubrics and export them to CSV.
## [Beta] Ask an LLM about your prompts
Try asking an LLM a question about the user prompts in your dataset. For example, "What prompts ask for shark jokes?". This feature is in beta, so please give us feedback on the model generations!
Created: November 15, 2023