
User Prompts

Saving prompts to Trubrics

Analysing user prompts is essential to building AI models that align with your users. Upon creating an account with Trubrics, you can start logging prompts and model generations to the default project. Create separate projects to organise your prompts (we recommend one project per use case).

Install Trubrics with:

pip install trubrics

Set Trubrics email and password as environment variables:

export TRUBRICS_EMAIL="trubrics_email"
export TRUBRICS_PASSWORD="trubrics_password"

and push some user prompts to the default project:

import os
from trubrics import Trubrics

trubrics = Trubrics(
    project="default",
    email=os.environ["TRUBRICS_EMAIL"],
    password=os.environ["TRUBRICS_PASSWORD"],
)

prompt = trubrics.log_prompt(
    config_model={
        "model": "gpt-3.5-turbo",
        "prompt_template": "Tell me a joke about {animal}",
        "temperature": 0.7,
    },
    prompt="Tell me a joke about sharks",
    generation="Why did the shark cross the ocean? To get to the other side.",
)
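Note that log_prompt expects the already-rendered prompt string; the prompt_template field in config_model simply records which template produced it. The correspondence is plain string substitution, sketched here with Python's str.format (the {animal} variable comes from the example above):

```python
# The template recorded in config_model and the prompt sent to the model
# are related by plain string substitution:
template = "Tell me a joke about {animal}"
rendered_prompt = template.format(animal="sharks")
print(rendered_prompt)  # Tell me a joke about sharks
```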

trubrics.log_prompt() arguments

Log user prompts to Trubrics.

Parameters:

    config_model (dict, required): model configuration with fields "model", "prompt_template" and "temperature"
    prompt (str, required): user prompt to the model
    generation (str, required): model generation
    user_id (Optional[str], default None): user id
    session_id (Optional[str], default None): session id, for example for a chatbot conversation
    tags (list, default []): feedback tags
    metadata (dict, default {}): any feedback metadata
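To see how the optional arguments attach to a logged prompt, here is a stub that mirrors log_prompt's signature without calling Trubrics (all values below are hypothetical, chosen for illustration):

```python
from typing import Optional


def log_prompt_stub(
    config_model: dict,
    prompt: str,
    generation: str,
    user_id: Optional[str] = None,
    session_id: Optional[str] = None,
    tags: list = [],      # mirrors the library signature
    metadata: dict = {},  # mirrors the library signature
) -> dict:
    # Echo the record that would be sent to Trubrics.
    return {
        "config_model": config_model,
        "prompt": prompt,
        "generation": generation,
        "user_id": user_id,
        "session_id": session_id,
        "tags": tags,
        "metadata": metadata,
    }


record = log_prompt_stub(
    config_model={
        "model": "gpt-3.5-turbo",
        "prompt_template": "Tell me a joke about {animal}",
        "temperature": 0.7,
    },
    prompt="Tell me a joke about sharks",
    generation="Why did the shark cross the ocean? To get to the other side.",
    user_id="user_123",            # hypothetical user id
    session_id="chat_session_42",  # hypothetical chatbot conversation id
    tags=["jokes", "beta"],
    metadata={"app_version": "1.0.2"},
)
```

With the real client, the same keyword arguments go straight to trubrics.log_prompt.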
Source code in trubrics/platform/__init__.py
def log_prompt(
    self,
    config_model: dict,
    prompt: str,
    generation: str,
    user_id: Optional[str] = None,
    session_id: Optional[str] = None,
    tags: list = [],
    metadata: dict = {},
) -> Optional[Prompt]:
    """
    Log user prompts to Trubrics.

    Parameters:
        config_model: model configuration with fields "model", "prompt_template", "temperature"
        prompt: user prompt to the model
        generation: model generation
        user_id: user id
        session_id: session id, for example for a chatbot conversation
        tags: feedback tags
        metadata: any feedback metadata
    """
    config_model = ModelConfig(**config_model)
    prompt = Prompt(
        config_model=config_model,
        prompt=prompt,
        generation=generation,
        user_id=user_id,
        session_id=session_id,
        tags=tags,
        metadata=metadata,
    )
    auth = get_trubrics_auth_token(
        self.config.firebase_api_key,
        self.config.email,
        self.config.password.get_secret_value(),
        rerun=expire_after_n_seconds(),
    )
    res = save_document_to_collection(
        auth,
        firestore_api_url=self.config.firestore_api_url,
        project=self.config.project,
        collection="prompts",
        document=prompt,
    )
    if "error" in res:
        logger.error(res["error"])
        return None
    else:
        logger.info("User prompt saved to Trubrics.")
        prompt.id = res["name"].split("/")[-1]
        return prompt
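As the source shows, a successful save returns the Firestore document path under "name", and log_prompt keeps its final segment as the prompt id; on error it logs and returns None, so callers should check for that. The path below is illustrative:

```python
# Illustrative Firestore-style response: the prompt id is the last path segment.
res = {"name": "projects/demo/databases/(default)/documents/prompts/abc123"}
prompt_id = res["name"].split("/")[-1]
print(prompt_id)  # abc123
```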

Saving prompts from Streamlit apps

The FeedbackCollector Streamlit integration inherits from the Trubrics object, meaning that you can log prompts in the same way directly from the FeedbackCollector. For more information on this, see the Streamlit integration docs.
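A minimal sketch of that inheritance in practice, assuming credentials are stored in Streamlit secrets under the same names as the environment variables above (constructor arguments may differ between versions — the Streamlit integration docs are authoritative):

```python
import streamlit as st
from trubrics.integrations.streamlit import FeedbackCollector

collector = FeedbackCollector(
    project="default",
    email=st.secrets["TRUBRICS_EMAIL"],       # assumes secrets mirror the env vars
    password=st.secrets["TRUBRICS_PASSWORD"],
)

# FeedbackCollector inherits from Trubrics, so prompts are logged the same way:
prompt = collector.log_prompt(
    config_model={
        "model": "gpt-3.5-turbo",
        "prompt_template": "Tell me a joke about {animal}",
        "temperature": 0.7,
    },
    prompt="Tell me a joke about sharks",
    generation="Why did the shark cross the ocean? To get to the other side.",
)
```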

Analyse prompts in Trubrics

Various filters allow AI teams to explore user prompts in Trubrics and export them to CSV.

[Beta] Ask an LLM about your prompts

Try asking an LLM a question about the user prompts in your dataset. For example, "What prompts ask for shark jokes?". This feature is in beta, so please give us feedback on the model generations!


Last update: November 15, 2023
Created: November 15, 2023