Prompt management
Opik provides a prompt library that you can use to manage your prompts. Storing prompts in a library allows you to version them, reuse them across projects, and manage them in a central location.
Using the prompt library does not mean you can't store your prompts in code; the library is designed to work seamlessly with your existing prompt files while providing the benefits of central management.
Opik supports two types of prompts:
- Text Prompts: Simple string-based prompts with variable substitution
- Chat Prompts: Structured message-based prompts in OpenAI format for conversational AI applications, supporting multimodal content (text, images, videos)
Text Prompts
Text prompts are simple string-based templates that support variable substitution. They are ideal for single-turn interactions or when you need to generate a single piece of text.
Managing text prompts stored in code
The recommended way to create and manage text prompts is using the Prompt object. This allows you to continue versioning your prompts in code while also getting the benefit of having prompt versions managed in the Opik platform, so you can more easily keep track of your progress.
Prompts stored in code
Prompts stored in a file
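A minimal sketch of registering a code-stored prompt with the Prompt class. The prompt name and template below are illustrative; the SDK call is only run when you invoke the function against a configured Opik workspace.

```python
# Illustrative template using Mustache-style {{variable}} substitution.
PROMPT_TEMPLATE = "Summarize the following document in two sentences:\n{{document}}"

def register_prompt():
    import opik  # requires `pip install opik` and a configured workspace

    # Creates the prompt in the library on first run; on later runs with
    # identical text it returns the existing prompt instead of a duplicate.
    prompt = opik.Prompt(
        name="document-summarizer",  # illustrative name
        prompt=PROMPT_TEMPLATE,
    )
    return prompt
```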
The prompt will now be stored in the library and versioned:

The Prompt object will create a new prompt in the library if this prompt doesn't already exist; otherwise it will return the existing prompt. This means you can safely run the above code multiple times without creating duplicate prompts.
Using the low level SDK for text prompts
If you would rather keep text prompts in the Opik platform and manually update / download them, you can use the low-level Python SDK to manage your prompts.
When to use client methods vs. classes:
- Use the Prompt() class (recommended): For most use cases, this class automatically uses the global Opik configuration set by opik.configure().
- Use client.create_prompt(): When you need to use a specific client configuration that differs from the global configuration (e.g., a different workspace, host, or API key).
Creating text prompts
You can create a new prompt in the library using both the SDK and the UI:
Using the Python SDK
Using the UI
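A sketch of the low-level client approach, assuming the Opik() client and create_prompt method described above; the prompt name and template are illustrative.

```python
PROMPT_NAME = "document-summarizer"  # illustrative name

def create_prompt_with_client():
    import opik

    # An explicit client lets you target a specific workspace, host, or
    # API key rather than relying on the global configuration.
    client = opik.Opik()
    prompt = client.create_prompt(
        name=PROMPT_NAME,
        prompt="Summarize the following document:\n{{document}}",
    )
    return prompt
```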
Adding prompts to traces and spans
You can associate prompts with your traces and spans using the opik_context module. This is useful when you want to track which prompts were used during the execution of your functions:
Adding prompts to traces
Adding prompts to spans
Combined usage
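A sketch of attaching a prompt to the current trace from inside a tracked function. It assumes opik_context.update_current_trace accepts a prompts argument as described above; call_llm is a hypothetical helper standing in for your model call.

```python
PROMPT_TEMPLATE = "Summarize the following document:\n{{document}}"

def summarize_with_tracking(document: str) -> str:
    import opik
    from opik import opik_context

    prompt = opik.Prompt(name="document-summarizer", prompt=PROMPT_TEMPLATE)

    @opik.track
    def summarize(doc: str) -> str:
        # Attach the prompt to the current trace; use
        # opik_context.update_current_span to attach it to the span instead.
        opik_context.update_current_trace(prompts=[prompt])
        rendered = prompt.format(document=doc)
        return call_llm(rendered)  # call_llm is a hypothetical LLM helper

    return summarize(document)
```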
You can view the prompts associated with a trace or span in the Opik UI:

Further details on using prompts from the Prompt library are provided in the following sections.
Using prompts in supported integrations
Prompts can be used in all supported third-party integrations by attaching them to traces and spans through the opik_context module.
For instance, you can use prompts with the Google ADK integration by attaching them to the traces produced by that integration.
Downloading your text prompts
Once a prompt is created in the library, you can download it in code using the Opik.get_prompt method:
If you are not using the SDK, you can download a prompt by using the REST API.
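A sketch of downloading a prompt with the get_prompt method; the prompt name and the formatted value are illustrative.

```python
def load_prompt():
    import opik

    client = opik.Opik()
    prompt = client.get_prompt(name="document-summarizer")
    # prompt.prompt holds the raw template; format() substitutes variables.
    return prompt.format(document="Opik stores prompt versions centrally.")
```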
Searching prompts
To discover prompts by name substring and/or filters, use search_prompts. Filters are written in Opik Query Language (OQL):
You can filter by template_structure to search for only text prompts ("text") or only chat prompts ("chat"). Without the filter, search_prompts returns both types.
The filter_string parameter uses Opik Query Language (OQL) with the format:
"<COLUMN> <OPERATOR> <VALUE> [AND <COLUMN> <OPERATOR> <VALUE>]*"
Supported columns for prompts:
Examples:
- tags contains "production" - Filter by tag
- name contains "summary" - Filter by name substring
- created_by = "user@example.com" - Filter by creator
- tags contains "alpha" AND tags contains "beta" - Multiple tag filtering
- template_structure = "text" - Filter for only text prompts
- template_structure = "chat" - Filter for only chat prompts
search_prompts returns the latest version for each matching prompt.
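A sketch of searching with an OQL filter_string; the filter combines two of the example clauses above.

```python
# OQL filter: text prompts whose name contains "summary".
FILTER = 'name contains "summary" AND template_structure = "text"'

def find_prompts():
    import opik

    client = opik.Opik()
    # Returns the latest version of each matching prompt.
    return client.search_prompts(filter_string=FILTER)
```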
Chat Prompts
Chat prompts are structured message-based templates designed for conversational AI applications. They support multiple message roles (system, user, assistant) and multimodal content including text, images, and videos.
Key Features
- Structured Messages: Organize prompts as a list of messages with roles (system, user, assistant)
- Multimodal Support: Include images, videos, and text in the same prompt
- Variable Substitution: Use Mustache ({{variable}}) or Jinja2 syntax
- Version Control: Automatic versioning when messages change
- Template Validation: Optional validation of template placeholders
Managing chat prompts stored in code
Similar to text prompts, you can create and manage chat prompts using the ChatPrompt class. This allows you to version your chat prompts in code while benefiting from centralized management in the Opik platform.
Simple chat prompt
With OpenAI
Multi-turn conversation
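A sketch of a multi-turn chat prompt, assuming ChatPrompt takes a name and an OpenAI-format message list as described above; the name and messages are illustrative.

```python
# Messages follow the OpenAI chat format, with {{variable}} placeholders.
MESSAGES = [
    {"role": "system", "content": "You are a helpful travel assistant."},
    {"role": "user", "content": "I want to visit {{city}}."},
    {"role": "assistant", "content": "Great choice! What are you interested in?"},
    {"role": "user", "content": "{{interest}}"},
]

def register_chat_prompt():
    import opik

    return opik.ChatPrompt(
        name="travel-assistant",  # illustrative name
        messages=MESSAGES,
    )
```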
The chat prompt will now be stored in the library and versioned, just like text prompts.
The ChatPrompt class will create a new chat prompt in the library if it doesn't already exist; otherwise it will return the existing prompt. This means you can safely run the code multiple times without creating duplicate prompts.
Once created, you can view and manage your chat prompts in the Opik UI:

Multimodal chat prompts
Chat prompts support multimodal content, allowing you to include images and videos alongside text. This is useful for vision-enabled models.
Image analysis
Video analysis
Mixed content
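A sketch of a multimodal chat prompt mixing text and image content parts in the OpenAI format; the name, messages, and placeholder variables are illustrative.

```python
# A user message whose content is a list of typed parts (OpenAI format):
# one text part and one image part, both with template placeholders.
MULTIMODAL_MESSAGES = [
    {"role": "system", "content": "You are a vision assistant."},
    {
        "role": "user",
        "content": [
            {"type": "text", "text": "{{question}}"},
            {"type": "image_url", "image_url": {"url": "{{image_url}}"}},
        ],
    },
]

def register_multimodal_prompt():
    import opik

    return opik.ChatPrompt(
        name="image-analyst",  # illustrative name
        messages=MULTIMODAL_MESSAGES,
    )
```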
When formatting multimodal prompts, you can specify supported_modalities to control how content is rendered:
- If a modality is supported (e.g., {"vision": True}), the structured content is preserved
- If a modality is not supported, it's replaced with text placeholders (e.g., <<<image>>><<</image>>>)
This allows you to use the same prompt template with different models that may or may not support certain modalities.
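A sketch of passing supported_modalities when formatting; the exact format() signature is an assumption based on the behavior described above, and the variable values are illustrative.

```python
def render_for_model(chat_prompt, vision_capable: bool):
    # When "vision" is supported, structured image parts are preserved;
    # otherwise they are replaced with text placeholders as described above.
    return chat_prompt.format(
        question="What is shown in this picture?",
        image_url="https://example.com/photo.png",
        supported_modalities={"vision": vision_capable},
    )
```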
Using the low level SDK for chat prompts
You can also use the low-level Python SDK to create and manage chat prompts directly.
When to use client methods vs. classes:
- Use the ChatPrompt() or Prompt() classes (recommended): For most use cases, these classes automatically use the global Opik configuration set by opik.configure().
- Use client.create_chat_prompt() or client.create_prompt(): When you need to use a specific client configuration that differs from the global configuration (e.g., a different workspace, host, or API key).
Creating chat prompts
Using the Python SDK
Using the UI
Use Opik.create_chat_prompt to create a chat prompt:
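A sketch of the low-level create_chat_prompt call, assuming it mirrors create_prompt but takes a message list; the name and messages are illustrative.

```python
def create_chat_prompt_with_client():
    import opik

    client = opik.Opik()  # explicit client configuration
    return client.create_chat_prompt(
        name="travel-assistant",  # illustrative name
        messages=[
            {"role": "system", "content": "You are a helpful travel assistant."},
            {"role": "user", "content": "Plan a day in {{city}}."},
        ],
    )
```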
Downloading chat prompts
Once a chat prompt is created in the library, you can download it in code using the Opik.get_chat_prompt method:
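A sketch of downloading a chat prompt with get_chat_prompt; the name and substituted value are illustrative.

```python
def load_chat_prompt():
    import opik

    client = opik.Opik()
    chat_prompt = client.get_chat_prompt(name="travel-assistant")
    # format() substitutes template variables into every message.
    return chat_prompt.format(city="Lisbon")
```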
Searching chat prompts
You can search for chat prompts specifically by using the template_structure filter with Opik.search_prompts:
To search for text prompts only, use template_structure = "text". Without the filter, search_prompts returns both text and chat prompts.
The filter_string parameter uses Opik Query Language (OQL) and supports the same columns and operators as text prompts (see Searching prompts above).
Template types for chat prompts
Chat prompts support two template types for variable substitution, specified using PromptType:
Mustache (default)
Jinja2
Jinja2 templates support advanced features like conditionals, loops, and filters, making them more powerful for complex prompt logic. However, Mustache templates are simpler and more portable.
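A sketch of selecting the Jinja2 template type via PromptType; the import path for PromptType and the type keyword are assumptions based on the description above, and the messages are illustrative.

```python
# Jinja2 allows conditionals and loops inside the template.
JINJA2_MESSAGES = [
    {
        "role": "system",
        "content": "{% if premium %}Give a detailed answer.{% else %}Be brief.{% endif %}",
    },
    {"role": "user", "content": "Question: {{ question }}"},
]

def create_jinja2_chat_prompt():
    import opik
    from opik import PromptType  # exact import path may vary by SDK version

    return opik.ChatPrompt(
        name="qa-assistant",  # illustrative name
        messages=JINJA2_MESSAGES,
        type=PromptType.JINJA2,  # the default is Mustache
    )
```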
Chat prompt versioning
Chat prompts are automatically versioned when the messages change. Each version has a unique commit hash.
You can use get_chat_prompt to retrieve a specific version by commit hash, and get_chat_prompt_history to get all versions.
If you create a chat prompt with identical messages to an existing version, Opik will return the existing version instead of creating a duplicate. This helps avoid unnecessary version proliferation.
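A sketch of walking the version history and pinning one version, assuming get_chat_prompt_history returns version objects exposing a commit hash as described above; the prompt name is illustrative.

```python
def pin_chat_prompt_version():
    import opik

    client = opik.Opik()
    # Every version of a chat prompt carries a unique commit hash.
    history = client.get_chat_prompt_history(name="travel-assistant")
    for version in history:
        print(version.commit)
    # Retrieve one specific version by its commit hash:
    return client.get_chat_prompt(name="travel-assistant", commit=history[0].commit)
```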
Prompt structure immutability
Once you create a prompt with a specific structure (text or chat), that structure cannot be changed. This ensures consistency and prevents accidental mixing of prompt types.
For example, if you create a text prompt, you cannot later create a chat prompt with the same name (and vice versa).
Both Prompt and ChatPrompt classes will raise a PromptTemplateStructureMismatch exception if you attempt to change the structure of an existing prompt.
Working with prompt versions
Viewing prompt history (all versions)
Text prompts
Chat prompts
Use get_prompt_history for text prompts:
This returns a list of Prompt objects (each representing a specific version) for the given prompt name.
You can use this information to:
- Audit changes to understand how prompts evolved
- Identify the best performing version by linking commit IDs to experiment results
- Document prompt changes for compliance or review purposes
- Retrieve specific versions by commit ID for testing or rollback
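A sketch of auditing a text prompt's history with get_prompt_history; the prompt name is illustrative.

```python
def audit_prompt_history():
    import opik

    client = opik.Opik()
    versions = client.get_prompt_history(name="document-summarizer")
    for version in versions:
        # Each entry is a Prompt object pinned to one version.
        print(version.commit, version.prompt)
    return versions
```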
Accessing specific prompt versions
Text prompts
Chat prompts
Use the commit parameter with get_prompt:
The commit parameter accepts the commit ID (also called commit hash) of the specific prompt version you want to retrieve. You can find commit IDs in the prompt history in the Opik UI or by using the get_prompt_history or get_chat_prompt_history methods (see above).
This is particularly useful when you want to:
- Pin to a specific version in production to ensure consistent behavior
- Test different versions side by side in experiments
- Roll back to a previous version if issues are discovered
- Compare results across different prompt versions
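A sketch of pinning to one version with the commit parameter; the commit hash below is hypothetical and would normally come from the prompt history.

```python
PINNED_COMMIT = "0a1b2c3d"  # hypothetical commit hash from the prompt history

def load_pinned_version():
    import opik

    client = opik.Opik()
    return client.get_prompt(name="document-summarizer", commit=PINNED_COMMIT)
```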
Using prompts in experiments
Linking prompts to experiments
Text prompt
Chat prompt
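A sketch of linking a prompt to an experiment via evaluate, assuming evaluate accepts a prompt argument for this purpose; the dataset name, metric choice, and call_llm helper are all illustrative.

```python
def run_linked_experiment():
    import opik
    from opik.evaluation import evaluate
    from opik.evaluation.metrics import Equals  # illustrative choice of metric

    client = opik.Opik()
    prompt = client.get_prompt(name="document-summarizer")
    dataset = client.get_dataset(name="summaries")  # hypothetical dataset

    def task(item: dict) -> dict:
        rendered = prompt.format(document=item["document"])
        return {"output": call_llm(rendered)}  # call_llm is a hypothetical LLM helper

    return evaluate(
        dataset=dataset,
        task=task,
        scoring_metrics=[Equals()],
        prompt=prompt,  # links the resulting experiment to this prompt version
    )
```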
The experiment will now be linked to the prompt, allowing you to view all experiments that use a specific prompt:

Comparing prompt versions in experiments
Text prompts
Chat prompts
This workflow allows you to systematically test and compare different prompt versions to identify the most effective one for your use case.