CLI Reference

opik

CLI tool for Opik.

opik [OPTIONS] COMMAND [ARGS]...

Options

-v, --version

Show the version and exit.

configure

Create a configuration file for the Opik Python SDK. If a configuration file already exists, it will be overwritten. This is also available as a function in the Python SDK.

opik configure [OPTIONS]

Options

--use_local, --use-local

Flag to configure the Opik Python SDK for local Opik deployments.

-y, --yes

Flag to automatically answer yes whenever user approval might be required.
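For example, to point the SDK at a local Opik deployment and skip interactive confirmation prompts (a minimal sketch using only the two flags documented above):

opik configure --use-local --yes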

export

Download data from a workspace or workspace/project to local files.

This command fetches traces, datasets, and prompts from the specified workspace or project and saves them to local JSON files in the output directory.

Note: Thread metadata is automatically derived from traces with the same thread_id, so threads don’t need to be exported separately.

WORKSPACE_OR_PROJECT: Either a workspace name (e.g., “my-workspace”) to export all projects, or workspace/project (e.g., “my-workspace/my-project”) to export a specific project.

opik export [OPTIONS] WORKSPACE_OR_PROJECT

Options

-p, --path <path>

Directory to save exported data. Defaults to current directory.

--max-results <max_results>

Maximum number of items to download per data type. Defaults to 1000.

--filter <filter>

Filter string to narrow down the search using Opik Query Language (OQL).

--all

Include all data types (traces, datasets, prompts).

--include <include>

Data types to include in download. Can be specified multiple times. Defaults to traces only.

Options:

traces | datasets | prompts

--exclude <exclude>

Data types to exclude from download. Can be specified multiple times.

Options:

traces | datasets | prompts

--name <name>

Filter items by name using Python regex patterns. Matches against trace names, dataset names, or prompt names.

--debug

Enable debug output to show detailed information about the export process.

--trace-format <trace_format>

Format for exporting traces. Defaults to json.

Options:

json | csv

Arguments

WORKSPACE_OR_PROJECT

Required argument
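For example, to export traces and datasets from a specific project into a hypothetical ./opik-export directory, limited to 500 items per data type (“my-workspace/my-project” is the placeholder used above):

opik export my-workspace/my-project --path ./opik-export --include traces --include datasets --max-results 500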

healthcheck

Performs a health check of the application, including validation of configuration, verification of library installations, and checking the availability of the backend workspace. Prints all relevant information to assist in debugging and diagnostics.

opik healthcheck [OPTIONS]

Options

--show-installed-packages

Print the list of installed packages to the console.
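For example, to run the health check and also print the list of installed packages:

opik healthcheck --show-installed-packages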

import

Upload data from local files to a workspace or workspace/project.

This command reads data from JSON files in the specified workspace folder and imports them to the specified workspace or project.

Note: Thread metadata is automatically calculated from traces with the same thread_id, so threads don’t need to be imported separately.

WORKSPACE_FOLDER: Directory containing JSON files to import.

WORKSPACE_NAME: Either a workspace name (e.g., “my-workspace”) to import to all projects, or workspace/project (e.g., “my-workspace/my-project”) to import to a specific project.

opik import [OPTIONS] WORKSPACE_FOLDER WORKSPACE_NAME

Options

--dry-run

Show what would be imported without actually importing.

--all

Include all data types (traces, datasets, prompts).

--include <include>

Data types to include in upload. Can be specified multiple times. Defaults to traces only.

Options:

traces | datasets | prompts

--exclude <exclude>

Data types to exclude from upload. Can be specified multiple times.

Options:

traces | datasets | prompts

--name <name>

Filter items by name using Python regex patterns. Matches against trace names, dataset names, or prompt names.

--debug

Enable debug output to show detailed information about the import process.

Arguments

WORKSPACE_FOLDER

Required argument

WORKSPACE_NAME

Required argument
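For example, to preview what a previously exported folder (a hypothetical ./opik-export directory) would upload into the “my-workspace/my-project” placeholder project, without writing anything:

opik import ./opik-export my-workspace/my-project --all --dry-run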

proxy

Start the Opik proxy server.

opik proxy [OPTIONS]

Options

--ollama

Run as a proxy server for Ollama

--ollama-host <ollama_host>

Ollama server URL when using --ollama

Default:

'http://localhost:11434'

--lm-studio

Run as a proxy server for LM Studio

--lm-studio-host <lm_studio_host>

LM Studio server URL when using --lm-studio

Default:

'http://localhost:1234'

--host <host>

Host to bind to

Default:

'localhost'

--port <port>

Port to bind to

Default:

7860
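For example, to proxy a local Ollama instance at its default URL and bind the proxy to an arbitrary port 8000 instead of the default 7860:

opik proxy --ollama --port 8000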