Novita AI

Novita AI is an AI cloud platform that helps developers easily deploy AI models through a simple API, backed by affordable and reliable GPU cloud infrastructure. It provides access to a wide range of models including DeepSeek, Qwen, Llama, and other popular LLMs.

This guide explains how to integrate Opik with Novita AI via LiteLLM. By using the LiteLLM integration provided by Opik, you can easily track and evaluate your Novita AI API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.

Getting Started

Configuring Opik

To get started, you need to configure Opik to send traces to your Comet project. You can do this by setting the OPIK_PROJECT_NAME and OPIK_WORKSPACE environment variables:

export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"

You can also call the opik.configure method:

import opik

opik.configure(
    project_name="your-project-name",
    workspace="your-workspace-name",
)

Configuring LiteLLM

Install the required packages:

pip install opik litellm

Create a LiteLLM configuration file (e.g., litellm_config.yaml):

model_list:
  - model_name: deepseek-r1-turbo
    litellm_params:
      model: novita/deepseek/deepseek-r1-turbo
      api_key: os.environ/NOVITA_API_KEY
  - model_name: qwen-32b-fp8
    litellm_params:
      model: novita/qwen/qwen3-32b-fp8
      api_key: os.environ/NOVITA_API_KEY
  - model_name: llama-70b-instruct
    litellm_params:
      model: novita/meta-llama/llama-3.1-70b-instruct
      api_key: os.environ/NOVITA_API_KEY

litellm_settings:
  callbacks: ["opik"]

Authentication

Set your Novita AI API key as an environment variable:

export NOVITA_API_KEY="your-novita-api-key"

You can obtain a Novita AI API key from the Novita AI dashboard.

Usage

Using LiteLLM Proxy Server

Start the LiteLLM proxy server:

litellm --config litellm_config.yaml

Use the proxy server to make requests:

import openai

client = openai.OpenAI(
    api_key="anything",  # the proxy handles authentication, so this can be anything
    base_url="http://0.0.0.0:4000"
)

response = client.chat.completions.create(
    model="deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "What are the advantages of using cloud-based AI platforms?"}
    ]
)

print(response.choices[0].message.content)

Direct Integration

You can also use LiteLLM directly in your Python code:

import os

import litellm
import opik
from litellm import completion
from litellm.integrations.opik.opik import OpikLogger

# Configure Opik
opik.configure()

# Register the Opik logger as a LiteLLM callback
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

os.environ["NOVITA_API_KEY"] = "your-novita-api-key"

response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "How can cloud AI platforms improve development efficiency?"}
    ]
)

print(response.choices[0].message.content)
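If your application is asynchronous, LiteLLM's acompletion works with the same callback wiring. A minimal sketch, assuming the same model and Opik setup as above:

import asyncio

import litellm
from litellm import acompletion
from litellm.integrations.opik.opik import OpikLogger

# Same Opik callback wiring as in the synchronous example
litellm.callbacks = [OpikLogger()]

async def main():
    # acompletion is the async counterpart of completion
    response = await acompletion(
        model="novita/deepseek/deepseek-r1-turbo",
        messages=[{"role": "user", "content": "Summarize the benefits of async I/O in two sentences."}],
    )
    print(response.choices[0].message.content)

asyncio.run(main())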

Supported Models

Novita AI provides access to a comprehensive catalog of models from leading providers. Some of the popular models available include:

  • DeepSeek Models: deepseek-r1-turbo, deepseek-v3-turbo, deepseek-v3-0324
  • Qwen Models: qwen3-235b-a22b-fp8, qwen3-30b-a3b-fp8, qwen3-32b-fp8
  • Llama Models: llama-4-maverick-17b-128e-instruct-fp8, llama-3.3-70b-instruct, llama-3.1-70b-instruct
  • Mistral Models: mistral-nemo
  • Google Models: gemma-3-27b-it

For the complete list of available models, visit the Novita AI model catalog.
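When calling these models through LiteLLM directly, the model string follows the pattern novita/&lt;organization&gt;/&lt;model-id&gt;, matching the entries in the configuration file earlier. A short sketch (model IDs are taken from the list above; verify them against the catalog before use):

from litellm import completion

# The "novita/" prefix routes the request through LiteLLM's Novita AI provider;
# the remainder of the string is the model's catalog ID.
for model_id in [
    "novita/deepseek/deepseek-v3-turbo",
    "novita/qwen/qwen3-32b-fp8",
    "novita/meta-llama/llama-3.3-70b-instruct",
]:
    response = completion(
        model=model_id,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    )
    print(model_id, "->", response.choices[0].message.content)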

Advanced Features

Tool Calling

Novita AI supports function calling with compatible models:

from litellm import completion

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    }
]

response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[{"role": "user", "content": "What's the weather like in Boston today?"}],
    tools=tools,
)
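The model's tool invocation, if any, comes back on the message's tool_calls field in the OpenAI-compatible format that LiteLLM mirrors. A quick sketch of inspecting it:

import json

# Each tool call carries the function name and JSON-encoded arguments
tool_calls = response.choices[0].message.tool_calls or []
for call in tool_calls:
    args = json.loads(call.function.arguments)
    print(f"Model requested {call.function.name} with arguments {args}")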

JSON Mode

For structured outputs, you can enable JSON mode:

response = completion(
    model="novita/deepseek/deepseek-r1-turbo",
    messages=[
        {"role": "user", "content": "List 5 popular cookie recipes."}
    ],
    response_format={"type": "json_object"}
)
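With JSON mode enabled, the response content should be a JSON string you can parse directly. A minimal sketch:

import json

# The content is a JSON string when response_format={"type": "json_object"} is honored
recipes = json.loads(response.choices[0].message.content)
print(recipes)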

Feedback Scores and Evaluation

Once your Novita AI calls are logged with Opik, you can evaluate your LLM application using Opik’s evaluation framework:

from opik.evaluation import evaluate
from opik.evaluation.metrics import Hallucination

# Define your evaluation task: map each dataset item to the
# fields the metric expects
def evaluation_task(x):
    return {
        "input": x["input"],
        "output": x["output"],
        "context": x["context"],
    }

# Create the Hallucination metric
hallucination_metric = Hallucination()

# Run the evaluation
evaluation_results = evaluate(
    experiment_name="novita-ai-evaluation",
    dataset=your_dataset,
    task=evaluation_task,
    scoring_metrics=[hallucination_metric],
)
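The your_dataset placeholder above is an Opik dataset. A minimal sketch of creating one with the Opik Python client's get_or_create_dataset and insert methods (the dataset name and items are illustrative):

import opik

client = opik.Opik()

# get_or_create_dataset returns the existing dataset or creates a new one
your_dataset = client.get_or_create_dataset(name="novita-ai-eval-dataset")
your_dataset.insert([
    {
        "input": "What are the advantages of using cloud-based AI platforms?",
        "output": "Cloud AI platforms offer scalability and managed infrastructure.",
        "context": ["Documentation describing the platform's GPU infrastructure."],
    },
])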

Environment Variables

Make sure to set the following environment variables:

# Novita AI Configuration
export NOVITA_API_KEY="your-novita-api-key"

# Opik Configuration
export OPIK_PROJECT_NAME="your-project-name"
export OPIK_WORKSPACE="your-workspace-name"