Overview
The Comet platform offers extensive LLMOps functionality powered by a specialized SDK. This SDK is referred to as the LLM SDK and is open-sourced as comet-llm.
For more information on LLM support in Comet, see Track LLMs.
Note
The LLM SDK is under active development. If there are any features you would like to see implemented, reach out on GitHub.
Install LLM SDK
To use the LLM SDK, download and install it with:
pip install comet-llm
The full reference documentation is available at the LLM SDK reference.
Use the LLM SDK to log prompts and responses
The LLM SDK supports logging a prompt with its associated response, as well as any associated metadata such as token usage. This can be achieved with the log_prompt function:
import comet_llm

comet_llm.log_prompt(
    prompt="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: What is your name?\nAnswer:",
    prompt_template="Answer the question and if the question can't be answered, say \"I don't know\"\n\n---\n\nQuestion: {{question}}?\nAnswer:",
    prompt_template_variables={"question": "What is your name?"},
    metadata={
        "usage.prompt_tokens": 7,
        "usage.completion_tokens": 5,
        "usage.total_tokens": 12,
    },
    output=" My name is Alex.",
    duration=16.598,
)
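The prompt_template and prompt_template_variables arguments describe how the final prompt string was produced from a template. As a minimal sketch of that relationship (the render_template helper below is purely illustrative and is not part of the comet-llm SDK, which only receives the already-rendered strings):

```python
def render_template(template: str, variables: dict) -> str:
    """Replace each {{name}} placeholder in the template with its value.

    Illustrative helper only -- NOT a comet-llm API. It shows how a
    rendered prompt can line up with the template and variables you log.
    """
    rendered = template
    for name, value in variables.items():
        rendered = rendered.replace("{{" + name + "}}", value)
    return rendered


template = "Question: {{question}}\nAnswer:"
variables = {"question": "What is your name?"}

# The rendered prompt is what you would pass as `prompt`; the template
# and variables are what you would pass alongside it for traceability.
prompt = render_template(template, variables)
# prompt == "Question: What is your name?\nAnswer:"
```

Logging the template and its variables alongside the rendered prompt makes it easier to compare responses across different variable values for the same template.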
Upcoming features
The LLM SDK is under active development; we are currently planning to implement:
- Support for logging LLM chains
- Support for logging prompt and response embeddings
Feel free to reach out on GitHub with any and all feature requests!