
Overview

The Comet platform has extensive LLMOps functionality powered by a specialized SDK. This SDK is referred to as the LLM SDK and is open-sourced at comet-llm.

For more information on LLM support in Comet, see Track LLMs.

Note

The LLM SDK is under active development. If there are any features you would like to see implemented, reach out on GitHub.

Install LLM SDK

To use the LLM SDK, install it with:

pip install comet-llm

To configure the LLM SDK to start logging data, we recommend calling the comet_llm.init() method:

import comet_llm

comet_llm.init()

The full reference documentation is available in the LLM SDK reference; you can also check out the quickstart guide.
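
By default, comet_llm.init() will prompt you interactively for your Comet API key the first time it runs. If you prefer to configure everything in code, here is a minimal sketch, assuming init() accepts the api_key, workspace, and project parameters:

import comet_llm

comet_llm.init(
    api_key="YOUR_API_KEY",        # assumption: credentials can be passed explicitly
    workspace="your-workspace",    # assumption: target workspace name
    project="llm-experiments",     # assumption: target project name
)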

Log prompts and chains

The main methods for logging prompts and chains are:

  1. comet_llm.log_prompt (see the extended sketch after this list for optional arguments)

    import comet_llm
    
    comet_llm.init()
    
    comet_llm.log_prompt(
        prompt="Describe a clumsy robot",
        output="The robot's attempts at grace resembled a drunken giraffe on roller skates."
    )
    
    2. comet_llm.start_chain (see the span annotation sketch after this list)
    import comet_llm
    import time
    
    comet_llm.init()
    
    # Start a chain with the overall input
    comet_llm.start_chain({"user_question": "How many stars in our galaxy?"})
    
    # Each Span records one step of the chain
    with comet_llm.Span(
        inputs={"question": "What is our galaxy?"},
        category="LLM"
    ) as span:
        time.sleep(1.32)  # stand-in for an LLM call
        span.set_outputs(outputs={"response": "Milky Way"})
    
    with comet_llm.Span(
        inputs={"question": "How many stars in the Milky Way?"},
        category="LLM"
    ) as span:
        time.sleep(0.89)  # stand-in for an LLM call
        span.set_outputs(outputs={"response": "100 billion"})
    
    # Close the chain with the final output
    comet_llm.end_chain(outputs={"response": "There are 100 billion stars in our galaxy"})
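
Beyond the required prompt and output, log_prompt accepts a number of optional arguments for richer records. Here is a minimal sketch, assuming your release of the SDK supports the tags, metadata, and duration parameters:

import comet_llm

comet_llm.init()

comet_llm.log_prompt(
    prompt="Describe a clumsy robot",
    output="The robot's attempts at grace resembled a drunken giraffe on roller skates.",
    tags=["creative-writing"],            # assumption: list of string tags
    metadata={"model": "example-model"},  # assumption: free-form metadata dict
    duration=0.42,                        # assumption: elapsed call time in seconds
)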
    

You can learn more about logging prompts and chains in the dedicated guide: Log prompts and chains
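
Individual spans can also be named and annotated. Below is a minimal sketch, assuming comet_llm.Span accepts optional name and metadata parameters:

import comet_llm

comet_llm.init()

comet_llm.start_chain({"user_question": "How many stars in our galaxy?"})

with comet_llm.Span(
    inputs={"question": "How many stars in the Milky Way?"},
    category="LLM",
    name="star-count-lookup",             # assumption: optional display name for the span
    metadata={"model": "example-model"},  # assumption: free-form metadata dict
) as span:
    span.set_outputs(outputs={"response": "100 billion"})

comet_llm.end_chain(outputs={"response": "There are 100 billion stars in our galaxy"})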

Upcoming features

The LLM SDK is under active development; we are currently planning to implement:

  • Support for logging prompt and response embeddings

Feel free to reach out on GitHub with any and all feature requests!

Apr. 25, 2024