Comet LLM releases

The following is a history of released comet_llm versions. It does not list everything that was changed in a release, but does mention the highlights and all public-facing additions, changes, and deprecations.

You can install any of the following released versions. The full list of releases is on the Python Package Index.

Release 2.1.1

Release date: Jan 23, 2024

  • Introduced a new query variable in comet_llm.API for querying traces by user feedback
  • Introduced minimum version for the comet_ml package dependency

Release 2.1.0

Release date: Dec 15, 2023

  • Introduced the comet_llm.API class to query and edit LLM traces

Release 2.0.2

Release date: Dec 4, 2023

  • TLS verification can now be disabled using the COMET_INTERNAL_CHECK_TLS_CERTIFICATE environment variable
  • Fixed exception raised when patching the OpenAI library for some scenarios
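
A minimal sketch of using the new environment variable. The variable name comes from this release note; the accepted value ("false" here) and the requirement to set it before comet_llm initializes are assumptions:

```python
import os

# Assumption: setting the variable to "false" disables TLS certificate
# verification. Set it before importing or initializing comet_llm so the
# setting is in place when the first connection is made.
os.environ["COMET_INTERNAL_CHECK_TLS_CERTIFICATE"] = "false"

print(os.environ["COMET_INTERNAL_CHECK_TLS_CERTIFICATE"])
```

Only use this in environments (such as behind a TLS-intercepting proxy) where skipping certificate verification is an acceptable trade-off.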

Release 2.0.1

Release date: Nov 30, 2023

  • Fixed an issue that led to some SSL certificates being ignored

Release 1.6.0

Release date: Nov 9, 2023

  • Added support for the openai package v1

Release 1.5.0

Release date: Nov 6, 2023

  • Introduced new logic and a config variable (COMET_RAISE_EXCEPTIONS_OR_ERROR) for handling errors related to the API key

Release 1.4.1

Release date: Oct 6, 2023

  • Fixed a packaging issue that prevented version 1.4.0 from being imported

Release 1.4.0

Release date: Oct 6, 2023

Note: This version has a packaging issue and can't be imported; please use version 1.4.1 instead.

  • Added OpenAI auto-logger
  • Added debug logging mode
  • An error is now logged when attempting to log to a non-LLM project

Release 1.3.0

Release date: Sep 7, 2023

  • Introduced comet_llm.is_ready(), which reports whether everything is ready to start logging data
  • Introduced a disabled mode for the comet_llm API. In this mode, all calls to the prompt- and chain-logging API do nothing.

Release 1.2.0

Release date: Sep 1, 2023

  • Added multi-thread support for chains. Each thread now has its own "global" chain, so start_chain, end_chain, and Span calls refer to different chains when called from different threads. This makes it possible to execute multiple chains in parallel out of the box (one per thread) without chain-consistency problems.
  • Added an examples folder with Jupyter notebook and README files.