August 30, 2024
Today, we’re thrilled to introduce Opik – an open-source, end-to-end LLM development platform that provides the observability tools you need to confidently evaluate, test, and monitor your LLM applications across development and production.
Opik was created to address the unique challenges of LLM-based development.
We’ve always been closely connected to AI developers, working to streamline how teams build and productionize machine learning models. Our MLOps tools, like Experiment Management and Model Production Monitoring, help close the loop in the model development workflow. But our commitment goes beyond our products – we’ve also supported the open-source community by open-sourcing portions of our platform, such as Kangas, our tool for ML analysis and visualization.
Opik is a natural next step for us. We hope it enables Comet users and the AI community at large to get more out of LLMs. We named it after Ernst Öpik, an Estonian astronomer who was at the forefront of the study of comets and solar system dynamics. With so many of today’s great discoveries happening at the frontier of AI, we hope you’ll find inspiration in Opik’s story.
Install Opik with just a few lines of code – whether you’re self-hosting or using Comet’s cloud platform, Opik is built to fit seamlessly into your existing stack. Opik is compatible with any LLM you like, and supports direct integrations with OpenAI, LangChain, LlamaIndex, Predibase, Ragas, promptfoo, LiteLLM, and Pinecone out of the box.
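Here’s a minimal sketch of what getting started might look like, assuming the Python SDK is installed via pip and exposes the track decorator and configure helper described in the documentation; the OpenAI call and the model name are illustrative:

```python
# Sketch only: install the SDKs first, e.g. `pip install opik openai`
import opik
from opik import track
from openai import OpenAI

# Point the SDK at your Opik backend; use_local=True assumes a
# self-hosted instance is running locally (omit it to use Comet's cloud).
opik.configure(use_local=True)

client = OpenAI()

@track  # each call to this function is logged to Opik as a trace
def answer_question(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice of model
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content

print(answer_question("What did Ernst Öpik study?"))
```

From there, the traces show up in the Opik UI, where you can inspect inputs, outputs, and metadata for each call.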
Opik’s full LLM evaluation feature set is free to use, with a highly scalable and industry-compliant version available for enterprise teams. Sign up for free, check out the documentation, and log your first LLM trace today.
As an open-source project, we’re eager to see how the community shapes Opik’s future. We invite you to contribute bug reports, feature requests, or improvements to the documentation – or join our discussions on Slack and GitHub to leave feedback and help shape the roadmap.