Building a Low-Cost Local LLM Server to Run 70 Billion Parameter Models
A guest post from Fabrício Ceolin, DevOps Engineer at Comet. Inspired by the growing demand for large-scale language models, Fabrício…