
Less friction, more ML

Comet’s machine learning platform integrates with your existing infrastructure and tools so you can manage, visualize, and optimize models—from training runs to production monitoring.

Trusted by the most innovative ML teams

Affirm
Ancestry
AssemblyAI
CEPSA
Etsy
Shopify
Uber
Zappos


Monitor and manage models, from small teams to massive scale

Add two lines of code to your notebook or script and automatically start tracking code, hyperparameters, metrics, and more, so you can compare and reproduce training runs.

Experiment Management
Python | Java | R

from comet_ml import Experiment

# Initialize the Comet logger
experiment = Experiment()

Comet’s ML platform gives you visibility into training runs and models so you can iterate faster.

Experiment Management
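
As a minimal sketch (the project name, hyperparameter values, and training loop below are placeholders), hyperparameters and per-epoch metrics can be logged to an Experiment so runs are easy to compare and reproduce:

from comet_ml import Experiment

experiment = Experiment(project_name="YOUR PROJECT")

# Log the hyperparameters used for this run
experiment.log_parameters({"learning_rate": 0.001, "batch_size": 64})

# Log metrics as training progresses so they stream to the Comet UI
for epoch in range(10):
    # ...train for one epoch and compute train_loss
    experiment.log_metric("train_loss", train_loss, step=epoch)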

In addition to the 30+ built-in visualizations Comet provides, you can code your own visualizations using Plotly and Matplotlib.
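
For example, a custom Matplotlib figure can be logged to an experiment with log_figure; in this minimal sketch the project name and plotted values are placeholders:

from comet_ml import Experiment
import matplotlib.pyplot as plt

experiment = Experiment(project_name="YOUR PROJECT")

# Build any Matplotlib figure
fig, ax = plt.subplots()
ax.plot([1, 2, 3, 4], [0.61, 0.74, 0.82, 0.88])
ax.set_xlabel("epoch")
ax.set_ylabel("accuracy")

# Log the figure so it appears alongside the run in the Comet UI
experiment.log_figure(figure_name="accuracy_curve", figure=fig)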

Knowing what data was used to train a model is a key part of the MLOps lifecycle. Comet Artifacts allows you to track data by uploading directly to Comet’s machine learning platform or by storing a reference to it.

Comet Artifacts
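
A minimal sketch of both approaches, assuming a local CSV and a remote object-store URI (both paths are placeholders):

from comet_ml import Experiment, Artifact

experiment = Experiment(project_name="YOUR PROJECT")

# Create an Artifact describing the training data
artifact = Artifact(name="training-data", artifact_type="dataset")

# Upload a file directly to Comet...
artifact.add("./data/train.csv")

# ...or track remote data by reference only
artifact.add_remote("s3://your-bucket/train.parquet")

# Attach the Artifact to the experiment that consumes it
experiment.log_artifact(artifact)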

Comet Model Registry lets you keep track of the models you have ready for deployment. Thanks to its tight integration with Comet Experiment Management, you get full lineage from training to production.

Comet Model Registry
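
As a minimal sketch (the model name and file path are placeholders), a trained model file can be logged to its experiment and then registered from there, preserving the link back to the run that produced it:

from comet_ml import Experiment

experiment = Experiment(project_name="YOUR PROJECT")

# ...train your model and save it locally, e.g. to ./model.pkl

# Log the trained model file to this experiment; from the Comet UI (or API)
# it can then be registered, versioned, and promoted toward deployment
experiment.log_model("my-model", "./model.pkl")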

The performance of models deployed to production degrades over time, whether because of data drift or data-quality issues. Use Comet’s machine learning platform to identify drift and track accuracy metrics against baselines automatically pulled from training runs.

Comet Model Production Monitoring

Easy Integration

Add two lines of code to your notebook or script and automatically start tracking code, hyperparameters, metrics, and more.

Try a Live Notebook

Experiment Management

PyTorch

from comet_ml import Experiment
import torch.nn as nn

# 1. Define a new experiment 
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Create your model class 
class RNN(nn.Module):
    #... Define your Class 

# 3. Train and test your model while logging everything to Comet
with experiment.train():
    # ...Train your model and log metrics 
    experiment.log_metric("accuracy", correct / total, step=step)

# 4. View real-time metrics in Comet

PyTorch Lightning

import pytorch_lightning as pl
from pytorch_lightning.loggers import CometLogger

# 1. Create your Model

# 2. Initialize CometLogger
comet_logger = CometLogger()

# 3. Train your model 
trainer = pl.Trainer(
    logger=[comet_logger],
    # ...configs
)

trainer.fit(model)

# 4. View real-time metrics in Comet

Hugging Face

from comet_ml import Experiment
from transformers import Trainer

# 1. Define a new experiment 
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Train your model 
trainer = Trainer(
    model=model,
    # ...configs
)

trainer.train()

# 3. View real-time metrics in Comet

Keras

from comet_ml import Experiment
from tensorflow import keras

# 1. Define a new experiment 
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Define your model
model = keras.Model(
    # ...configs
)

# 3. Train your model
model.fit(
    x_train, y_train,
    validation_data=(x_test, y_test),
)

# 4. Track real-time metrics in Comet

TensorFlow

from comet_ml import Experiment
import tensorflow as tf

# 1. Define a new experiment 
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Define and train your model
model.fit(...)

# 3. Log additional model metrics and params
experiment.log_parameters({'custom_params': True})
experiment.log_metric('custom_metric', 0.95)

# 4. Track real-time metrics in Comet

Scikit-learn

from comet_ml import Experiment
from sklearn import tree

# 1. Define a new experiment 
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Build your model and fit
clf = tree.DecisionTreeClassifier(
    # ...configs
)

clf.fit(X_train_scaled, y_train)
params = {...}
metrics = {...}

# 3. Log additional metrics and params
experiment.log_parameters(params)
experiment.log_metrics(metrics)

# 4. Track model performance in Comet

XGBoost

from comet_ml import Experiment
import xgboost as xgb

# 1. Define a new experiment
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Define your model and fit
xg_reg = xgb.XGBRegressor(
    # ...configs
)
xg_reg.fit(
    X_train,
    y_train,
    eval_set=[(X_train, y_train), (X_test, y_test)],
    eval_metric="rmse",
)

# 3. Track model performance in Comet

Any Framework

# Utilize Comet in any environment
from comet_ml import Experiment

# 1. Define a new experiment
experiment = Experiment(project_name="YOUR PROJECT")

# 2. Model training here

# 3. Log metrics or params over time
experiment.log_metrics(metrics)

# 4. Track real-time metrics in Comet

Model Monitoring

Any Framework

# Utilize Comet in any environment
from comet_mpm import CometMPM

# 1. Create the MPM logger
MPM = CometMPM()

# 2. Add your inference logic here

# 3. Log prediction events over time
MPM.log_event(
    prediction_id="...",
    input_features=input_features,
    output_value=prediction,
    output_probability=probability,
)

An Extensible, Fully Customizable Machine Learning Platform

Comet’s ML platform supports productivity, reproducibility, and collaboration, no matter what tools you use to train and deploy models: managed, open source, or in-house. Use Comet’s platform on cloud, virtual private cloud (VPC), or on-premises.

Manage and version your training data, track and compare training runs, create a model registry, and monitor your models in production—all in one platform.


Move ML Forward—Your Way

Run Comet’s ML platform on any infrastructure. Bring your existing software and data stack. Use code panels to create visualizations in your preferred user interfaces.

Infrastructure

AWS
Google Cloud
IBM Cloud
Microsoft Azure
On-Premises

An ML Platform Built for Enterprise, Driven by Community

Comet’s ML platform is trusted by innovative data scientists, ML practitioners, and engineers in the most demanding enterprise environments.

Enterprise User

"Comet has aided our success with ML and serves to further ML development within Zappos.”
10%
reduction in order returns due to size
Kyle Anderson
Director of Software Engineering

Enterprise User

"Comet offers the most complete experiment tracking solution on the market. It’s brought significant value to our business."
Service for millions of customers
Olcay Cirit
Staff Research and Tech Lead

Community User

“Comet enables us to speed up research cycles and reliably reproduce and collaborate on our modeling projects. It has become an indispensable part of our ML workflow.”
Developing NLP tools for thousands of researchers
Victor Sanh
Machine Learning Scientist

Community User

"None of the other products have the simplicity, ease of use and feature set that Comet has."
Developing speech and language algorithms
Ronny Huang
Research Scientist

Enterprise User

"After discovering Comet, our deep learning team’s productivity went up. Comet is easy to set up and allows us to move research faster."
Building speech recognition with deep learning
Guru Rao
Head of AI

Enterprise User

"We can seamlessly compare and share experiments, debug and stop underperforming models. Comet has improved our efficiency."
Pioneering family history research
Carol Anderson
Staff Data Scientist