Command line reference documentation

The Comet command-line utilities are described below.

comet init

You can use comet init to:

  • Create a Comet configuration file with your API key; OR
  • Create a new project directory with sample code based on a template

You may wish to do both, in that order.

The first is run in the terminal like this:

$ comet init --api-key

You will be prompted for your Comet API key. You can also do this programmatically. For information on using comet_ml.init(), see Comet Quickstart.
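
For example, a minimal programmatic setup might look like the following sketch (the API key value is a placeholder):

# configure_comet.py
import comet_ml

# Prompts for the Comet API key if one is not already configured
comet_ml.init()

# Or pass the key explicitly (placeholder value shown)
# comet_ml.init(api_key="YOUR-API-KEY")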

The second creates a new project directory with a Python script and dependency file that show how to incorporate Comet with various machine learning libraries. It is invoked like this:

$ comet init

This form of the comet init command creates example scripts using the cookiecutter recipe system. It currently supports Python and R examples, selected with the --language flag (the default is python).
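
To generate an R example instead of the default Python one, run:

$ comet init --language r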

For example, here is a session creating a Keras example with a confusion matrix, embedding visualizations, histograms, and the Comet Optimizer:

% comet init

Building Comet example script from recipe...
==================================================
Please answer the following questions:
project_slug [my_project]: my_project
Select online_or_offline:
1 - Online
2 - Offline
Choose from 1, 2 [1]: 1
Select framework:
1 - keras
Choose from 1 [1]: 1
Select confusion_matrix:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select histogram:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select embedding:
1 - Yes
2 - No
Choose from 1, 2 [1]: 1
Select optimizer:
1 - No
2 - Yes
Choose from 1, 2 [1]: 2

At this point there should be an example script in my_project/comet-keras-example.py.

Comet continually adds example components to the recipes. Questions and pull requests are welcome at github.com/comet-ml/comet-recipes.

Optional arguments:

  -a, --api-key         Create a ~/.comet.config file with your Comet API key
  -l LANGUAGE, --language LANGUAGE
                        The language of example script to generate
  -r, --replay          Replay the last comet init
  -f, --force           Force overwrite output directory if it exists
  -o OUTPUT, --output OUTPUT
                        Output directory for scripts to go to
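
For example, to regenerate the default Python example into a specific directory, overwriting it if it already exists (my_examples is a hypothetical directory name):

$ comet init --language python --output my_examples --force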

comet check

The comet check command verifies that your environment is set up properly to use Comet.

Usage:

comet check [-h] [--debug]

The simplest use is:

$ comet check
COMET INFO: ================================================================================
COMET INFO: Checking connectivity to server...
COMET INFO: ================================================================================
COMET INFO: Configured server address 'https://www.comet.com/clientlib/'
COMET INFO: Server address was configured in INI file '/home/user/.comet.config'
COMET INFO: Server connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Rest API...
COMET INFO: ================================================================================
COMET INFO: Configured Rest API address 'https://www.comet.com/api/rest/v2/'
COMET INFO: Rest API address was configured in INI file '/home/user/.comet.config'
COMET INFO: REST API connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Websocket Server
COMET INFO: ================================================================================
COMET WARNING: No WS address configured on client side, fallbacking on default WS address
'wss://www.comet.com/ws/logger-ws'.
If that's incorrect set the WS url through the `comet.ws_url_override` config key.
COMET INFO: Configured WS address 'wss://www.comet.com/ws/logger-ws'
COMET INFO: Websocket connection is ok

COMET INFO: ================================================================================
COMET INFO: Checking connectivity to Optimizer Server
COMET INFO: ================================================================================
COMET INFO: Configured Optimizer address 'https://www.comet.com/optimizer/'
COMET INFO: Optimizer address was configured in INI file '/home/user/.comet.config'
COMET INFO: Optimizer connection is ok

COMET INFO: Summary
COMET INFO: --------------------------------------------------------------------------------
COMET INFO: Server connectivity         True
COMET INFO: Rest API connectivity       True
COMET INFO: WS server connectivity      True
COMET INFO: Optimizer server connectivity   True

Running with the --debug flag will provide additional details. This is quite handy for tracking down issues, especially with a new environment, or on an on-prem installation.
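
For example:

$ comet check --debug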

comet upload

The comet upload utility is used for uploading OfflineExperiments to Comet. Consider the following command line:

$ comet upload /tmp/comet/5da271fcb60b4652a51dfc0decbe7cd9.zip

comet upload is installed when you install comet_ml. If the comet command cannot be found, you can try this more direct invocation using the same Python environment:

$ python -m comet_ml.scripts.comet_upload /tmp/comet/5da271fcb60b4652a51dfc0decbe7cd9.zip

Don’t forget to include your API Key and update the experiment path to the one displayed at the end of your OfflineExperiment script run.

To upload an offline experiment, you need to have configured your Comet API key. The recommended approach is to use the comet init command as described in the Configure Comet guide. You can also configure the API key using either an environment variable, or the Comet config file.
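
For example, you can set the key for the current shell session with an environment variable, or store it in the Comet config file (the key value below is a placeholder; the file layout assumes the standard ~/.comet.config format):

$ export COMET_API_KEY="YOUR-API-KEY"

# ~/.comet.config
[comet]
api_key = YOUR-API-KEY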

To upload multiple offline experiments at once, execute the same comet upload command, but pass a path matching several archives:

$ comet upload /path/to/*.zip

or

$ python -m comet_ml.scripts.comet_upload /path/to/*.zip

Debugging

If you encounter any bugs with either the OfflineExperiment class or uploading, run the uploader with the following:

$ COMET_LOGGING_FILE_LEVEL=debug \
    COMET_LOGGING_FILE=/tmp/comet.debug.log \
    COMET_API_KEY=MY_API_KEY \
    comet upload /path/to/experiments/*.zip

or

$ COMET_LOGGING_FILE_LEVEL=debug \
    COMET_LOGGING_FILE=/tmp/comet.debug.log \
    COMET_API_KEY=MY_API_KEY \
    python -m comet_ml.scripts.comet_upload /path/to/experiments/*.zip

The debug logs are sent to /tmp/comet.debug.log. This log will show details of all the steps in the process. If you still have problems, share this file with us using the Comet Slack channel.

comet optimize

The comet optimize command is a utility for running the Comet Optimizer in parallel or serially. The format of the command line is:

$ comet optimize [options] [PYTHON_SCRIPT] OPTIMIZER

where OPTIMIZER is a JSON file, or an optimizer ID.

PYTHON_SCRIPT is a regular Python file that takes an optimizer config file, or optimizer ID. If PYTHON_SCRIPT is not included, then an optimizer is created and the optimizer ID is displayed.
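
For reference, here is a minimal sketch of what such a PYTHON_SCRIPT might contain; the parameter name x and the objective are placeholders, and the script reads the optimizer config file or ID from its first argument:

# script.py
import sys

from comet_ml import Optimizer

# The optimizer config file or optimizer ID passed by comet optimize
opt = Optimizer(sys.argv[1])

# Loop over the parameter combinations suggested by the optimizer
for experiment in opt.get_experiments():
    x = experiment.get_parameter("x")
    # Train or evaluate with this parameter value; a placeholder objective here
    loss = (x - 3) ** 2
    experiment.log_metric("loss", loss)
    experiment.end()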

Positional arguments:

  • PYTHON_SCRIPT - the name of the script to run.
  • OPTIMIZER - optimizer JSON file or optimizer ID.

Optional arguments:

  -j PARALLEL, --parallel PARALLEL
                        Number of parallel runs
  -t TRIALS, --trials TRIALS
                        Number of trials per parameter configuration
  -e EXECUTABLE, --executable EXECUTABLE
                        Run using an executable other than Python
  -d DUMP, --dump DUMP  Dump the parameters to given file name

Note that comet optimize requires your COMET_API_KEY to be pre-configured in one of the many ways possible, for example as an environment variable or in your .comet.config file.

Examples of calling comet optimize:

$ export COMET_API_KEY=<Your API Key>
$ export COMET_OPTIMIZER_ID=$(comet optimize opt.json)
$ comet optimize script.py opt.json
$ comet optimize -j 4 script.py opt.json

To use an executable other than Python, use -e, as follows:

$ comet optimize -e "run-on-cluster.sh" script.py opt.json

There are scenarios where you dedicate particular GPUs to particular processes (or similar logic). To that end, use the following environment variables:

  • COMET_OPTIMIZER_PROCESS_JOBS: Total number of parallel jobs (referred to as j)
  • COMET_OPTIMIZER_PROCESS_ID: Current job number (starting with 0 and up to, but not including, j)

For example, you could call your script as defined above:

$ comet optimize -j 4 script.py optimize.json

In the script, you can access COMET_OPTIMIZER_PROCESS_ID and COMET_OPTIMIZER_PROCESS_JOBS and use particular GPU configurations:

# script.py
import os

# setup as per above

# The environment variables are strings, so convert them to integers
process_id = int(os.environ["COMET_OPTIMIZER_PROCESS_ID"])
process_jobs = int(os.environ["COMET_OPTIMIZER_PROCESS_JOBS"])

# Handle process_id's 0 through process_jobs - 1
if process_id == 0:
    pass  # handle job 0, for example by selecting a dedicated GPU
elif process_id == 1:
    pass  # handle job 1
elif process_id == 2:
    pass  # handle job 2
elif process_id == 3:
    pass  # handle job 3

For more details, see Comet environment variables.

comet python

The comet python utility is used to execute a Python script and import comet_ml automatically.

Although you still need to include import comet_ml in your script, you no longer need to import comet_ml before your machine learning libraries.

Usage:

comet python [-h] [-p PYTHON] [-m MODULE] python_script

Positional arguments:

  • python_script: the python script to launch

Optional arguments:

  -p PYTHON, --python PYTHON
                        Which Python interpreter to use
  -m MODULE, --module MODULE
                        Run library module as a script
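
For example, to run a training script under a specific interpreter (train.py and python3.9 are placeholder names):

$ comet python --python python3.9 train.py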

comet offline

The comet offline utility is used to explore offline experiment archives.

Usage:

comet offline [-h] [--csv] [--section SECTION] [--level LEVEL]
                   [--name NAME] [--output OUTPUT] [--raw-size]
                   [--no-header]
                   archives [archives ...]

This command line displays summaries of offline experiments:

$ comet offline *.zip

You may also display the archive details in CSV (comma-separated values) format. This format shows an experiment's data as rows with columns in the following order:

  • Workspace
  • Project
  • Experiment
  • Level
  • Section
  • Name
  • Value

where:

  • Workspace: the name of a specific workspace, or DEFAULT.
  • Project: the name of a specific project, or "general".
  • Experiment: the experiment key for this experiment.
  • Level: detail, maximum, or minimum.
  • Section: metric, param, log_other, etc.
  • Name: the name of the metric, param, etc.
  • Value: the value of the metric, param, etc.

For example:

$ comet offline --csv *.zip

You may use the optional flags --level, --section, or --name to filter the rows. For example, to show only the rows at the detail level:

$ comet offline --level detail *.zip

Note that using --level, --section, or --name implies --csv.
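
For example, to write only the metric rows to a file (metrics.csv is a hypothetical filename):

$ comet offline --section metric --output metrics.csv *.zip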

Positional arguments:

  • archives: the offline experiment archives to display

Optional arguments:

  --csv              Output details in csv format
  --section SECTION  Output specific section in csv format, including param,
                     metric, log_other, data, etc.
  --level LEVEL      Output specific summary level in csv format, including
                     minimum, maximum, detail
  --name NAME        Output specific name in csv format, including items like
                     loss, acc, etc.
  --output OUTPUT    Output filename for csv format
  --raw-size         Use bytes for file sizes
  --no-header        Use this flag to suppress CSV header

comet models

The comet models command is used to list registered models and download them to your local file system.

Usage:

comet models download [-h]
   --workspace WORKSPACE
   --model-name MODEL_NAME
   (--model-version MODEL_VERSION | --model-stage MODEL_STAGE)
   [--output OUTPUT]

or

comet models list [-h] --workspace WORKSPACE

To download a model, you must provide the workspace name and the registered model name, as well as either a specific version or a stage.

For example, to download a registry model named "My Model" from the workspace "My Workspace" at version 1.0.0, you can run:

$ comet models download \
    --workspace "My Workspace" \
    --model-name "My Model" \
    --model-version "1.0.0"

The registry model files will be downloaded to a directory named "model". You can choose a different output directory by using the "--output" flag.
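
To list the registered models in a workspace, use comet models list:

$ comet models list --workspace "My Workspace"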

Optional arguments:

  -w WORKSPACE, --workspace WORKSPACE
                        The workspace name of the registry model to download
  --model-name MODEL_NAME
                        The name of the registry model to download
  --model-version MODEL_VERSION
                        The semantic version of the registry model to download
                        (for example: 1.0.0)
  --model-stage MODEL_STAGE
                        The stage of the registry model to download (for
                        example: production)
  --output OUTPUT       The output directory to download the model to
                        (defaults to `model`)
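
For example, to download the production-stage version of the same registry model into a custom directory (my_models is a hypothetical directory name):

$ comet models download \
    --workspace "My Workspace" \
    --model-name "My Model" \
    --model-stage "production" \
    --output my_models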