Quickstart

The Comet Optimizer is a powerful, intuitive tool in your automated hyperparameter tuning toolbox.

Use Optimizer to dynamically find the best set of hyperparameter values that will minimize or maximize a particular metric. It can make suggestions for what hyperparameter values to try next, either in serial or in parallel (or a combination).

In its simplest form, you can use the hyperparameter search this way:

# file: example-1.py

from comet_ml import Optimizer

# You need to specify the algorithm and hyperparameters to use:
config = {
    # Pick the Bayes algorithm:
    "algorithm": "bayes",

    # Declare your hyperparameters:
    "parameters": {
        "x": {"type": "integer", "min": 1, "max": 5},
    },

    # Declare what to optimize, and how:
    "spec": {
        "metric": "loss",
        "objective": "minimize",
    },
}

# Next, create an optimizer, passing in the configuration:
opt = Optimizer(config)

# Define the function to optimize. Here, a toy stand-in for real
# model training, minimized at x == 3:
def fit(x):
    return (x - 3) ** 2

# Finally, get experiments, and train your models:
for experiment in opt.get_experiments(
        project_name="optimizer-search-01"):
    # Test the model
    loss = fit(experiment.get_parameter("x"))
    experiment.log_metric("loss", loss)

That's it! Comet will provide you with an Experiment object already set up with the suggested parameters to try. You merely need to train the model and log the metric to optimize ("loss" in this case).

See the Optimizer class for more details on creating an optimizer.
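The Optimizer constructor also accepts the id of an existing optimizer in place of a dictionary, which is useful for resuming an earlier search. A minimal sketch, where the id string is a made-up placeholder:

from comet_ml import Optimizer

# Resume an existing search by passing its optimizer id
# (replace the placeholder with the id of your own search):
opt = Optimizer("<your-optimizer-id>")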

Optimizer configuration

Optimizer configuration is performed through a dictionary, either specified in code or in a config file. The dictionary is a JSON structure similar to the following:

{
    "algorithm": "bayes",
    "spec": {
        "maxCombo": 0,
        "objective": "minimize",
        "metric": "loss",
        "minSampleSize": 100,
        "retryLimit": 20,
        "retryAssignLimit": 0,
    },
    "parameters": {
        "hidden-layer-size": {"type": "integer", "min": 5, "max": 100},
        "hidden2-layer-size": {"type": "discrete", "values": [16, 32, 64]},
    },
    "name": "My Bayesian Search",
    "trials": 1,
}

As shown, the Optimizer configuration dictionary has five sections:

- algorithm: String, indicating the search algorithm to use
- spec: Dictionary, defining the algorithm-specific specifications
- parameters: Dictionary, defining the parameter distribution space
- name: (Optional) String, specifying a personalizable name to associate with this search instance
- trials: (Optional) Integer, specifying the number of trials to run per experiment. Defaults to 1.

You can find a full description of each mandatory section in the Optimizer Configuration page.
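Because the configuration is a plain JSON structure, you can also keep it in a standalone file and pass the file's path to the Optimizer. A minimal sketch, assuming the dictionary above has been saved as optimizer.config:

from comet_ml import Optimizer

# Load the search configuration from a JSON file instead of an
# inline dictionary (the filename is just a convention):
opt = Optimizer("optimizer.config")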

Running the Comet Optimizer in parallel

When tuning hyperparameters, it is common to speed up the search by running it in parallel. This is easily achieved with the Comet Optimizer by launching the search script with comet optimize.

The hyperparameter search can be run in parallel by passing the -j option to the comet optimize command-line tool. For example, to train two models in parallel:

comet optimize -j 2 training_script.py optimizer.config
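When launched this way, comet optimize creates the optimizer from optimizer.config and makes its id available to each copy of the script through the environment, so the script itself can construct the Optimizer without arguments. A minimal training_script.py sketch under that assumption:

# file: training_script.py

from comet_ml import Optimizer

# Under `comet optimize`, the optimizer id is supplied through the
# environment, so no configuration is passed here:
opt = Optimizer()

for experiment in opt.get_experiments(
        project_name="optimizer-search-01"):
    x = experiment.get_parameter("x")
    # Placeholder training step; replace with real model fitting:
    loss = (x - 3) ** 2
    experiment.log_metric("loss", loss)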

You can learn more about how to run the Optimizer in parallel in the Run in parallel page.

End-to-end example

This Colab Notebook walks through an end-to-end example that uses Keras with the Comet Optimizer.

Comet optimize

comet is a command-line utility installed with comet_ml, and optimize is one of its subcommands. The format is:

$ comet optimize [options] [PYTHON_SCRIPT] OPTIMIZER

For more information on comet optimize, see Comet Command-Line Utilities.
