
Troubleshooting and FAQ

This page presents the info, warning, and error messages that you might encounter when working with Comet Optimizer, and the steps to take to address them. It also includes some tips on debugging.

For further assistance on any of these Python warnings or errors, or if you see an error message that is not noted here, ping us on our Slack channel.

Common infos, warnings, and errors

INFO: Optimizer metric is not logged

If Comet Optimizer cannot access the optimizer metric set in the spec, Comet displays the following message:

COMET INFO: Optimizer metrics is '<metric_name>' but no logged values found.
Experiment ignored in sweep.

However, Comet still continues the optimization process.

This behavior is acceptable for the grid and random search algorithms. For the bayes algorithm, however, make sure to correctly log the optimization metric: Bayesian search relies on the performance of previous hyperparameter selections to intelligently choose the next hyperparameter values.
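As a minimal sketch of what correct logging looks like with the bayes algorithm: the metric name "loss", the train() callable, the parameter ranges, and the project name below are all placeholders, not values from any real project.

```python
# Hypothetical sweep config; the "metric" name in the spec must match
# the metric name each experiment logs below.
config = {
    "algorithm": "bayes",
    "spec": {"metric": "loss", "objective": "minimize", "maxCombo": 10},
    "parameters": {"lr": {"type": "float", "min": 0.001, "max": 0.1}},
    "name": "Example Bayes Search",
}

def run_search(config, train):
    """Run the sweep; assumes comet_ml is installed and an API key is set."""
    import comet_ml

    opt = comet_ml.Optimizer(config)
    for experiment in opt.get_experiments(project_name="example-optimizer"):
        loss = train(experiment.get_parameter("lr"))
        # Without this call, bayes cannot score the parameter set and the
        # experiment is ignored in the sweep.
        experiment.log_metric("loss", loss)
        experiment.end()
```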

WARNING: Passing Experiment through Optimizer constructor is deprecated

If you manually stop an optimization job before completion and start it again, the previously running Experiment may not have been correctly ended and you may see the message:

COMET WARNING: Passing Experiment through Optimizer constructor is deprecated;
pass them to Optimizer.get_experiments or Optimizer.next

You can safely ignore this warning.

Also, make sure to refer to the Optimizer reference and the Optimizer quickstart for the latest information on how to initialize and use the Optimizer class.
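As a sketch of the current pattern the warning points to: experiments are created by the Optimizer itself rather than passed to its constructor. The config contents and project name below are placeholders.

```python
def make_experiments(config):
    """Current usage: let the Optimizer create each Experiment.
    Assumes comet_ml is installed and an API key is configured."""
    import comet_ml

    # Do not pass an Experiment to the constructor; keyword arguments
    # given to get_experiments are applied to each created Experiment.
    opt = comet_ml.Optimizer(config)
    return opt.get_experiments(project_name="example-optimizer")
```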

ERROR: Crashed execution

Tip

These instructions also apply in case of intentional stop and resume.

By default, none of the algorithms assigns duplicate sets of parameters (unless the value of trials is greater than 1). But what should you do if an experiment crashes and never notifies the Optimizer?

You have two options:

  1. Consider setting up graceful failover against crashes (e.g., if you are using preemptible machines or have an unstable network connection) by setting the retryAssignLimit spec attribute to a value greater than zero. For example:

    {
        "algorithm": "bayes",
        "spec": {
            "retryAssignLimit": 5,
            ...
        },
        "parameters": {...},
        "name": "My Bayesian Search",
        "trials": 1
    }
    

    With this configuration, Comet Optimizer assigns the parameter set until an experiment marks it as "completed" or the number of retries is equal to retryAssignLimit (=5 in this example).

  2. If your optimization job crashes/terminates before completion, you can recover your Optimizer search and pick up immediately from where you left off.

    You just need to define COMET_OPTIMIZER_ID in your environment, or pass the Optimizer ID as the config argument when creating the Optimizer:

    export COMET_OPTIMIZER_ID=<your-tuning-run-id>
    
    opt = comet_ml.Optimizer(
        api_key=<your-api-key>,
        config=<your-tuning-run-id>,
        project_name="example-optimizer",
    )
    

    and run your code again.

    This will resume the hyperparameter tuning with the existing Optimizer, rather than creating a new one.

    Note

    The COMET_OPTIMIZER_ID is printed by Comet as information at the start of each tuning run. For example:

    COMET INFO: COMET_OPTIMIZER_ID=366dcb4f38bf42aea6d2d87cd9601a60
    

ERROR: Out of memory

You may run out of memory if the Optimizer search space is too big.

In this case, try to lower the complexity of the parameter search space by:

  • Removing some of the parameters.
  • If the random or grid search algorithm is used, lowering the values for the gridSize and/or minSampleSize spec attributes.

Otherwise, you will need to increase the memory allocated to the optimization job.
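For example, a random search spec could be shrunk along these lines. The specific values are illustrative only; pick numbers that fit your memory budget and search needs.

```python
# Illustrative only: smaller gridSize / minSampleSize values reduce
# the number of candidate points the Optimizer has to hold in memory.
reduced_config = {
    "algorithm": "random",
    "spec": {
        "gridSize": 5,         # fewer discretization bins per parameter
        "minSampleSize": 100,  # fewer samples drawn per continuous range
        "maxCombo": 20,
    },
    "parameters": {
        "lr": {"type": "float", "min": 0.0001, "max": 0.1},
    },
    "name": "Reduced Random Search",
}
```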

Debugging

Please refer to the debugging information in the Experiment Management section.

May. 17, 2024