Optimizer

The Optimizer class is used to perform a search for the minimum or maximum of a loss function over a given set of parameters. It can also be used for a grid or random sweep of parameter space.

Note that any keyword argument not listed below will be passed on to the Experiment constructor. For example, you can pass project_name and logging arguments by listing them here.

Args:

  • config: (optional if COMET_OPTIMIZER_ID is configured) either a config dictionary, an optimizer id, or a config filename.
  • trials: int (optional, default 1) number of trials per parameter set to test.
  • verbose: int (optional, default 1) verbosity level, where 0 means no output and 1 (or greater) shows more detail.
  • experiment_class: string or callable (optional, default "Experiment"), class to use (for example, OfflineExperiment).

Examples:

```python
# Assume COMET_OPTIMIZER_ID is configured:
opt = Optimizer()

# An optimizer config dictionary (a fuller sketch follows below):
opt = Optimizer({"algorithm": "bayes", ...})

# An optimizer id:
opt = Optimizer("73745463463")

# A filename of an optimizer config file:
opt = Optimizer("/tmp/mlhacker/optimizer.config")
```
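For reference, a config dictionary typically names the search algorithm, a spec section, and the parameter space. The sketch below is an assumption reconstructed from the fields visible in the status() output at the end of this page; the metric name and parameter bounds are illustrative:

```python
# Hypothetical config dictionary (field values mirror the status() example below):
config = {
    "algorithm": "grid",   # search strategy
    "spec": {
        "maxCombo": 0,     # limit on parameter combinations (0 = no limit)
        "metric": "loss",  # the metric the search optimizes
    },
    "parameters": {
        # one entry per searched parameter
        "x": {"type": "integer", "min": 0, "max": 20, "scalingType": "uniform"},
    },
    "trials": 1,           # trials per parameter set
}
opt = Optimizer(config)
```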

To pass arguments to the Experiment constructor, pass them into the opt.get_experiments() call, like so:

```python
opt = Optimizer("/tmp/mlhacker/optimizer.config")
for experiment in opt.get_experiments(
    project_name="my-project",
    auto_metric_logging=False,
):
    loss = fit(model)
    experiment.log_metric("loss", loss)
    experiment.end()
```


Optimizer.__init__

```python
__init__(self, config=None, trials=None, verbose=1, experiment_class="Experiment", api_key=None, **kwargs)
```

The Optimizer constructor.

Args:

  • config: (optional if COMET_OPTIMIZER_ID is configured). Can be an optimizer config id, an optimizer config dictionary, or an optimizer config filename.
  • trials: int (optional, default 1) number of trials per parameter value set
  • verbose: int (optional, default 1) level of detail to show; 0 means quiet
  • experiment_class: string (optional). Supported values are "Experiment" (the default) to use online Experiments or "OfflineExperiment" to use offline Experiments. It can also be a callable (a function or a method) that returns an instance of Experiment, OfflineExperiment, ExistingExperiment or ExistingOfflineExperiment (see the sketch below).

See above for examples.
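For the callable form of experiment_class, any function that returns one of those experiment instances will do. A minimal sketch, assuming offline experiments written to a local directory (the factory name and directory are illustrative):

```python
import comet_ml
from comet_ml import Optimizer

# Hypothetical factory: receives the keyword arguments the Optimizer would
# have passed to the experiment class, and returns an experiment instance.
def make_experiment(**kwargs):
    return comet_ml.OfflineExperiment(offline_directory="/tmp/comet", **kwargs)

opt = Optimizer("/tmp/mlhacker/optimizer.config", experiment_class=make_experiment)
```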


Optimizer.end

```python
end(self, experiment)
```

Optimizer.end() is called at the end of an experiment. You would not normally call this manually; it is called automatically when the experiment ends.


Optimizer.get_experiments

```python
get_experiments(self, **kwargs)
```

Optimizer.get_experiments() will iterate over all possible experiments for this sweep or search, one at a time. All experiments will have a unique set of parameter values (unless performing multiple trials per parameter set).

Example:

```python
for experiment in optimizer.get_experiments():
    loss = fit(x, y)
    experiment.log_metric("loss", loss)
```


Optimizer.get_id

```python
get_id(self)
```

Get the id of this optimizer, for use with the COMET_OPTIMIZER_ID config variable.

Example:

```
>>> opt.get_id()
COMET_OPTIMIZER_ID=87463746374647364
```


Optimizer.get_parameters

```python
get_parameters(self)
```

Optimizer.get_parameters() will iterate over all possible parameters for this sweep or search. All parameter combinations will be emitted once (unless performing multiple trials per parameter set).

Example:

```python
for parameters in optimizer.get_parameters():
    experiment = comet_ml.Experiment()
    loss = fit(x, y)
    experiment.log_metric("loss", loss)
```
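Note that with get_parameters() the loop body is responsible for applying and logging the sampled values itself. A minimal sketch, assuming each emitted item holds its values under a "parameters" key (that key name, and the fit signature, are assumptions):

```python
import comet_ml

for data in optimizer.get_parameters():
    experiment = comet_ml.Experiment()
    params = data["parameters"]        # assumed structure of each item
    experiment.log_parameters(params)  # record the sampled values
    loss = fit(x, y, **params)         # hypothetical training call
    experiment.log_metric("loss", loss)
```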


Optimizer.next

```python
next(self, **kwargs)
```

Optimizer.next() will return the next experiment for this sweep or search. All experiments will have a unique set of parameter values (unless performing multiple trials per parameter set).

Normally, you would not call this directly, but use the generator Optimizer.get_experiments()

Args:

  • kwargs: (optional). Any keyword argument will be passed to the Experiment class for creation. The API key is passed directly (a pass-through sketch follows the example below).

Example:

```python
experiment = optimizer.next()
if experiment is not None:
    loss = fit(x, y)
    experiment.log_metric("loss", loss)
```
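As noted above, keyword arguments are forwarded to the Experiment constructor; a minimal sketch (the project name is a placeholder):

```python
experiment = optimizer.next(project_name="my-project")
```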


Optimizer.next_data

```python
next_data(self)
```

Optimizer.next_data() will return the next parameters in the Optimizer sweep.

Normally, you would not call this directly, but use the generator Optimizer.get_parameters()

Example:

```python
data = optimizer.next_data()
```


Optimizer.status

```python
status(self)
```

Get the status from the optimizer server for this optimizer.

Example:

```
>>> opt.status()
{'algorithm': 'grid',
 'comboCount': 32,
 'configSpaceSize': 10,
 'endTime': None,
 'id': 'c20b90ecad694e71bdb5702778eb8ac7',
 'lastUpdateTime': None,
 'maxCombo': 0,
 'name': 'c20b90ecad694e71bdb5702778eb8ac7',
 'parameters': {'x': {'max': 20,
                      'min': 0,
                      'scalingType': 'uniform',
                      'type': 'integer'}},
 'retryCount': 0,
 'spec': {'gridSize': 10,
          'maxCombo': 0,
          'metric': 'loss',
          'minSampleSize': 100,
          'randomize': False,
          'retryLimit': 20,
          'seed': 2997936454},
 'startTime': 1558793292216,
 'state': {'sequence_i': 0,
           'sequence_retry': 11,
           'sequence_trial': 0,
           'total': 32},
 'status': 'running',
 'trials': 1,
 'version': '1.0.0'}
```
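Because the returned dictionary includes a 'status' field ('running' in the example above), one way to use it is to poll until the search finishes. A minimal sketch; the assumption is that 'status' leaves the 'running' state once the sweep is done:

```python
import time

# Poll the optimizer server until the search is no longer running.
while opt.status().get("status") == "running":
    time.sleep(5)
```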