
integration.vertex

initialize_comet_logger

comet_ml.integration.vertex.initialize_comet_logger(experiment,
    pipeline_job_name, pipeline_task_name, pipeline_task_uuid)

Logs the Vertex task identifiers needed to track your pipeline status in Comet.ml. You need to call this function from every component you want to track with Comet.ml.

Args:

  • experiment: An already created Experiment object.
  • pipeline_job_name: string. The Vertex pipeline job name; see the example below for how to get it automatically from Vertex.
  • pipeline_task_name: string. The Vertex task name; see the example below for how to get it automatically from Vertex.
  • pipeline_task_uuid: string. The Vertex task unique id; see the example below for how to get it automatically from Vertex.

For example:

import kfp.v2.dsl


@kfp.v2.dsl.component
def my_component() -> None:
    import comet_ml.integration.vertex

    experiment = comet_ml.Experiment()
    # Vertex placeholders, resolved to the real identifiers when the task runs
    pipeline_run_name = "{{$.pipeline_job_name}}"
    pipeline_task_name = "{{$.pipeline_task_name}}"
    pipeline_task_id = "{{$.pipeline_task_uuid}}"

    comet_ml.integration.vertex.initialize_comet_logger(
        experiment, pipeline_run_name, pipeline_task_name, pipeline_task_id
    )
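
Once the logger is initialized, the experiment can be used as usual, and everything logged afterwards stays associated with that component's run. Below is a minimal sketch, assuming the kfp 1.8 SDK's v2 namespace; the component name and the logged parameter and metric values are illustrative only:

import kfp.v2.dsl


@kfp.v2.dsl.component
def train_model() -> None:  # hypothetical component, for illustration
    import comet_ml.integration.vertex

    experiment = comet_ml.Experiment()
    comet_ml.integration.vertex.initialize_comet_logger(
        experiment,
        "{{$.pipeline_job_name}}",
        "{{$.pipeline_task_name}}",
        "{{$.pipeline_task_uuid}}",
    )

    # ... training code ...

    # Standard Experiment logging works as usual after initialization
    experiment.log_parameter("learning_rate", 0.001)  # illustrative value
    experiment.log_metric("accuracy", 0.93)  # illustrative value
    experiment.end()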

comet_logger_component

comet_ml.integration.vertex.comet_logger_component(api_key=None,
    project_name=None, workspace=None, packages_to_install=None,
    base_image=None, custom_experiment=None)

Injects the Comet Logger component, which continuously tracks and reports the current pipeline status to Comet.ml.

Args:

  • api_key: string, optional. Your Comet API Key. If not provided, the value set in the configuration system will be used.

  • project_name: string, optional. The project name where all pipeline tasks are logged. If not provided, the value set in the configuration system will be used.

  • workspace: string, optional. The workspace name where all pipeline tasks are logged. If not provided, the value set in the configuration system will be used.

  • packages_to_install: list of strings, optional. Which packages to install, passed directly to kfp.components.create_component_from_func. Defaults to ["google-cloud-aiplatform", "comet_ml"].

  • base_image: string, optional. Which docker image to use. If not provided, the default Kubeflow base image will be used.

  • custom_experiment: Experiment, optional. A Comet Experiment with custom configuration to use instead of the Experiment that would otherwise be created implicitly with default options.

Example:

from kfp.v2 import dsl


@dsl.pipeline(name='ML training pipeline')
def ml_training_pipeline():
    import comet_ml.integration.vertex

    # Inject the Comet logger component so the pipeline status is reported to Comet.ml
    comet_ml.integration.vertex.comet_logger_component()
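
Putting the two functions together, here is a minimal sketch of a complete pipeline definition compiled for Vertex AI, assuming the kfp 1.8 SDK's v2 namespace. The pipeline and component names, the output path, the placeholder API key, and the use of the kfp.v2 compiler are illustrative assumptions, not part of the Comet API:

from kfp.v2 import compiler, dsl

import comet_ml.integration.vertex


@dsl.component(packages_to_install=["comet_ml"])
def my_component() -> None:
    # Each component you want to track creates its own Experiment and
    # registers the Vertex identifiers with initialize_comet_logger.
    import comet_ml.integration.vertex

    experiment = comet_ml.Experiment()
    comet_ml.integration.vertex.initialize_comet_logger(
        experiment,
        "{{$.pipeline_job_name}}",
        "{{$.pipeline_task_name}}",
        "{{$.pipeline_task_uuid}}",
    )


@dsl.pipeline(name="ml-training-pipeline")  # illustrative name
def ml_training_pipeline():
    # Inject the pipeline-level Comet logger with explicit configuration
    # instead of relying on the configuration system defaults.
    comet_ml.integration.vertex.comet_logger_component(
        api_key="<YOUR_COMET_API_KEY>",  # placeholder
        project_name="vertex-demo",  # illustrative
        workspace="my-workspace",  # illustrative
        packages_to_install=["google-cloud-aiplatform", "comet_ml"],
    )
    my_component()


# Compile the pipeline into a JSON spec that Vertex AI Pipelines can run.
compiler.Compiler().compile(
    pipeline_func=ml_training_pipeline,
    package_path="ml_training_pipeline.json",
)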