Main Concepts

Experiment

An Experiment consists of the set of parameters you are searching over and the metric or metrics you are optimizing. Defining the Experiment well is crucial to the success of your optimization problem. The following subsections cover the key components to consider when creating an Experiment: the parameters, the metrics, and the observation budget.

Parameters

Parameters are a crucial part of every SigOpt experiment, defining the domain to be searched. SigOpt supports double, integer, and categorical parameter types.
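As a minimal sketch of the three parameter types in an experiment creation call (the metrics and observation budget fields are covered in the following subsections; a Connection object conn is assumed, as in the later snippets, and all names and bounds here are hypothetical):

experiment = conn.experiments().create(
  name="Example experiment",
  parameters=[
    # Double: a continuous parameter searched within bounds
    dict(name="learning_rate", type="double", bounds=dict(min=0.0001, max=0.1)),
    # Integer: a whole-number parameter searched within bounds
    dict(name="num_trees", type="int", bounds=dict(min=10, max=500)),
    # Categorical: a parameter searched over a fixed set of values
    dict(
      name="kernel",
      type="categorical",
      categorical_values=["rbf", "poly", "linear"],
    ),
  ],
  metrics=[dict(name="accuracy", objective="maximize")],
  observation_budget=30,
)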

Defining a meaningful domain is an important step toward a successful application. Users often have knowledge about how parameters relate to each other and to the optimization problem, and leveraging that knowledge can improve performance and yield better results.

Therefore, in addition to parameter types, SigOpt provides several features and recommendations that help construct a domain suited to a specific problem; a short sketch follows the list below.

  • Grid Values - optimize over a specific set of numerical values.

  • Logarithmic Parameter Transformation - search a parameter on a logarithmic scale.

  • Prior Beliefs - define the distribution of a parameter.

  • Parameter Constraints - restrict the parameter space with linear inequality constraints.
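A short sketch of the first three features (exact field names are documented on the Define and Set Up Parameter Space page; parameter names and values here are hypothetical). Parameter Constraints are defined at the experiment level rather than on an individual parameter, so they are omitted here:

parameters = [
  # Grid Values: only these batch sizes are considered
  dict(name="batch_size", type="int", grid=[16, 32, 64, 128]),
  # Logarithmic transformation: search the learning rate on a log scale
  dict(
    name="learning_rate",
    type="double",
    bounds=dict(min=0.00001, max=0.1),
    transformation="log",
  ),
  # Prior Beliefs: a normal prior expressing confidence that good values
  # of momentum lie near 0.9
  dict(
    name="momentum",
    type="double",
    bounds=dict(min=0.0, max=1.0),
    prior=dict(name="normal", mean=0.9, scale=0.05),
  ),
]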

Refer to the Define and Set Up Parameter Space page for more information on how to specify the parameter space in an Experiment.

Metrics

SigOpt Experiments can optimize any real-valued objective function. It is important to understand which metric or metrics to optimize; it can also be helpful to track metrics even when they are not being optimized. SigOpt provides several features and recommendations that can help when dealing with multiple metrics.
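For example, a metric space might optimize one metric while storing a second one for tracking only. A minimal sketch with hypothetical metric names:

metrics = [
  # Optimized metric: SigOpt searches for parameters that maximize it
  dict(name="accuracy", objective="maximize", strategy="optimize"),
  # Stored metric: recorded with each Observation but not optimized
  dict(name="inference_time_ms", strategy="store"),
]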

For more information on how to specify the metric space in an Experiment, refer to the Define and Set Up Metric Space page.

Observation Budget

The number of Observations you plan to create for the experiment is another key to success. For users who are severely constrained by compute resources or long evaluation times, a smaller parameter space can lead to a faster optimization process. Additionally, providing information such as Prior Beliefs can help make the optimization more efficient.

Users can also go beyond the observation budget if they feel that there is more improvement to be gained from additional evaluations.

Since SigOpt takes the observation budget into consideration when making suggestions, we highly recommend staying close to the observation budget set when creating the experiment. While setting an appropriate observation budget is at the user's discretion, SigOpt has some general recommendations on how to choose one.
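If additional evaluations look worthwhile (as noted above), the budget can be revisited. A minimal sketch, assuming the experiment update endpoint accepts an observation_budget field:

conn.experiments(EXPERIMENT_ID).update(observation_budget=60)  # assumed field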

Suggestion

A Suggestion represents the parameter values that SigOpt proposes for your next evaluation. It contains the suggested value for each parameter in the Experiment. The user can request a new Suggestion from SigOpt at any time.

suggestion = conn.experiments(EXPERIMENT_ID).suggestions().create()
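The suggested values are available through the Suggestion's assignments field, keyed by parameter name. A minimal sketch, where the parameter name "learning_rate" is a hypothetical example:

learning_rate = suggestion.assignments["learning_rate"]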

For AI Module users, the Run object subsumes both Suggestion and Observation objects. Please refer to the AI Module specific optimization page on how to run an optimization experiment.

Observation

An Observation represents the observed data from a single trial (or evaluation) in your Experiment.

Once the metrics have been evaluated for the suggested parameter values, the user can report an Observation.

observation = conn.experiments(EXPERIMENT_ID).observations().create(
  suggestion=SUGGESTION_ID,
  values=[
    dict(name="metric_1", value=0.95),
    dict(name="metric_2", value=123.45),
  ],
)


Optimization Loop

The optimization loop is the backbone of using SigOpt. After creating your experiment, run through these three simple steps, in a loop:

  • Receive a Suggestion from SigOpt

  • Evaluate your metrics

  • Report an Observation to SigOpt

Here is what the full optimization loop may look like for a SigOpt experiment.

experiment = conn.experiments(EXPERIMENT_ID).fetch()

# Run the Optimization Loop until the Observation Budget is exhausted
while experiment.progress.observation_count < experiment.observation_budget:
  # Receive a suggestion
  suggestion = conn.experiments(experiment.id).suggestions().create()

  # Evaluate your metrics; evaluate_metric is assumed to return a list of
  # value dicts, e.g. [dict(name="metric_1", value=0.95)]
  values = evaluate_metric(suggestion.assignments, dataset)

  # Report an observation
  conn.experiments(experiment.id).observations().create(
    suggestion=suggestion.id,
    values=values,
  )

  # Update the experiment object
  experiment = conn.experiments(experiment.id).fetch()
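The loop above assumes a user-defined evaluation function. A minimal, hypothetical evaluate_metric for illustration: it ignores the dataset argument, scores a toy objective over an assumed parameter "x", and returns the list of value dicts that observations().create expects.

def evaluate_metric(assignments, dataset):
  # Toy objective; a real implementation would train and evaluate a model
  # using the suggested assignments and the dataset
  score = -((assignments["x"] - 3.0) ** 2)
  return [dict(name="metric_1", value=score)]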
