Metric Constraints

The Metric Constraints feature combines the ideas of Metric Strategy and Metric Thresholds to give you more control over where SigOpt searches for desirable outcomes. For example, some stored metrics might be guardrail metrics, where you want the values of these metrics to satisfy some baseline. In other cases, a Multimetric experiment can be rephrased as a constrained single-metric experiment, where you optimize a primary metric subject to a secondary metric exceeding a certain threshold.

Example: Optimizing Accuracy with Constraint on Inference Time

Recall the neural network example from the Metric Thresholds documentation. Instead of exploring the tradeoff between accuracy and inference time, suppose we only want to optimize the accuracy of the network. However, we are still concerned with the inference time, so we define it as a Constraint Metric with a threshold of 100 ms. SigOpt lets you state such limitations at experiment creation to inform the optimizer of the practical restrictions in your problem.

Defining the Metric Constraints

To designate one of your metrics as a constraint metric, set the strategy field to constraint in the desired Metric object when creating your experiment. You must specify a threshold for the Constraint Metric. When the objective is minimize, observations with constraint metric values less than or equal to the threshold are considered feasible; when the objective is maximize, observations with constraint metric values greater than or equal to the threshold are considered feasible.

Below, we create a new experiment, using the example of optimizing the accuracy of the network subject to an inference time constraint.

Core Module

from sigopt import Connection

conn = Connection(client_token="USER_TOKEN")
experiment = conn.experiments().create(
  name="Neural network with inference time constraint",
  parameters=[
    dict(
      name="learning_rate",
      bounds=dict(
        min=0.0001,
        max=1
        ),
      transformation="log",
      type="double"
      ),
    dict(
      name="nodes_per_layer",
      bounds=dict(
        min=5,
        max=20
        ),
      type="int"
      )
    ],
  metrics=[
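    # Constraint metric: observations are feasible when inference time is at most 100 ms.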
    dict(
      name="inference_time_milliseconds",
      objective="minimize",
      strategy="constraint",
      threshold=100
      ),
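    # Optimized metric: SigOpt maximizes validation accuracy among feasible observations.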
    dict(
      name="validation_accuracy",
      objective="maximize",
      strategy="optimize"
      )
    ],
  observation_budget=65,
  parallel_bandwidth=2,
  type="offline"
  )
print("Created experiment: https://app.sigopt.com/experiment/" + experiment.id)

AI Module

import sigopt

experiment = sigopt.create_experiment(
  name="Single metric optimization with constraint metrics",
  type="offline",
  parameters=[
    dict(
      name="hidden_layer_size",
      type="int",
      bounds=dict(
        min=32,
        max=512
      )
    ),
    dict(
      name="activation_fn",
      type="categorical",
      categorical_values=[
        "relu",
        "tanh"
      ]
    )
  ],
  metrics=[
    dict(
      name="holdout_accuracy",
      strategy="optimize",
      objective="maximize"
    ),
    dict(
      name="inference_time",
      strategy="constraint",
      objective="minimize",
      threshold=0.1
    )
  ],
  parallel_bandwidth=1,
  budget=30
)
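
To illustrate how the created experiment is run, here is a minimal AI module loop sketch; evaluate_model is a hypothetical helper that stands in for your own training and timing code, and both the optimized metric and the constraint metric are logged on every successful run:

# Hypothetical evaluation returning accuracy and inference time
# (in the same units used for the threshold above).
def evaluate_model(hidden_layer_size, activation_fn):
  ...

for run in experiment.loop():
  with run:
    accuracy, inference_time = evaluate_model(
      run.params.hidden_layer_size,
      run.params.activation_fn,
    )
    run.log_metric("holdout_accuracy", accuracy)
    run.log_metric("inference_time", inference_time)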

Updating Constraints

The Metric Constraints feature allows you to update a constraint metric's threshold from the experiment's Properties page at any time while the experiment is in progress.

Core Module

For core module users, thresholds can also be updated directly via the API, as shown below.

# Run the experiment above for some number of observations, then lower the threshold on inference time.

experiment = conn.experiments(experiment.id).update(
  metrics=[
    dict(
      name="inference_time_milliseconds",
      threshold=50
      ),
    dict(
      name="validation_accuracy",
      )
    ],
  )

Feasibility

If the threshold defined for a Constraint Metric cannot be satisfied anywhere in the domain, this feature will behave unpredictably. For example, if you set the threshold for inference time to 0 ms (which is not actually achievable), the optimizer will assume 0 ms is possible and explore the domain erratically trying to find solutions that satisfy the threshold. When defining constraints, it is best to choose threshold values at the start of an experiment that are well understood from previous experiments or prior knowledge.

Best Observations

The Best Assignments List (Best Runs for the AI module) only considers results that satisfy all constraint metric thresholds. In the early stages of an experiment, there may be no observation that satisfies every threshold, and thus no best observations. In this situation, the Best Assignments List returns an empty Pagination, and we recommend waiting until the experiment is further along before retrieving best assignments.
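
A minimal core module sketch of retrieving best assignments and handling the empty case, assuming the conn and experiment objects from the example above:

# Only observations that satisfy every constraint metric threshold are eligible.
best_assignments = conn.experiments(experiment.id).best_assignments().fetch()

if not best_assignments.data:
  # Empty Pagination: no observation satisfies all thresholds yet.
  print("No feasible observations yet; keep optimizing.")
else:
  for best in best_assignments.data:
    print(best.assignments, [(value.name, value.value) for value in best.values])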

Recommendations

  • Set the experiment budget higher. SigOpt Experiments with Metric Constraints require additional data points to understand the associated feasible parameter space. We recommend adding 25% to your original budget for each additional constraint metric (see the sketch after this list).

  • Parallelism is another powerful tool for accelerating tuning by testing multiple observations simultaneously. It can be used in conjunction with Metric Constraints, but each constraint metric requires extra time for SigOpt to compute the associated feasible parameter space. We recommend limiting parallelism to no more than 5 simultaneous observations.
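
As a rough illustration of the budget rule of thumb above (the 25% factor comes from the first recommendation; the baseline budget of 52 is a hypothetical starting point):

num_constraint_metrics = 1
baseline_budget = 52  # hypothetical budget for the same problem without constraints
budget = int(baseline_budget * (1 + 0.25 * num_constraint_metrics))  # 65, matching the core module example above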

Limitations

  • There can be up to 4 constraint metrics for an experiment.

  • Constraint metrics must define threshold values.

  • All specified constraint metrics must be reported for successful observations (see the sketch after this list).

  • No constraint metrics can be reported for Failed observations.

  • A budget must be set for an experiment with Metric Constraints.
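
For illustration, here is a minimal core module optimization loop, assuming the conn and experiment objects from the example above; train_and_evaluate is a hypothetical helper that stands in for your own training code:

# Hypothetical evaluation returning (accuracy, inference time in milliseconds).
def train_and_evaluate(assignments):
  ...

for _ in range(experiment.observation_budget):
  suggestion = conn.experiments(experiment.id).suggestions().create()
  try:
    accuracy, inference_ms = train_and_evaluate(suggestion.assignments)
  except Exception:
    # Failed observations are reported without any metric values,
    # including the constraint metric.
    conn.experiments(experiment.id).observations().create(
      suggestion=suggestion.id,
      failed=True,
    )
    continue
  # Successful observations must report every metric, including the constraint metric.
  conn.experiments(experiment.id).observations().create(
    suggestion=suggestion.id,
    values=[
      dict(name="validation_accuracy", value=accuracy),
      dict(name="inference_time_milliseconds", value=inference_ms),
    ],
  )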
