Metric Thresholds

In Multimetric Experiments, SigOpt searches for the Pareto Frontier: the set of parameter assignments for which no metric can be improved without degrading another. The final outcome of such an experiment is the subset of observations with these efficient metric values. However, only part of the Pareto Frontier may actually be relevant for a given application; some points may not be viable because business requirements force one or both metrics to meet custom criteria.

Metric Thresholds let you specify a region of metric values that defines success for your application. SigOpt's Optimizer uses this information to focus its search on that region, with the intent of resolving the Pareto Frontier more finely where the stated thresholds are satisfied. If you want to place thresholds on non-optimized (guardrail) metrics, see Metric Constraints.

Example: Balancing Accuracy with Speed

Neural networks have great capacity to perform well on complicated machine learning problems; however, this may come at the cost of a very complex model with a slow inference time. We discussed this situation in an earlier blog post, where we used the Metric Thresholds feature to state these practical limitations at experiment creation and direct the SigOpt search toward the regions of interest.

Defining the Metric Thresholds

To assign a threshold to one or more of your metrics, specify the threshold field in the desired Metric object when creating your experiment. You can specify a threshold for any number of metrics in your multimetric experiment. To assign thresholds after an experiment has been created, update the metrics field (see Updating Metric Thresholds below). SigOpt interprets a specified threshold as follows:

  • By default, SigOpt will assume you want to maximize each metric and try to find observations that are greater than or equal to the specified threshold.

  • If you provide an objective of “minimize” for a metric, SigOpt will try to find observations that are less than or equal to the specified threshold.

Core Module

from sigopt import Connection

conn = Connection(client_token="USER_TOKEN")
experiment = conn.experiments().create(
  name="Neural network",
  parameters=[
    dict(
      name="learning_rate",
      bounds=dict(
        min=0.0001,
        max=1
      ),
      transformation="log",
      type="double"
    ),
    dict(
      name="nodes_per_layer",
      bounds=dict(
        min=5,
        max=20
      ),
      type="int"
    )
  ],
  metrics=[
    dict(
      # Only inference times at or below 100 ms satisfy the threshold,
      # since this metric is minimized.
      name="inference_time_milliseconds",
      objective="minimize",
      threshold=100
    ),
    dict(
      # No threshold: any accuracy value is acceptable.
      name="validation_accuracy",
      objective="maximize"
    )
  ],
  observation_budget=65,
  parallel_bandwidth=2,
  type="offline"
)
print("Created experiment: https://app.sigopt.com/experiment/" + experiment.id)

AI Module

import sigopt

experiment = sigopt.create_experiment(
  name="Multimetric optimization with a threshold",
  type="offline",
  parameters=[
    dict(
      name="hidden_layer_size",
      type="int",
      bounds=dict(
        min=32,
        max=128
      )
    ),
    dict(
      name="activation_fn",
      type="categorical",
      categorical_values=[
        "relu",
        "tanh"
      ]
    )
  ],
  metrics=[
    dict(
      name="holdout_accuracy",
      strategy="optimize",
      objective="maximize"
    ),
    dict(
      # Only inference times at or below 0.1 satisfy the threshold,
      # since this metric is minimized.
      name="inference_time",
      strategy="optimize",
      objective="minimize",
      threshold=0.1
    )
  ],
  parallel_bandwidth=1,
  budget=30
)
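
The AI Module run loop then reports both metrics on each run. Below is a minimal sketch, assuming the experiment object returned by sigopt.create_experiment above; evaluate_model is a hypothetical helper that returns the two metric values for the suggested parameters.

# Run-loop sketch for the AI Module. evaluate_model is a hypothetical
# helper returning (holdout accuracy, inference time) for given parameters.
for run in experiment.loop():
  with run:
    accuracy, inference_time = evaluate_model(
      hidden_layer_size=run.params.hidden_layer_size,
      activation_fn=run.params.activation_fn
    )
    run.log_metric("holdout_accuracy", accuracy)
    run.log_metric("inference_time", inference_time)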

Updating Metric Thresholds

Metric thresholds can be updated on the properties page of an experiment at any time while the experiment is in progress.

Core Module

For Core Module users, thresholds can also be updated directly via the API, as shown below.

experiment = conn.experiments(experiment.id).update(
  metrics=[
    dict(
      name="inference_time_milliseconds",
      threshold=85
      ),
    dict(
      name="validation_accuracy",
      threshold=0.79
      )
    ]
  )

Notes on Feasibility

If no feasible points in the domain satisfy the defined metric thresholds, this feature will behave unpredictably. SigOpt does not know in advance what is feasible, so setting unrealistic goals hinders its ability to exploit the information it gains while optimizing toward the threshold you've specified. For example, if the accuracy threshold for a machine learning model is set to 984 when the actual accuracy lies between 0 and 1, this feature will assume 984 is achievable and explore the domain erratically trying to find solutions that satisfy this threshold.

When defining thresholds, it is best

  • to state well-established thresholds at the start of an experiment if you have the data to support them, or

  • to update the experiment after some feasible metric values have been observed, as sketched below.
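
For the second approach, you can ground updated thresholds in metric values that have already been recorded, so the targets are known to be feasible. Below is a minimal Core Module sketch for the experiment above; the median cutoffs are arbitrary and only illustrative.

# Gather the metric values observed so far, then set thresholds that are
# known to be feasible. The median cutoffs here are arbitrary choices.
observations = conn.experiments(experiment.id).observations().fetch()
times, accuracies = [], []
for observation in observations.iterate_pages():
  for value in observation.values:
    if value.name == "inference_time_milliseconds":
      times.append(value.value)
    elif value.name == "validation_accuracy":
      accuracies.append(value.value)

conn.experiments(experiment.id).update(
  metrics=[
    # Require inference no slower than the median time observed.
    dict(name="inference_time_milliseconds", threshold=sorted(times)[len(times) // 2]),
    # Require accuracy no worse than the median observed.
    dict(name="validation_accuracy", threshold=sorted(accuracies)[len(accuracies) // 2])
  ]
)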

Limitations
