Metric Thresholds
In Multimetric Experiments, SigOpt searches for the Pareto Frontier: the parameter assignments whose metric values cannot be improved on one metric without being worsened on another. The final outcome of such an experiment is the subset of observations with efficient metric values. Often, however, only part of the Pareto Frontier is relevant for a given application: some efficient points may not be viable because business requirements force one or both metrics to meet custom criteria.
Metric Thresholds let you specify the region of metric values that defines success for your application. SigOpt's optimizer uses this information to concentrate sampling in that region, better resolving the portion of the Pareto Frontier that satisfies the stated thresholds. To place thresholds on non-optimized (guardrail) metrics, see Metric Constraints instead.
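To make this dominance notion concrete, here is a minimal sketch of Pareto dominance for two maximized metrics. It is illustrative only and not part of the SigOpt client; dominates and pareto_frontier are hypothetical helpers.
Python
# a dominates b if a is at least as good on every metric and strictly
# better on at least one; the Pareto Frontier is the non-dominated set
def dominates(a, b):
  return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def pareto_frontier(points):
  return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (accuracy, throughput) pairs: (0.7, 0.4) is dominated by (0.8, 0.5)
print(pareto_frontier([(0.9, 0.2), (0.8, 0.5), (0.7, 0.4)]))  # [(0.9, 0.2), (0.8, 0.5)]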

The SigOpt-optimized efficient frontier is illustrated by the orange points (left), but not all of these solutions are relevant. By placing a threshold on the inference_time metric (right), SigOpt can focus sampling on more promising regions.
Neural networks have great capacity to perform well in complicated machine learning settings; however, this can come at the cost of a very complex model with slow inference. We discussed this situation in an earlier blog post, where we used the Metric Thresholds feature to state these practical limitations at experiment creation and direct the SigOpt search toward the regions of interest.
To assign a threshold to one or more of your metrics, specify the threshold field in the desired Metric object when creating your experiment. You can specify a threshold for any number of metrics in your multimetric experiment. To assign thresholds after an experiment has been created, update the metrics field, as shown below. SigOpt will interpret the specified threshold in the following ways (see the sketch after this list):
- By default, SigOpt will assume you want to maximize each metric and try to find observations that are greater than or equal to the specified threshold.
- If you provide an objective of “minimize” for a metric, SigOpt will try to find observations that are less than or equal to the specified threshold.
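For illustration only, here is a minimal sketch of how these two rules classify a metric value; satisfies_threshold is a hypothetical helper, not part of the SigOpt client:
Python
# "maximize" metrics must be >= the threshold; "minimize" metrics must be <=
def satisfies_threshold(value, threshold, objective="maximize"):
  if threshold is None:
    return True  # this metric has no threshold
  if objective == "minimize":
    return value <= threshold
  return value >= threshold

assert satisfies_threshold(92, 100, objective="minimize")         # 92 ms <= 100 ms
assert not satisfies_threshold(0.75, 0.79, objective="maximize")  # 0.75 < 0.79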
Python
from sigopt import Connection
conn = Connection(client_token="USER_TOKEN")
experiment = conn.experiments().create(
  name="Neural network",
  parameters=[
    dict(
      name="learning_rate",
      bounds=dict(
        min=0.0001,
        max=1
      ),
      transformation="log",
      type="double"
    ),
    dict(
      name="nodes_per_layer",
      bounds=dict(
        min=5,
        max=20
      ),
      type="int"
    )
  ],
  metrics=[
    dict(
      name="inference_time_milliseconds",
      objective="minimize",
      threshold=100  # only observations at or below 100 ms satisfy the threshold
    ),
    dict(
      name="validation_accuracy",
      objective="maximize"  # no threshold: any accuracy value is acceptable
    )
  ],
  observation_budget=65,
  parallel_bandwidth=2,
  type="offline"
)
print("Created experiment: https://app.sigopt.com/experiment/" + experiment.id)
Bash
EXPERIMENT=`curl -s -X POST https://api.sigopt.com/v1/experiments -u "$SIGOPT_API_TOKEN": \
  -H "Content-Type: application/json" -d "$(cat experiment_meta.json)"`
JSON file defining the Experiment:
experiment_meta.json
{
  "name": "Neural network",
  "parameters": [
    {
      "name": "learning_rate",
      "bounds": {
        "min": 0.0001,
        "max": 1
      },
      "transformation": "log",
      "type": "double"
    },
    {
      "name": "nodes_per_layer",
      "bounds": {
        "min": 5,
        "max": 20
      },
      "type": "int"
    }
  ],
  "metrics": [
    {
      "name": "inference_time_milliseconds",
      "objective": "minimize",
      "threshold": 100
    },
    {
      "name": "validation_accuracy",
      "objective": "maximize"
    }
  ],
  "observation_budget": 65,
  "parallel_bandwidth": 2,
  "type": "offline"
}
Java
import com.sigopt.SigOpt;
import com.sigopt.exception.SigoptException;
import com.sigopt.model.*;
import java.util.Arrays;

public class YourSigoptExperiment {
  public static Experiment createExperiment() throws SigoptException {
    Experiment experiment = Experiment.create()
      .data(
        new Experiment.Builder()
          .name("Neural network")
          .parameters(Arrays.asList(
            new Parameter.Builder()
              .name("learning_rate")
              .bounds(new Bounds.Builder()
                .min(0.0001)
                .max(1)
                .build())
              .transformation("log")
              .type("double")
              .build(),
            new Parameter.Builder()
              .name("nodes_per_layer")
              .bounds(new Bounds.Builder()
                .min(5)
                .max(20)
                .build())
              .type("int")
              .build()
          ))
          .metrics(Arrays.asList(
            new Metric.Builder()
              .name("inference_time_milliseconds")
              .objective("minimize")
              .threshold(100)
              .build(),
            new Metric.Builder()
              .name("validation_accuracy")
              .objective("maximize")
              .build()
          ))
          .observationBudget(65)
          .parallelBandwidth(2)
          .type("offline")
          .build()
      )
      .call();
    return experiment;
  }
}
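Once the experiment exists, optimization proceeds through the usual suggestion/observation cycle. Below is a minimal sketch of that loop with the core-module Python client; evaluate_model is a hypothetical stand-in for your own training and timing code:
Python
for _ in range(experiment.observation_budget):
  suggestion = conn.experiments(experiment.id).suggestions().create()
  # evaluate_model is a placeholder returning (milliseconds, accuracy)
  inference_time, accuracy = evaluate_model(suggestion.assignments)
  conn.experiments(experiment.id).observations().create(
    suggestion=suggestion.id,
    values=[
      dict(name="inference_time_milliseconds", value=inference_time),
      dict(name="validation_accuracy", value=accuracy)
    ]
  )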
If you use the AI module client (sigopt.create_experiment), thresholds are specified the same way:
Python
sigopt.create_experiment(
  name="Multimetric optimization with a threshold",
  type="offline",
  parameters=[
    dict(
      name="hidden_layer_size",
      type="int",
      bounds=dict(
        min=32,
        max=128
      )
    ),
    dict(
      name="activation_fn",
      type="categorical",
      categorical_values=[
        "relu",
        "tanh"
      ]
    )
  ],
  metrics=[
    dict(
      name="holdout_accuracy",
      strategy="optimize",
      objective="maximize"
    ),
    dict(
      name="inference_time",
      strategy="optimize",
      objective="minimize",
      threshold=0.1
    )
  ],
  parallel_bandwidth=1,
  budget=30
)
YAML
name: Multimetric optimization with a threshold
type: offline
parameters:
  - name: hidden_layer_size
    type: int
    bounds:
      min: 32
      max: 128
  - name: activation_fn
    type: categorical
    categorical_values:
      - relu
      - tanh
metrics:
  - name: holdout_accuracy
    strategy: optimize
    objective: maximize
  - name: inference_time
    strategy: optimize
    objective: minimize
    threshold: 0.1
parallel_bandwidth: 1
budget: 30
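A minimal sketch of the AI module run loop that reports both metrics for the experiment above; train_and_time is a hypothetical user function standing in for your training and benchmarking code:
Python
# `experiment` is the object returned by sigopt.create_experiment above
for run in experiment.loop():
  with run:
    # train_and_time is a placeholder returning (accuracy, seconds)
    accuracy, seconds = train_and_time(run.params)
    run.log_metric("holdout_accuracy", accuracy)
    run.log_metric("inference_time", seconds)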
Metric thresholds can be updated on the properties page of an experiment at any time while the experiment is in progress.
For core module users, the thresholds can also be updated directly via the API. An example is given below.
Python
experiment = conn.experiments(experiment.id).update(
  metrics=[
    dict(
      name="inference_time_milliseconds",
      threshold=85
    ),
    dict(
      name="validation_accuracy",
      threshold=0.79
    )
  ]
)
Bash
EXPERIMENT=`curl -s -X PUT https://api.sigopt.com/v1/experiments/EXPERIMENT_ID -u "$SIGOPT_API_TOKEN": \
  -H 'Content-Type: application/json' \
  -d "{\"metrics\":[{\"name\":\"inference_time_milliseconds\",\"threshold\":85},{\"name\":\"validation_accuracy\",\"threshold\":0.79}]}"`
Java
import com.sigopt.SigOpt;
import com.sigopt.exception.SigoptException;
import com.sigopt.model.*;
import java.util.Arrays;

Experiment.update(experiment.getId())
  .data(
    new Experiment.Builder()
      .metrics(Arrays.asList(
        new Metric.Builder()
          .name("inference_time_milliseconds")
          .threshold(85)
          .build(),
        new Metric.Builder()
          .name("validation_accuracy")
          .threshold(0.79)
          .build()
      ))
      .build()
  )
  .call();
If the defined metric thresholds are not feasible anywhere in the domain, this feature will perform unpredictably. SigOpt does not know beforehand what is feasible, so setting unrealistic goals hinders its ability to exploit the information it gains while optimizing toward the threshold you have specified. For example, if the accuracy threshold for a machine learning model is set to 984 when the actual accuracy lies between 0 and 1, this feature will assume the value 984 is actually achievable and explore the domain erratically trying to find solutions that satisfy this threshold.
When defining thresholds, it is best
- to state well-established thresholds at the start of an experiment if you have the data, or
- to update the experiment after some feasible observations have been observed (one possible approach is sketched below).
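If you take the second approach, one way to choose a feasible value is to derive it from metric values already observed and then update the experiment. The sketch below uses the core-module client and the multimetric experiment from earlier; the 75th-percentile choice is illustrative, not a recommendation from SigOpt:
Python
# collect the observed values of one metric from the experiment so far
observations = conn.experiments(experiment.id).observations().fetch()
accuracies = sorted(
  value.value
  for observation in observations.iterate_pages()
  for value in observation.values
  if value.name == "validation_accuracy"
)
# pick a threshold that has actually been achieved, e.g. the 75th percentile
feasible_threshold = accuracies[int(0.75 * (len(accuracies) - 1))]
conn.experiments(experiment.id).update(
  metrics=[
    dict(name="inference_time_milliseconds", threshold=100),  # unchanged
    dict(name="validation_accuracy", threshold=feasible_threshold)
  ]
)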