
Define and Set Up Metric Space

Similar to defining the hyperparameter space, you need to define the metric space to run a SigOpt Experiment.
The entire metric space is defined as a list of objects, where each object represents a metric.
Each metric is given a name, an objective, and a strategy. Metrics with a strategy of constraint must have a threshold; metrics with a strategy of optimize can optionally be assigned a threshold.
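For example, a metric space pairing an optimized metric with a constraint metric could be written as the following sketch (the metric names and threshold value here are illustrative, not prescribed by SigOpt):

```python
# Illustrative metric space: one optimized metric and one constraint metric.
# Each metric is a plain dict with a name, an objective, a strategy, and
# (required for constraint metrics) a threshold.
metrics = [
    dict(
        name="accuracy",        # used to keep track of this metric
        objective="maximize",   # the default objective
        strategy="optimize",    # the default strategy
    ),
    dict(
        name="inference_latency_ms",
        objective="minimize",
        strategy="constraint",
        threshold=100.0,        # constraint metrics must set a threshold
    ),
]
```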

name: str

The metric name is a string used to keep track of the different metrics in the experiment.

objective: maximize, minimize

The objective flag specifies whether a metric should be maximized (the default) or minimized, by setting it to maximize or minimize.
Python
    dict(
        name="name",
        objective="maximize",
        strategy="optimize"
    )

YAML
    metrics:
    - name: name
      objective: maximize
      strategy: optimize

Python
    dict(
        name="name",
        objective="minimize",
        strategy="optimize"
    )

YAML
    metrics:
    - name: name
      objective: minimize
      strategy: optimize

strategy: optimize, constraint, store

SigOpt supports three types of metrics: optimized (default), constrained, and stored. They are selected by setting strategy to optimize, constraint, or store, respectively.
Optimized metrics are metrics whose minimum or maximum value SigOpt searches for. SigOpt handles up to two optimized metrics; for more information, visit multimetric optimization.
Each optimized metric can optionally specify a threshold, a floating-point number representing a threshold for success; for more information, visit metric threshold.
Python
    dict(
        name="name",
        objective="minimize",
        strategy="optimize"
    )

YAML
    metrics:
    - name: name
      objective: minimize
      strategy: optimize

Python
    dict(
        name="name",
        objective="minimize",
        threshold=100,
        strategy="optimize"
    )

YAML
    metrics:
    - name: name
      objective: minimize
      threshold: 100
      strategy: optimize
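A multimetric space with two optimized metrics, one of them carrying an optional success threshold, might look like the following sketch (the metric names and threshold value are illustrative):

```python
# Illustrative multimetric space: SigOpt supports at most two metrics
# with strategy="optimize" in a single experiment.
metrics = [
    dict(name="f1_score", objective="maximize", strategy="optimize"),
    dict(
        name="model_size_mb",
        objective="minimize",
        strategy="optimize",
        threshold=50.0,  # optional success threshold on an optimized metric
    ),
]

# Local sanity check against the documented limit of two optimized metrics.
assert sum(m["strategy"] == "optimize" for m in metrics) <= 2
```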
Constraint metrics define success thresholds without the need to find a minimum or maximum value. SigOpt handles up to four constraint metrics; for more information, visit metric constraints. The constraint is specified as a threshold, a floating-point number representing a threshold for success.
Python
    dict(
        name="name",
        objective="minimize",
        threshold=100,
        strategy="constraint"
    )

YAML
    metrics:
    - name: name
      objective: minimize
      threshold: 100
      strategy: constraint
Store metrics are for tracking purposes. SigOpt can handle up to 50 stored metrics per training run or experiment.
Python
    dict(
        name="name",
        objective="maximize",
        strategy="store"
    )

YAML
    metrics:
    - name: name
      objective: maximize
      strategy: store
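The limits described above (at most two optimized metrics, at most four constraint metrics, at most 50 stored metrics, and a required threshold on every constraint metric) can be checked locally before an experiment is created. The validate_metrics helper below is a hypothetical convenience for that purpose, not part of the SigOpt client:

```python
def validate_metrics(metrics):
    """Check a metric-space definition against the documented limits.

    Raises ValueError on the first violation; returns the list unchanged
    otherwise. This is a local sanity check, not a SigOpt API call.
    """
    optimized = [m for m in metrics if m.get("strategy", "optimize") == "optimize"]
    constraints = [m for m in metrics if m.get("strategy") == "constraint"]
    stored = [m for m in metrics if m.get("strategy") == "store"]

    if len(optimized) > 2:
        raise ValueError("at most 2 optimized metrics are supported")
    if len(constraints) > 4:
        raise ValueError("at most 4 constraint metrics are supported")
    if len(stored) > 50:
        raise ValueError("at most 50 stored metrics are supported")
    for m in constraints:
        if "threshold" not in m:
            raise ValueError(f"constraint metric {m['name']!r} needs a threshold")
    return metrics
```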

Metric Failures

Observations and Runs report the value of your metric evaluated at a given set of parameters. Sometimes, however, a set of parameters is not feasible to evaluate. We call this a metric failure, since the model failed to produce a metric value. SigOpt can handle and track such failures to learn the feasible regions of your domain.
Cases we recommend marking as Failures:
  • Evaluating a metric is not possible because the SigOpt-suggested parameter assignments are infeasible or not of interest
  • Certain parameter configurations lead to out-of-memory or runtime errors
If model training abruptly stops because a machine randomly fails, we recommend deleting that Run or Observation instead of marking it as failed.
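One way to apply these recommendations in code is to classify exceptions raised during evaluation: out-of-memory and runtime errors triggered by the suggested parameters are reported as failed, while anything else is treated as an unrelated machine failure and the Run or Observation is deleted instead. The helper below is an illustrative sketch, not part of the SigOpt API:

```python
def should_mark_failed(exc):
    """Return True if the exception indicates the parameter configuration
    itself is infeasible (report the Observation/Run as failed), or False
    if it looks like an unrelated machine failure (delete it instead).
    """
    # Out-of-memory and runtime errors caused by the suggested parameters
    # are genuine metric failures worth reporting to SigOpt.
    if isinstance(exc, (MemoryError, RuntimeError)):
        return True
    # Anything else (e.g. a machine randomly failing) tells SigOpt nothing
    # about the feasibility of the parameter space.
    return False
```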

Core Module

Python
    from sigopt import Connection

    conn = Connection(client_token="USER_TOKEN")
    observation = conn.experiments(EXPERIMENT_ID).observations().create(
        failed=True,
        suggestion="SUGGESTION_ID"
    )

Java
    import com.sigopt.SigOpt;
    import com.sigopt.exception.SigoptException;
    import com.sigopt.model.*;

    public class YourSigoptExperiment {
        public static Observation createFailedObservation() throws SigoptException {
            Observation observation = new Experiment(EXPERIMENT_ID).observations().create()
                .data(
                    new Observation.Builder()
                        .failed(true)
                        .suggestion("SUGGESTION_ID")
                        .build()
                )
                .call();
            return observation;
        }
    }

AI Module

sigopt.log_failure()
If an infeasible region of the parameter space is known beforehand, it may be possible to exclude it up front with Parameter Constraints. When feasibility is instead defined by thresholding auxiliary, non-optimized metric values, it may be more beneficial to use Metric Constraints.