Define and Set Up Metric Space
Similar to defining the hyperparameter space, you need to define the metric space to run a SigOpt Experiment.
The entire metric space is defined as a list of objects, where each object represents a metric.
Each metric is given a name, an objective, and a strategy. Metrics with a strategy of constraint must have a threshold. Metrics with a strategy of optimize can optionally be assigned a threshold.
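For example, a metric space with two metrics can be written as a plain list of objects. This is a minimal sketch in Python; the metric names are illustrative:

```python
# A metric space: a list of objects, each describing one metric.
metrics = [
    dict(name="accuracy", objective="maximize"),           # strategy defaults to optimize
    dict(name="inference_time_ms", objective="minimize"),  # a second optimized metric
]
```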
name: str
The metric name is a string used to keep track of the different metrics in the experiment.
objective: maximize, minimize
The objective specifies whether a metric should be maximized (the default) or minimized, set by passing either maximize or minimize.
strategy: optimize, constraint, store
SigOpt supports three types of metrics: optimized (the default), constrained, and stored. They are selected by setting the strategy to optimize, constraint, or store, respectively.
Optimize metrics are for finding a minimum or a maximum value. SigOpt handles up to two optimized metrics; for more information, visit multimetric optimization. Each optimized metric can optionally specify a threshold, a floating-point number defining the bar for success; for more information, visit metric threshold.
Constraint metrics define success thresholds without the need to find a minimum or maximum value. SigOpt handles up to four constraint metrics; for more information, visit metric constraints. The constraint is specified as a threshold, a floating-point number defining the bar for success.
Store metrics are for tracking purposes. SigOpt can handle up to 50 stored metrics per training run or experiment.
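Putting the three strategies together, an experiment definition might look like the sketch below, using the sigopt Python client's create_experiment call. The names, bounds, threshold, and budget are illustrative assumptions:

```python
import sigopt

# A sketch combining all three metric strategies (illustrative names and values).
experiment = sigopt.create_experiment(
    name="metric-strategies-example",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
    ],
    metrics=[
        # Optimized metric (default strategy): SigOpt searches for its maximum.
        dict(name="accuracy", objective="maximize", strategy="optimize"),
        # Constraint metric: a threshold is required; here success means
        # inference_time_ms <= 50.
        dict(name="inference_time_ms", objective="minimize",
             strategy="constraint", threshold=50.0),
        # Stored metric: recorded for tracking only; it does not steer the search.
        dict(name="training_cost", strategy="store"),
    ],
    budget=30,
)
```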
Metric Failures
Observations or Runs report the value of your metric evaluated on a given set of parameters. However, sometimes a set of parameters is not feasible to evaluate. We call this a metric failure, since the model failed to produce a metric value. SigOpt can handle and track such failures to learn the feasible regions of your domain.
Cases we recommend marking as Failures:
Evaluating a metric is not possible because the SigOpt-provided parameter assignments are infeasible or not of interest
Certain parameter configurations lead to out-of-memory or runtime errors
If model training abruptly stops because a machine randomly fails, we recommend deleting that Run or Observation instead of marking it as failed.
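How a failure is reported depends on which API you use. The sketch below shows both styles: the Core Module reports an Observation with failed=True, while the AI Module marks the active Run as failed. The token, experiment ID, run name, and the log_failure call are illustrative assumptions:

```python
import sigopt
from sigopt import Connection

# Core Module: report a failed Observation for a Suggestion
# (token and experiment ID are placeholders).
conn = Connection(client_token="YOUR_API_TOKEN")
suggestion = conn.experiments("EXPERIMENT_ID").suggestions().create()
# ... training with suggestion.assignments hits an out-of-memory error ...
conn.experiments("EXPERIMENT_ID").observations().create(
    suggestion=suggestion.id,
    failed=True,  # a metric failure: no metric values are reported
)

# AI Module: mark the active Run as failed (log_failure is assumed here).
with sigopt.create_run(name="infeasible-config") as run:
    run.log_failure()
```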
If an infeasible region of the parameter space is known beforehand, it may be possible to exclude it up front with Parameter Constraints. When feasibility is instead defined by thresholding on auxiliary, non-optimized metric values, Metric Constraints may be the better fit.
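As an illustration of the first option, a linear Parameter Constraint can be supplied at experiment creation time in the Core Module. The parameter names, weights, and threshold below are assumptions for the sketch:

```python
from sigopt import Connection

conn = Connection(client_token="YOUR_API_TOKEN")  # placeholder token

# Exclude a known-infeasible region up front: only suggest points with x + y <= 1.
experiment = conn.experiments().create(
    name="constrained-space",
    parameters=[
        dict(name="x", type="double", bounds=dict(min=0, max=1)),
        dict(name="y", type="double", bounds=dict(min=0, max=1)),
    ],
    linear_constraints=[
        dict(
            type="less_than",
            threshold=1.0,
            terms=[dict(name="x", weight=1), dict(name="y", weight=1)],
        ),
    ],
    metrics=[dict(name="objective_value", objective="maximize")],
    observation_budget=20,
)
```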