Intelligent Optimization
With SigOpt you don’t have to leave tuning until the last minute. Our solution fully integrates automated hyperparameter tuning with training run tracking, giving you the bigger picture and a clear path to your best model. With features like highly customizable search spaces and multimetric optimization, SigOpt offers a simple API for sophisticated hyperparameter tuning before you take your model into production.

SigOpt Experiments

A SigOpt Experiment is an automated search of your model's hyperparameter space. A SigOpt Experiment works as follows:
You start by defining the hyperparameter and metric space. From there, you use the SigOpt API to request SigOpt Runs, in which the SigOpt Optimizer intelligently suggests hyperparameter configurations for you to evaluate, converging on the best configuration for your model.
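As a sketch, this define-then-optimize loop might look like the following, using SigOpt's classic Python client. The parameter names, metric, budget, and the `evaluate_model` callback are illustrative assumptions, not part of the documentation above:

```python
# Illustrative hyperparameter and metric space (all names and values are examples).
EXPERIMENT_META = dict(
    name="Classifier tuning",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="max_depth", type="int", bounds=dict(min=2, max=10)),
    ],
    metrics=[dict(name="accuracy", objective="maximize")],
    observation_budget=30,
)

def run_experiment(api_token, evaluate_model):
    """Suggest/observe loop: SigOpt proposes configurations, you report results.

    `evaluate_model(assignments)` is your own training/evaluation function;
    it should return the metric value for the suggested configuration.
    """
    from sigopt import Connection  # imported lazily; requires the sigopt package

    conn = Connection(client_token=api_token)
    experiment = conn.experiments().create(**EXPERIMENT_META)
    for _ in range(EXPERIMENT_META["observation_budget"]):
        suggestion = conn.experiments(experiment.id).suggestions().create()
        value = evaluate_model(suggestion.assignments)
        conn.experiments(experiment.id).observations().create(
            suggestion=suggestion.id,
            value=value,
        )
    # Fetch the best configuration found by the optimizer.
    return conn.experiments(experiment.id).best_assignments().fetch()
```

Each pass through the loop asks the SigOpt Optimizer for a suggested configuration, trains and scores the model with it, and reports the result back so the next suggestion can improve on it.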
SigOpt Experiments support the following types of optimization:
  • Grid Search: execute grid search with SigOpt
  • Random Search: execute random search with SigOpt
  • SigOpt Search: use our proprietary and world-class Bayesian Optimizer to search your parameter space and find the most performant parameter values
  • All Constraint Search: leverage our Bayesian Optimizer to emphasize exploring the parameter space by defining metrics as constraints (guardrails) instead of optimization objectives
  • Bring your own optimizer and track with SigOpt: use your preferred optimizer, or test alternatives, and track everything consistently for visualization in SigOpt’s web dashboard
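To make the distinctions above concrete, here are sketched experiment definitions for grid search, random search, and an all-constraint setup. Field names follow SigOpt's experiment-creation API as best understood; every specific value (names, bounds, thresholds) is an illustrative assumption:

```python
# Grid search: enumerate explicit values for each parameter.
grid_search = dict(
    name="Grid search baseline",
    type="grid",
    parameters=[
        dict(name="learning_rate", type="double", grid=[1e-4, 1e-3, 1e-2, 1e-1]),
    ],
    metrics=[dict(name="accuracy", objective="maximize")],
)

# Random search: same parameter space, with suggestions drawn at random.
random_search = dict(
    name="Random search baseline",
    type="random",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
    ],
    metrics=[dict(name="accuracy", objective="maximize")],
    observation_budget=30,
)

# All Constraint search: metrics act as guardrails (thresholds to satisfy)
# rather than objectives to optimize, which pushes the optimizer to explore.
all_constraint = dict(
    name="All-constraint exploration",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
    ],
    metrics=[
        dict(name="accuracy", strategy="constraint", threshold=0.9),
        dict(name="latency_ms", strategy="constraint", threshold=50,
             objective="minimize"),
    ],
    observation_budget=30,
)
```

Any of these definitions would be passed to experiment creation in place of a default (Bayesian) experiment; the suggest/observe loop itself stays the same regardless of search type.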
For a complete list of functionality, see the API Reference or the SigOpt Experiment docs.
SigOpt Experiments can be recorded by adding a few code snippets to Python code that you run in a notebook or from the command line.