AI Experiments
Optimize Your Model
A key component of the SigOpt Platform is the ability to go from tracking your model with SigOpt Runs to optimizing that very same model with minimal changes to your code.
At a high level, a SigOpt AI Experiment is a grouping of SigOpt Runs, defined by the parameter space and metric space you specify. An AI Experiment has a budget that determines how many hyperparameter tuning loops to conduct, and each loop produces a SigOpt Run with suggested assignments for each parameter. Sets of hyperparameter values can be suggested by the SigOpt optimization algorithms, by the user, or by both, with the goal of finding the optimal set(s) of hyperparameters. Over time, when using the SigOpt Optimizer, you can expect your model to perform better on your metrics.
The Optimization Loop
There are 3 core steps in the optimization loop:
Create a SigOpt AI Experiment
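As a minimal sketch, the snippet below creates an AI Experiment with the sigopt Python client. The experiment name, parameter names, bounds, metric, and budget are illustrative assumptions, not values tied to any particular model, and the client is assumed to be configured with your API token:

```python
import sigopt

experiment = sigopt.create_experiment(
    name="example-optimization",  # assumed name, purely illustrative
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="batch_size", type="int", bounds=dict(min=16, max=128)),
    ],
    metrics=[dict(name="accuracy", strategy="optimize", objective="maximize")],
    budget=20,  # caps how many optimization loops (Runs) to conduct
)
```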
Iterate over your AI Experiment
You can iterate over your AI Experiment in 2 ways, shown in the sketch below. Each optimization loop produces a SigOpt Run with suggested assignments for each parameter.
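Here is a sketch of both styles, assuming the experiment object from the previous step and a hypothetical train_and_evaluate helper that trains your model and returns its accuracy:

```python
# Way 1: let the client drive the loop until the experiment budget is exhausted.
for run in experiment.loop():
    with run:
        # run.params holds the suggested assignment for each parameter.
        accuracy = train_and_evaluate(run.params.learning_rate, run.params.batch_size)

# Way 2: manage the loop yourself, creating one Run per iteration.
while not experiment.is_finished():
    with experiment.create_run() as run:
        accuracy = train_and_evaluate(run.params.learning_rate, run.params.batch_size)
```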
Report metric values to SigOpt
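Inside the loop body, report each metric value for the current Run so SigOpt can use it when suggesting the next set of assignments. Continuing the sketch above:

```python
# After computing the metric value for this Run's suggested assignments:
run.log_metric("accuracy", accuracy)
```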
Putting It All Together
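Combining the three steps gives a complete, minimal script. This is a sketch under the same assumptions as the snippets above; train_and_evaluate is a stand-in you would replace with real training code:

```python
import math
import sigopt

def train_and_evaluate(learning_rate, batch_size):
    # Stand-in for real training code; returns a fake "accuracy"
    # that depends on the suggested hyperparameters.
    return 1.0 / (1.0 + abs(math.log10(learning_rate) + 2.0)) - 0.001 * abs(batch_size - 64)

# Step 1: create the AI Experiment (parameter space, metric space, budget).
experiment = sigopt.create_experiment(
    name="example-optimization",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="batch_size", type="int", bounds=dict(min=16, max=128)),
    ],
    metrics=[dict(name="accuracy", strategy="optimize", objective="maximize")],
    budget=20,
)

# Steps 2 and 3: iterate over the AI Experiment, training with each Run's
# suggested assignments and reporting the resulting metric value.
for run in experiment.loop():
    with run:
        accuracy = train_and_evaluate(run.params.learning_rate, run.params.batch_size)
        run.log_metric("accuracy", accuracy)
```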