# Main Concepts

## Experiment

An Experiment consists of a set of parameters that you are searching over and the associated metric (or metrics) that you are optimizing. Defining the Experiment well is crucial to the success of your optimization problem. The following subsections cover the key components to consider when creating an Experiment: the parameters, the metrics, and the observation budget.

### Parameters

Parameters are a crucial part of every SigOpt Experiment, defining the domain to be searched. SigOpt supports double, integer, and categorical parameter types.
Defining a meaningful domain is an important step towards a successful application. Users often have prior knowledge about how parameters relate to one another and to the optimization problem, and leveraging that knowledge can improve performance and yield better results.
Therefore, in addition to parameter types, SigOpt provides several features and recommendations that can help construct a domain for a specific problem:
• Grid Values - optimize over a specific set of numerical values.
• Logarithmic Parameter Transformation - search a parameter on a logarithmic scale.
• Prior Beliefs - define the distribution of a parameter.
• Parameter Constraints - restrict the parameter space with linear inequality constraints.
Refer to the Define and Set Up Parameter Space page for more information on how to specify the parameter space in an Experiment.
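As a sketch of how these features come together, the parameter space below combines a log-transformed double, grid values for an integer, and a categorical parameter. The parameter names (`learning_rate`, `num_layers`, `optimizer`) and values are illustrative assumptions; consult the Define and Set Up Parameter Space page for the authoritative schema.

```python
# Illustrative parameter space for a SigOpt Experiment, expressed as the
# list of dicts passed when creating the experiment. Names and ranges are
# hypothetical examples, not recommendations.
parameters = [
    # Double parameter searched on a log scale (Logarithmic Parameter Transformation)
    dict(
        name="learning_rate",
        type="double",
        bounds=dict(min=1e-5, max=1e-1),
        transformation="log",
    ),
    # Integer parameter restricted to a specific set of values (Grid Values)
    dict(name="num_layers", type="int", grid=[2, 4, 8, 16]),
    # Categorical parameter
    dict(
        name="optimizer",
        type="categorical",
        categorical_values=["sgd", "adam", "rmsprop"],
    ),
]
```

A list like this would be supplied as the `parameters` field when creating the Experiment through the API.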

### Metrics

SigOpt Experiments can optimize any real-valued objective function. It is important to understand which metric or metrics to optimize. It can also be helpful to track metrics even when they are not being optimized. SigOpt provides several features and recommendations that can help when working with multiple metrics.
For more information on how to specify the metric space in an Experiment, refer to the Define and Set Up Metric Space page.
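A minimal sketch of a metric space with one optimized metric and one tracked-but-not-optimized metric might look like the following. The metric names are illustrative assumptions; the Define and Set Up Metric Space page has the authoritative schema.

```python
# Illustrative metric space for a SigOpt Experiment: "accuracy" is
# optimized, while "inference_time" is stored for tracking only.
# Metric names are hypothetical examples.
metrics = [
    dict(name="accuracy", objective="maximize", strategy="optimize"),
    dict(name="inference_time", strategy="store"),
]
```

A list like this would be supplied as the `metrics` field when creating the Experiment through the API.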

### Observation Budget

The number of Observations you plan to create for an Experiment is another key to success. For users who are severely limited by available compute resources or long evaluation times, a smaller parameter space can lead to a faster optimization process. Providing additional information, such as Prior Beliefs, can also make the optimization more efficient.
Users can go beyond the observation budget if they feel there is more improvement to be gained from additional evaluations.
Since SigOpt takes the observation budget into consideration when making suggestions, we highly recommend staying close to the observation budget set when the Experiment was created. While setting an appropriate observation budget is at the user's discretion, SigOpt has some general recommendations on how to choose it.
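Putting the pieces together, an Experiment definition including an observation budget might be sketched as below. The budget of 10-20x the number of parameters is a common rule of thumb, not an official rule, and all names and ranges are illustrative assumptions.

```python
# Illustrative experiment definition with an observation budget.
# With 2 tunable parameters, a budget of roughly 10-20x the number of
# parameters (here, 40) is a common starting point (an assumption,
# not an official recommendation).
experiment_meta = dict(
    name="example-classifier-tuning",
    parameters=[
        dict(name="learning_rate", type="double", bounds=dict(min=1e-4, max=1e-1)),
        dict(name="num_layers", type="int", bounds=dict(min=1, max=8)),
    ],
    metrics=[dict(name="accuracy", objective="maximize")],
    observation_budget=40,
)
```

With the SigOpt Python client, a definition like this would be passed to `conn.experiments().create(...)`.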

## Suggestion

A Suggestion is a representation of the parameters that SigOpt has suggested. It contains a list of parameters and their respective suggested values. The user can request a new suggestion from SigOpt at any time.
**Python**

```python
suggestion = conn.experiments(EXPERIMENT_ID).suggestions().create()
```

**Bash**

```bash
SUGGESTION=`curl -s -X POST https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID/suggestions -u "$SIGOPT_API_TOKEN":`
```

**Java**

```java
Suggestion suggestion = new Experiment(EXPERIMENT_ID).suggestions().create().call();
```
For AI Module users, the Run object subsumes both Suggestion and Observation objects. Please refer to the AI Module-specific optimization page for details on running an optimization experiment.

## Observation

An Observation represents the observed data from a single trial (or evaluation) in your Experiment.
When the metric has been evaluated (for the suggested parameter values), the user can report an Observation.
**Python**

```python
observation = conn.experiments(EXPERIMENT_ID).observations().create(
    suggestion=SUGGESTION_ID,
    values=[
        dict(
            name="metric_1",
            value=0.95,
        ),
        dict(
            name="metric_2",
            value=123.45,
        ),
    ],
)
```

**Bash**

```bash
metric_1=0.95
metric_2=123.45
OBSERVATION=`curl -s -X POST https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID/observations -u "$SIGOPT_API_TOKEN": \
  -H 'Content-Type: application/json' \
  -d "{\"suggestion\":\"$SUGGESTION_ID\",\"values\":[{\"name\":\"metric_1\",\"value\":$metric_1},{\"name\":\"metric_2\",\"value\":$metric_2}]}"`
```

**Java**

```java
Observation observation = new Experiment(EXPERIMENT_ID).observations().create()
    .data(
        new Observation.Builder()
            .suggestion(SUGGESTION_ID)
            .values(java.util.Arrays.asList(
                new Value.Builder()
                    .name("metric_1")
                    .value(0.95)
                    .build(),
                new Value.Builder()
                    .name("metric_2")
                    .value(123.45)
                    .build()
            ))
            .build()
    )
    .call();
```
For AI Module users, the Run object subsumes both Suggestion and Observation objects. Please refer to the AI Module-specific optimization page for details on running an optimization experiment.

## Optimization Loop

The optimization loop is the backbone of using SigOpt. After creating your Experiment, run through these three simple steps, in a loop:
• Receive a Suggestion from SigOpt
• Evaluate your metric at the suggested parameter values
• Report an Observation to SigOpt
Here is what the full optimization loop may look like for a SigOpt experiment.
**Python**

```python
experiment = conn.experiments(EXPERIMENT_ID).fetch()

# Run the optimization loop until the observation budget is exhausted
while experiment.progress.observation_count < experiment.observation_budget:
    suggestion = conn.experiments(experiment.id).suggestions().create()
    values_dict = evaluate_metric(suggestion.assignments, dataset)
    # Report an observation
    conn.experiments(experiment.id).observations().create(
        suggestion=suggestion.id,
        values=values_dict,
    )
    # Update the experiment object
    experiment = conn.experiments(experiment.id).fetch()
```
**Bash**

The following Bash script uses the `jq` package for processing JSON.

```bash
EXPERIMENT=`curl -s -X GET https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID -u "$SIGOPT_API_TOKEN":`
BUDGET=`echo $EXPERIMENT | jq '.observation_budget'`
OBSERVATION_COUNT=`echo $EXPERIMENT | jq '.progress.observation_count'`
while [ $OBSERVATION_COUNT -lt $BUDGET ]; do
  SUGGESTION=`curl -s -X POST https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID/suggestions -u "$SIGOPT_API_TOKEN":`
  # Parse the suggestion and extract the parameters
  SUGGESTION_ID=`echo $SUGGESTION | jq -r '.id'`
  parameter_1=`echo $SUGGESTION | jq '.assignments.parameter_1'`
  parameter_2=`echo $SUGGESTION | jq '.assignments.parameter_2'`
  # Evaluate the SigOpt suggested parameters with an `evaluate_metric` script
  metric_1=`./evaluate_metric $parameter_1 $parameter_2`
  # Report the observation
  curl -s -X POST https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID/observations -u "$SIGOPT_API_TOKEN": \
    -H 'Content-Type: application/json' \
    -d "{\"suggestion\":\"$SUGGESTION_ID\",\"values\":[{\"name\":\"metric_1\",\"value\":$metric_1}]}" > /dev/null
  # Update the experiment status
  EXPERIMENT=`curl -s -X GET https://api.sigopt.com/v1/experiments/$EXPERIMENT_ID -u "$SIGOPT_API_TOKEN":`
  OBSERVATION_COUNT=`echo $EXPERIMENT | jq '.progress.observation_count'`
done
```
**Java**

```java
Experiment experiment = Experiment.fetch(EXPERIMENT_ID).call();
while (experiment.getProgress().getObservationCount() < experiment.getObservationBudget()) {
    Suggestion suggestion = experiment.suggestions().create().call();
    double value = evaluateModel(suggestion.getAssignments());
    Observation observation = new Experiment(EXPERIMENT_ID).observations().create()
        .data(
            new Observation.Builder()
                .suggestion(suggestion.getID())
                .values(java.util.Arrays.asList(
                    new Value.Builder()
                        .name("metric_1")
                        .value(value)
                        .build()
                ))
                .build()
        )
        .call();
    // Update the experiment object
    experiment = Experiment.fetch(experiment.getId()).call();
}
```