This repository was archived by the owner on Jan 5, 2026. It is now read-only.

Commit 7fe9b33

Merge pull request #6 from SLAC-ML/pr-lxlewis-misc-tweaks
Misc tweaks
2 parents e275b95 + 1245d5a commit 7fe9b33

9 files changed: 184 additions & 156 deletions

docs/guides/algorithms-overview.md (new file)

64 additions & 0 deletions
---
sidebar_position: 4
---

# Optimization Algorithms Overview

## Nelder-Mead

An iterative downhill simplex algorithm that seeks local optima by sampling an initial set of points and then using a heuristic to choose the next point at each iteration. Nelder-Mead has been widely used in accelerator physics.

**Advantages:**
- Low computational cost
- Historically proven performance in the context of accelerator physics
- Automatic/adaptive hyperparameter specification depending on problem characteristics

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Sensitive to measurement noise, which can impair convergence to the optimum
- Scales poorly to higher-dimensional problems
- Cannot handle observational constraints
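As an illustration of how little the method needs from the objective, here is a minimal sketch using SciPy's Nelder-Mead implementation. SciPy and the toy quadratic are assumptions for the example only; in Badger the objective would come from machine measurements through an environment.

```python
from scipy.optimize import minimize

# Toy stand-in for a beamline figure of merit; Nelder-Mead only ever
# sees function values, never gradients.
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

result = minimize(
    objective,
    x0=[0.0, 0.0],
    method="Nelder-Mead",
    options={"xatol": 1e-6, "fatol": 1e-6},
)
# result.x ends up close to the optimum [1, -2]
```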
## Extremum Seeking

Extremum seeking applies small oscillations to the machine settings to drift slowly towards the minimum. The algorithm uses a sinusoidal sampling strategy for each parameter to move towards optimal operating conditions and to track time-dependent changes in those conditions. It is useful for time-dependent optimization, where short-term drifts in accelerator conditions lead to a time-dependent objective function.

**Advantages:**
- Low computational cost
- Can track time-dependent drifts of the objective function to maintain an optimal operating configuration

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Additional hyperparameters that must be tuned to a given optimization problem
- Scales poorly to higher-dimensional problems
- Cannot handle observational constraints
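The dither-and-demodulate idea can be sketched in a few lines of plain Python. This is an illustrative toy, not Badger's extremum-seeking implementation, and the dither amplitude, frequency, and gain are made-up values that in practice must be tuned per problem (one of the listed disadvantages).

```python
import math

def extremum_seek(measure, x0, amp=0.2, omega=20.0, gain=2.0, dt=0.01, steps=20_000):
    """Toy single-parameter extremum-seeking loop (minimization).

    Each step adds a small sinusoidal dither to the setpoint, then
    correlates the measured objective with the dither so that the
    setpoint drifts against the local gradient on average.
    """
    x = x0
    for k in range(steps):
        dither = amp * math.sin(omega * k * dt)
        y = measure(x + dither)        # objective evaluated at dithered setpoint
        x -= gain * dither * y * dt    # demodulation: downhill drift on average
    return x

# Settles near the minimum of a parabola at p = 2.
best = extremum_seek(lambda p: (p - 2.0) ** 2, x0=0.0)
```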
## Expected Improvement (Bayesian Optimization)

Bayesian optimization (BO) algorithms are machine-learning-based algorithms that are particularly well suited to efficiently optimizing noisy objectives in few iterations. Using data collected during and/or prior to optimization, BO algorithms use Bayesian statistics to build a model of the objective function that predicts a distribution of possible function values at each point in parameter space. An acquisition function then uses this model to decide where to sample next in search of the global optimum of the objective function.

**Advantages:**
- Global or local optimization depending on algorithm specifications
- Creates an online surrogate model of the objective and any constraint functions, which can be used during or after optimization
- Can account for observational constraints
- Can incorporate rich prior information about the optimization problem to improve convergence
- Explicitly handles measurement uncertainty and/or noisy objectives

**Disadvantages:**
- Potentially significant computational cost, especially after many iterations
- Numerous hyperparameters that can affect performance
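The expected-improvement acquisition itself is a closed-form expression over the surrogate model's posterior mean and standard deviation. A standalone sketch of the textbook formula (minimization convention) follows; this is not Badger's internal code, just the quantity the acquisition function computes at each candidate point.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Closed-form expected improvement for a Gaussian posterior
    (minimization convention).

    mu, sigma: surrogate-model posterior mean and standard deviation
               at the candidate point(s)
    f_best:    best (lowest) objective value observed so far
    """
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    z = (f_best - mu) / np.maximum(sigma, 1e-12)
    ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    # With zero predicted uncertainty, EI reduces to plain improvement.
    return np.where(sigma > 0, ei, np.maximum(f_best - mu, 0.0))

# A point predicted equal to the incumbent but with high uncertainty
# still has positive expected improvement, which drives exploration.
ei = float(expected_improvement(mu=0.0, sigma=1.0, f_best=0.0))
```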
## RCDS

Robust Conjugate Direction Search (RCDS) makes decisions via successive local approximations of the objective function to converge to an optimum. RCDS may be more efficient than Nelder-Mead, but it requires several initial iterations to establish a local model of the objective function before it starts optimizing.

**Advantages:**
- Low computational cost
- Historically proven performance in the context of accelerator physics
- Can account for measurement noise via an algorithm hyperparameter
- Can control the scaling of the step size

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Scales poorly to higher-dimensional problems
- Cannot handle observational constraints
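RCDS builds on the classical conjugate direction search of Powell's method, adding robustness to measurement noise. The underlying conjugate-direction idea can be tried with SciPy's Powell implementation on a noiseless toy problem; this is a stand-in for illustration, not RCDS itself.

```python
from scipy.optimize import minimize

# Correlated quadratic: the cross term means no single coordinate axis
# points at the optimum, which is exactly the case a conjugate
# direction set is built to handle via successive line searches.
def objective(x):
    return x[0] ** 2 + x[1] ** 2 + 0.8 * x[0] * x[1]

result = minimize(objective, x0=[1.0, 1.0], method="Powell")
# result.x ends up near the optimum at the origin
```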

docs/guides/color-by-environment.md

1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 4
+sidebar_position: 5
 ---
 
 # Color by Environment
```

docs/guides/create-environments-and-interfaces.md

1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 5
+sidebar_position: 6
 ---
 
 # Create Environments and Interfaces
```

docs/guides/gui-usage.md

26 additions & 76 deletions
```diff
@@ -16,36 +16,49 @@ The **Environment** defines available variables and observables for a specific m
 
 Within an environment, an optimization problem can be defined by selecting which variables to adjust, objectives to optimize, and any constraints to follow. **VOCS** represents the subset of variables, objectives, and constraints to be optimized within the environment. You can also add observables within the VOCS section, which the GUI will monitor and display but won’t otherwise interact with. The “Constraints” and “Observables” sections are optional for defining an optimization and are collapsed by default. They can be accessed by clicking on **More** at the bottom of the Environment + VOCS tab.
 
-### Algorithm
+### Loading a Template
 
-![Badger GUI algorithm panel](/img/gui/highlight_algorithm.png)
+![Badger GUI load template button](/img/gui/highlight_load_template.png)
 
-The **Algorithm** section lets you select an algorithm to use for optimization (1), as well as set the parameters of the selected algorithm (2). See “*Overview of Different Optimization Algorithms*” for a more detailed overview of different options. Common algorithms used at SLAC are expected improvement and nelder-mead.
+If there is already a template for the optimization you’d like to run, click the **Load Template** button at the upper left of the **Environment + VOCS** tab, and select the appropriate template. Make sure to check the environment parameters, variables and variable ranges, objectives, constraints/observables, and selected algorithm before running the optimization. See [the templates page](templates) for more information about templates.
 
-### Metadata
+### Run Buttons
 
-![Badger GUI metadata panel](/img/gui/highlight_metadata.png)
+![Badger GUI action buttons](/img/gui/highlight_bottom_buttons.png)
 
-**Metadata** includes a name (1) and description (2) for the optimization routine. Beneath the description there is also a button to save the current run configuration as a template.
+1. Deletes the stored run data from the History Navigator and on disk.
+2. Save the current run's log to the configured logbook directory.
+3. Resets all variables to their values at the beginning of the run.
+4. Pause or resume the active run.
+5. Start or end a run.
+6. Jump to the optimal combination of variable values in the Plot Area.
+7. Set devices to the selected values.
+8. Open extension windows such as BOVisualizer and ParetoFrontViewer.
 
-### Loading a Template
+### Plot Area and Run Data
 
-![Badger GUI load template button](/img/gui/highlight_load_template.png)
+![Badger GUI plot area and run data panel](/img/gui/highlight_plot_area_run_data.png)
 
-If there is already a template for the optimization you’d like to run, click the **Load Template** button at the upper left of the **Environment + VOCS** tab, and select the appropriate template. Make sure to check the environment parameters, variables and variable ranges, objectives, constraints/observables, and selected algorithm before running the optimization. See [the templates page](templates) for more information about templates.
+1. **Plot Area** is where run data is visualized as a line graph.
+2. **Run Data** holds the raw data points which are fed into the plot.
 
 ### History Navigator
 
 ![Badger GUI history navigator panel](/img/gui/highlight_history_navigator.png)
 
 The History Navigator holds past runs, whose output can be loaded again with a single click on a given yaml file entry. Past runs are hierarchically organized by year, year and month, and year, month, and day, just like how they are organized in the Badger archive directory.
 
-### Plot Area and Run Data
+### Algorithm
 
-![Badger GUI plot area and run data panel](/img/gui/highlight_plot_area_run_data.png)
+![Badger GUI algorithm panel](/img/gui/highlight_algorithm.png)
 
-1. **Plot Area** is where run data is visualized as a line graph.
-2. **Run Data** holds the raw data points which are fed into the plot.
+The **Algorithm** section lets you select an algorithm to use for optimization (1), as well as set the parameters of the selected algorithm (2). See “*Overview of Different Optimization Algorithms*” for a more detailed overview of different options. Common algorithms used at SLAC are expected improvement and nelder-mead.
+
+### Metadata
+
+![Badger GUI metadata panel](/img/gui/highlight_metadata.png)
+
+**Metadata** includes a name (1) and description (2) for the optimization routine. Beneath the description there is also a button to save the current run configuration as a template.
 
 ---
 
```
```diff
@@ -95,66 +108,3 @@ Pressing the extensions button (8) will allow opening extension windows such as
 Pressing the delete button (1) will delete the stored run data from the History Navigator panel and on disk. Pressing the log button (2) will save the current run's log to the configured logbook directory.
 
 While the optimization is running, the values of the variables, objectives, and (if selected) constraints and observables will be plotted in the plot section in the top right corner of the GUI. By default, the X-Axis displays the number of optimization iterations, and the Y-Axis for the variables plot is relative to each variable’s starting value. These options can be changed from the GUI via options in the top right corner, above the plots.
```

The remaining 63 deleted lines of this hunk remove the "Overview of Different Optimization Algorithms" section (Nelder-Mead, Extremum Seeking, Expected Improvement, RCDS), whose content moved verbatim into the new docs/guides/algorithms-overview.md above.

static/img/gui/run_1.png

-11.6 KB

versioned_docs/version-1.4/guides/color-by-environment.md

1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 4
+sidebar_position: 5
 ---
 
 # Color by Environment
```

versioned_docs/version-1.4/guides/create-environments-and-interfaces.md

1 addition & 1 deletion

```diff
@@ -1,5 +1,5 @@
 ---
-sidebar_position: 5
+sidebar_position: 6
 ---
 
 # Create Environments and Interfaces
```

Comments (0)