24 changes: 24 additions & 0 deletions documentation/docs/guides/generators/expected_improvement.md
@@ -0,0 +1,24 @@
# Expected Improvement

Bayesian Optimization (BO) algorithms are machine-learning-based algorithms that are particularly well suited to efficiently optimizing noisy objectives in few iterations. Using data collected before and/or during optimization, BO algorithms build a Bayesian model of the objective function that predicts a distribution of possible function values at each point in parameter space. An acquisition function then uses this model to decide where to sample next in pursuit of the global optimum of the objective function.
<br />

**Advantages:**
- Global or local optimization depending on algorithm specifications
- Creates an online surrogate model of the objective and any constraint functions, which can be used during or after optimization
- Can account for observational constraints
- Can incorporate rich prior information about the optimization problem to improve convergence
- Explicitly handles measurement uncertainty and/or noisy objectives
<br />

**Disadvantages:**
- Potentially significant computational costs, especially after many iterations
- Numerous hyperparameters which can affect performance
<br />

## Parameters
- `turbo_controller` : Trust-Region Bayesian Optimization (TuRBO) controller that dynamically restricts the search space to a region around the best observed point. Options are `null`, `OptimizeTurboController`, or `SafetyTurboController`.
- `numerical_optimizer` : Numerical method used to maximize the acquisition function at each optimization step.
- `max_travel_distances` : Optional list of maximum step sizes (floats), one per variable; if provided, it must have the same length as the number of variables. Each distance is applied as an additional bound constraint at each optimization step. For example, if a `max_travel_distances` of `[1.0]` is given for a magnet, each optimization step is constrained to within ±1.0 kG of the current value.
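
As a rough illustration, these parameters might appear together in a routine's generator settings like this (the keys mirror the parameters above; the values, including the optimizer name, are hypothetical rather than defaults):

```python
# Hypothetical generator settings for Expected Improvement.
# Values are illustrative only, not recommended defaults.
generator_config = {
    "name": "expected_improvement",
    "turbo_controller": "OptimizeTurboController",  # or None / "SafetyTurboController"
    "numerical_optimizer": {"name": "LBFGS"},       # assumed optimizer name
    "max_travel_distances": [1.0],                  # one entry per variable, e.g. ±1.0 kG for a magnet
}
```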
<br />
<br />
23 changes: 23 additions & 0 deletions documentation/docs/guides/generators/extremum_seeking.md
@@ -0,0 +1,23 @@
# Extremum Seeking

Extremum seeking applies small sinusoidal oscillations to each parameter to slowly drift towards an optimum. This sinusoidal sampling strategy lets the algorithm track changes in the optimal operating conditions over time, making it useful for time-dependent optimization, where short-term drifts in accelerator conditions lead to a time-dependent objective function.
<br />

**Advantages:**
- Low computational cost
- Can track time-dependent drifts of the objective function to maintain an optimal operating configuration
<br />

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Additional hyperparameters that must be tuned to a given optimization problem
- Scales poorly to higher dimensional problems
- Cannot handle observational constraints
<br />

## Parameters
- `k` : Feedback gain, controlling how strongly the measured objective steers the parameter drift
- `oscillation_size` : Amplitude of the sinusoidal oscillation applied to each parameter
- `decay_rate` : Rate at which the oscillation amplitude decays over time
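
The role of these hyperparameters can be sketched with a minimal extremum-seeking update. This is an illustrative toy, not Badger's implementation; the dither frequencies are arbitrary:

```python
import math

def es_step(x, step, cost, k=2.0, oscillation_size=0.1, decay_rate=1.0):
    """One toy extremum-seeking update: each parameter dithers sinusoidally,
    and the measured cost shifts the dither phase so that, on average, the
    parameters drift toward better operating points."""
    amplitude = oscillation_size * decay_rate**step
    return [
        xi + amplitude * math.cos((1.0 + 0.1 * i) * step + k * cost)
        for i, xi in enumerate(x)
    ]
```

With `decay_rate < 1` the oscillation amplitude shrinks over time, trading tracking ability for stability near the optimum.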
<br />
<br />
22 changes: 22 additions & 0 deletions documentation/docs/guides/generators/neldermead.md
@@ -0,0 +1,22 @@
# Nelder-Mead

An iterative downhill-simplex algorithm that seeks local optima by sampling an initial simplex of points and then applying heuristic rules (reflection, expansion, contraction, shrinkage) to choose the next point at each iteration. Nelder-Mead has been widely used in accelerator physics.
<br />

**Advantages:**
- Low computational cost
- Historically proven performance in the context of accelerator physics
- Automatic/adaptive hyperparameter specification depending on problem characteristics
<br />

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Sensitive to measurement noise which can negatively impact convergence to optimum
- Scales poorly to higher dimensional problems
- Cannot handle observational constraints
<br />

## Parameters
- `adaptive` : If `True`, dynamically adjust internal parameters based on dimensionality.
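<br />

Assuming the SciPy-style adaptive scheme (Gao & Han, 2012), the `adaptive` option scales the simplex coefficients with the problem dimension; a sketch of those coefficients:

```python
def adaptive_nm_coefficients(n: int) -> dict:
    """Dimension-dependent Nelder-Mead coefficients (Gao & Han, 2012),
    as used when `adaptive=True`; n is the number of variables."""
    return {
        "reflection": 1.0,
        "expansion": 1.0 + 2.0 / n,
        "contraction": 0.75 - 1.0 / (2.0 * n),
        "shrink": 1.0 - 1.0 / n,
    }
```

For large `n`, expansion and contraction moves become more conservative, which is what helps the adaptive variant scale to higher-dimensional problems.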
<br />
<br />
14 changes: 14 additions & 0 deletions documentation/docs/guides/generators/rcds.md
@@ -0,0 +1,14 @@
# RCDS

Robust Conjugate Direction Search (RCDS) makes decisions via successive local approximations of the objective function to converge to an optimum. RCDS may be more efficient than Nelder-Mead, but it requires several initial iterations to build a local model of the objective function before optimization begins.

**Advantages:**
- Low computational cost
- Historically proven performance in the context of accelerator physics
- Can account for measurement noise via algorithm hyperparameter
- Can control scaling of step size

**Disadvantages:**
- Local optimizer, sensitive to initial starting conditions
- Scales poorly to higher dimensional problems
- Cannot handle observational constraints
105 changes: 88 additions & 17 deletions src/badger/factory.py
@@ -13,6 +13,7 @@
import os
import importlib
import yaml
from pathlib import Path
from xopt.generators import generators, get_generator_defaults

import logging
@@ -183,38 +184,112 @@ def load_plugin(
return plugin


def load_docs(root, pname, ptype):
assert ptype in [
"generator",
"interface",
"environment",
], f"Invalid plugin type {ptype}"
def get_generator_docs(name: str) -> str:
"""
Load and format Badger generator documentation from markdown
files and class docstrings.

Parameters
----------
name : str
Generator name; must also match the markdown filename.

Returns
-------
str
Formatted markdown string containing both the Badger guide content
and the class docstring in a code block.
"""
# .../Badger/src/badger/factory.py
PROJECT_ROOT = Path(__file__).resolve().parent.parent.parent
BADGER_GUIDES_DIR = PROJECT_ROOT / "documentation" / "docs" / "guides"

docs_dir = BADGER_GUIDES_DIR / "generators"

proot = os.path.join(root, f"{ptype}s")
readme = None
docstring = None

try:
try:
with open(docs_dir / f"{name}.md", "r") as f:
readme = f.read()
except FileNotFoundError:
readme = f"# {name}\nNo documentation found.\n"

docstring = generators[name].__doc__

return _format_docs_str(readme, docstring, "generator")
except Exception as err:
raise BadgerInvalidDocsError(
f"Error loading docs for generator {name}: docs not found"
) from err


def load_plugin_docs(pname: str, ptype: str) -> str:
"""
Load and format documentation for a Badger plugin. Loads the plugin's
README.md file and extracts the class docstring,
then formats them together as a single markdown document.

Parameters
----------
pname : str
Name of the plugin to load documentation for.
Must match the plugin's directory and module name.
ptype : str
Type of plugin (e.g. 'environment').

Returns
-------
str
Formatted markdown string containing both the README content
and the plugin class docstring in a code block.
"""
# assert plugin type is a directory in BADGER_PLUGIN_ROOT
p = Path(BADGER_PLUGIN_ROOT)
ptype_dir = p / f"{ptype}s"
assert ptype_dir.is_dir(), f"Invalid plugin type '{ptype}'. Directory not found"

plugin_dir = ptype_dir / pname

# Load the readme and the docs
readme = None
docstring = None

try:
try:
with open(os.path.join(proot, pname, "README.md"), "r") as f:
with open(plugin_dir / "README.md", "r") as f:
readme = f.read()
except FileNotFoundError:
readme = f"# {pname}\nNo readme found.\n"

module = importlib.import_module(f"{ptype}s.{pname}")
docstring = module.Environment.__doc__

# Format as Markdown code block
help_md = f"```text\n{readme}\n# Environment Documentation\n{docstring}\n```"
return help_md
if ptype == "environment":
docstring = module.Environment.__doc__

return _format_docs_str(readme, docstring, ptype)
except Exception as err:
raise BadgerInvalidDocsError(
f"Error loading docs for {ptype} {pname}: docs not found"
) from err


def _format_docs_str(readme: str, docstring: str | None, ptype: str | None) -> str:
if docstring is None:
return readme

if ptype is not None:
ptype = ptype.title()

# Format as Markdown code block
help_md = (
f"\n{readme}\n## Docstrings\nContinue reading to see the full docstring for "
f"the selected Badger {ptype} class\n\n```text\n# {ptype} Documentation\n{docstring}\n```"
)
return help_md


def get_plug(root: str, name: str, ptype: str):
try:
plug = BADGER_FACTORY[ptype][name]
@@ -239,12 +314,8 @@ def scan_extensions(root):
return extensions


def get_generator_docs(name: str):
return generators[name].__doc__


def get_env_docs(name: str):
return load_docs(BADGER_PLUGIN_ROOT, name, "environment")
return load_plugin_docs(name, "environment")


def get_intf(name: str):