Add elaborate docs (#5)
jbussemaker committed Jul 5, 2023
1 parent 7670415 commit fe59815
Showing 28 changed files with 381 additions and 108 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -213,3 +213,4 @@ $RECYCLE.BIN/
# End of https://www.toptal.com/developers/gitignore/api/windows,pycharm,python,jupyternotebooks

n2.html
site/
13 changes: 13 additions & 0 deletions .readthedocs.yaml
@@ -0,0 +1,13 @@
version: 2

build:
os: ubuntu-22.04
tools:
python: "3.10"

mkdocs:
configuration: mkdocs.yml

python:
install:
- requirements: requirements-docs.txt
33 changes: 12 additions & 21 deletions README.md
@@ -6,7 +6,7 @@
[![status](https://joss.theoj.org/papers/0b2b765c04d31a4cead77140f82ecba0/status.svg)](https://joss.theoj.org/papers/0b2b765c04d31a4cead77140f82ecba0)

[GitHub Repository](https://github.com/jbussemaker/SBArchOpt) |
[Documentation](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/readme.md)
[Documentation](https://sb-arch-opt.readthedocs.io/)

SBArchOpt (es-bee-ARK-opt) provides a set of classes and interfaces for applying Surrogate-Based Optimization (SBO)
for system architecture optimization problems:
@@ -47,29 +47,10 @@ pip install sb-arch-opt
Note: there are optional dependencies for the connected optimization frameworks and test problems.
Refer to their documentation for dedicated installation instructions.
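
For example, to install including the dependencies of one of the connected frameworks (the extras names are given on
each framework's documentation page):

```
pip install sb-arch-opt[arch_sbo]
```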

### Quick Start

Have a look at the [tutorial notebook](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/tutorial.ipynb) for a simple example.

## Documentation

Refer to the [documentation](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/readme.md) for more background on SBArchOpt
Refer to the [documentation](https://sb-arch-opt.readthedocs.io/) for more background on SBArchOpt
and how to implement architecture optimization problems.
Test problem documentation can be found here: [test problems](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/test_problems.md)

Optimization framework documentation:
- [pymoo](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_pymoo.md)
- [ArchSBO](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_arch_sbo.md)
- [BoTorch (Ax)](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_botorch.md)
- [Trieste](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_trieste.md)
- [HEBO](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_hebo.md)
- [TPE](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_tpe.md)
- [SEGOMOE](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_segomoe.md)
- [SMARTy](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/algo_smarty.md)

See also the tutorials:
- [SBArchOpt Tutorial](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/tutorial.ipynb): optimization, implementing new problems
- [Tunable Hierarchical Meta Problem Tutorial](https://github.com/jbussemaker/SBArchOpt/blob/main/docs/tutorial_tunable_meta_problem.ipynb)

## Contributing

@@ -86,3 +67,13 @@ Contributions are appreciated too:
- Read and sign the [Contributor License Agreement (CLA)](https://github.com/jbussemaker/SBArchOpt/blob/main/SBArchOpt%20DLR%20Individual%20Contributor%20License%20Agreement.docx)
, and send it to the project coordinator
- Issue a pull request

### Adding Documentation

```
pip install -r requirements-docs.txt
mkdocs serve
```

Refer to the [mkdocs](https://www.mkdocs.org/) and [mkdocstrings](https://mkdocstrings.github.io/) documentation
for more information.
10 changes: 7 additions & 3 deletions docs/algo_arch_sbo.md → docs/algo/arch_sbo.md
@@ -1,6 +1,6 @@
# Architecture Surrogate-Based Optimization (SBO) Algorithm

`arch_sbo` implements a Surrogate-Based Optimization (SBO) algorithm configured for solving most types of architecture
ArchSBO implements a Surrogate-Based Optimization (SBO) algorithm configured for solving most types of architecture
optimization problems. It has been developed with experience from the following work:

J.H. Bussemaker et al., "Effectiveness of Surrogate-Based Optimization Algorithms for System Architecture Optimization",
@@ -23,6 +23,8 @@ pip install sb-arch-opt[arch_sbo]

## Usage

[API Reference](../api/arch_sbo.md)

The algorithm is implemented as a [pymoo](https://pymoo.org/) algorithm and already includes all relevant architecture
optimization measures. It can be used directly with pymoo's interface:

@@ -35,11 +37,13 @@ problem = ... # Subclass of ArchOptProblemBase
# Get Kriging or RBF algorithm
n_init = 100
results_folder_path = 'path/to/results/folder'
gp_arch_sbo_algo = get_arch_sbo_gp(problem, init_size=n_init, results_folder=results_folder_path)
gp_arch_sbo_algo = get_arch_sbo_gp(problem, init_size=n_init,
results_folder=results_folder_path)

# Start from previous results (skipped if no previous results are available)
gp_arch_sbo_algo.initialize_from_previous_results(problem, results_folder_path)

n_infill = 10
result = minimize(problem, gp_arch_sbo_algo, termination=('n_eval', n_init + n_infill))
result = minimize(problem, gp_arch_sbo_algo,
termination=('n_eval', n_init + n_infill))
```
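
The API reference also lists an RBF-based variant next to the GP one. A hedged sketch of how it might be used,
assuming `get_arch_sbo_rbf` mirrors the `get_arch_sbo_gp` signature shown above (import path assumed as well):

```python
from pymoo.optimize import minimize
from sb_arch_opt.algo.arch_sbo import get_arch_sbo_rbf

problem = ...  # Subclass of ArchOptProblemBase

# RBF models are typically cheaper to fit than GP/Kriging models
rbf_algo = get_arch_sbo_rbf(problem, init_size=100)

result = minimize(problem, rbf_algo, termination=('n_eval', 110))
```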
4 changes: 3 additions & 1 deletion docs/algo_botorch.md → docs/algo/botorch.md
@@ -1,6 +1,6 @@
![BoTorch Logo](https://github.com/pytorch/botorch/raw/main/botorch_logo_lockup.png)

# SBArchOpt Interface to BoTorch: Bayesian Optimization with PyTorch
# BoTorch: Bayesian Optimization with PyTorch

[BoTorch](https://botorch.org/) is a Bayesian optimization framework written on top of the [PyTorch](https://pytorch.org/)
machine learning library. More information:
@@ -17,6 +17,8 @@ pip install sb-arch-opt[botorch]

## Usage

[API Reference](../api/botorch.md)

The `get_botorch_interface` function can be used to get an interface object for creating an `OptimizationLoop`
instance, with correctly configured search space, optimization configuration, and evaluation function.
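
The example itself is collapsed in this diff. A hedged sketch of the pattern (`get_optimization_loop` and
`get_population` are taken from the API reference; the argument names and import path are assumptions):

```python
from sb_arch_opt.algo.botorch_interface import get_botorch_interface

problem = ...  # Subclass of ArchOptProblemBase

interface = get_botorch_interface(problem)

# Create and run the Ax optimization loop (argument names assumed)
opt_loop = interface.get_optimization_loop(n_init=100, n_infill=50)
opt_loop.full_run()

# Extract all evaluated points as a pymoo Population
pop = interface.get_population(opt_loop)
```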
4 changes: 3 additions & 1 deletion docs/algo_hebo.md → docs/algo/hebo.md
@@ -1,6 +1,6 @@
![HEBO Logo](https://hebo.readthedocs.io/en/latest/_static/hebo.png)

# SBArchOpt Interface to HEBO: Heteroscedastic Evolutionary Bayesian Optimization
# HEBO: Heteroscedastic Evolutionary Bayesian Optimization

[HEBO](https://hebo.readthedocs.io/en/) is a Bayesian optimization algorithm developed by Huawei Noah's Ark lab.
It supports mixed-discrete parameters and several types of underlying probabilistic models.
@@ -17,6 +17,8 @@ pip install -r requirements-hebo.txt

## Usage

[API Reference](../api/hebo.md)

The `get_hebo_optimizer` function can be used to get an interface object for running the optimization.
The `hebo` object also has an ask-tell interface if needed.
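
A hedged sketch of both usage modes (the `optimize`, `ask`, `tell` and `pop` members are taken from the API
reference; the argument names and import path are assumptions):

```python
from sb_arch_opt.algo.hebo_interface import get_hebo_optimizer

problem = ...  # Subclass of ArchOptProblemBase

hebo = get_hebo_optimizer(problem, n_init=100)

# Run the full DOE + infill loop in one go (argument name assumed)...
hebo.optimize(n_infill=50)

# ...or drive it manually via ask-tell:
# x = hebo.ask()
# hebo.tell(x, f)  # after evaluating f at x

pop = hebo.pop  # all evaluated points as a pymoo Population
```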

10 changes: 7 additions & 3 deletions docs/algo_pymoo.md → docs/algo/pymoo.md
@@ -1,6 +1,6 @@
![pymoo Logo](https://github.com/anyoptimization/pymoo-data/blob/main/logo.png?raw=true)

# SBArchOpt Interface to pymoo
# pymoo

[pymoo](https://pymoo.org/) is a multi-objective optimization framework that supports mixed-discrete problem
definitions. It includes many test problems and algorithms, mostly evolutionary algorithms such as a Genetic Algorithm
@@ -14,6 +14,8 @@ No further actions required.

## Usage

[API Reference](../api/pymoo.md)

Since the problem definition is based on pymoo, pymoo algorithms work out-of-the-box. However, their effectiveness can
be improved by provisioning them with architecture optimization repair operators and repaired sampling strategies.

@@ -52,7 +54,8 @@ be recovered.
```python
from pymoo.optimize import minimize
from pymoo.algorithms.soo.nonconvex.ga import GA
from sb_arch_opt.algo.pymoo_interface import provision_pymoo, initialize_from_previous_results
from sb_arch_opt.algo.pymoo_interface import provision_pymoo, \
initialize_from_previous_results

problem = ... # Subclass of ArchOptProblemBase

@@ -71,7 +74,8 @@ result = minimize(problem, ga_algorithm, termination=('n_gen', 10))
For running large DOEs with intermediate results storage, you can use `get_doe_algo`:

```python
from sb_arch_opt.algo.pymoo_interface import get_doe_algo, load_from_previous_results
from sb_arch_opt.algo.pymoo_interface import get_doe_algo, \
load_from_previous_results

problem = ... # Subclass of ArchOptProblemBase
results_folder_path = 'path/to/results/folder'
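
# The remainder of this example is collapsed in the diff. A hedged sketch of
# how it might continue (get_doe_algo arguments and the pymoo-style
# setup/run calls are assumptions):
doe_algo = get_doe_algo(doe_size=100, results_folder=results_folder_path)
doe_algo.setup(problem)
doe_algo.run()

# load_from_previous_results presumably restores stored intermediate results
pop = load_from_previous_results(problem, results_folder_path)
```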
18 changes: 11 additions & 7 deletions docs/algo_segomoe.md → docs/algo/segomoe.md
@@ -1,4 +1,4 @@
# SBArchOpt Interface to SEGOMOE: Super Efficient Global Optimization with Mixture of Experts
# SEGOMOE: Super Efficient Global Optimization with Mixture of Experts

SEGOMOE is a Bayesian optimization toolbox developed by ONERA and ISAE-SUPAERO. For more information refer to:

@@ -19,6 +19,8 @@ SEGOMOE is not openly available.

## Usage

[API Reference](../api/segomoe.md)

SEGOMOE is interacted with through the `SEGOMOEInterface` class. This class has a state containing evaluated (and
failed) points, and requires a directory for results storage. The `run_optimization` function can be used to
run the DOE and infill search.
@@ -31,17 +33,19 @@ problem = ... # Subclass of ArchOptProblemBase
# Define folder to store results in
results_folder = ...

# Use Mixture of Experts: automatically identifies clusters in the design space with different best surrogates
# ("experts"). Can be more accurate, however also greatly increases the cost of finding new infill points.
# Use Mixture of Experts: automatically identifies clusters in the design space
# with different best surrogates ("experts"). Can be more accurate, however
# also greatly increases the cost of finding new infill points.
use_moe = True

# Options passed to the Sego class and to model generation, respectively
sego_options = {}
model_options = {}

# Get the interface (automatically initialized if the results folder already contains results)
interface = SEGOMOEInterface(problem, results_folder, n_init=100, n_infill=50, use_moe=use_moe,
sego_options=sego_options, model_options=model_options)
# Get the interface (will be initialized if the results folder has results)
interface = SEGOMOEInterface(problem, results_folder, n_init=100, n_infill=50,
use_moe=use_moe, sego_options=sego_options,
model_options=model_options)

# Initialize from other results if you want
interface.initialize_from_previous('path/to/other/results_folder')
@@ -54,5 +58,5 @@ x_failed = interface.x_failed # (n_failed, nx)
f = interface.f # (n, nf)
g = interface.g # (n, ng)
pop = interface.pop # Population containing all design points
opt = interface.opt # Population containing optimal points (Pareto front if multi-objective)
opt = interface.opt # Population containing optimal point(s)
```
6 changes: 4 additions & 2 deletions docs/algo_smarty.md → docs/algo/smarty.md
@@ -1,17 +1,19 @@
# SBArchOpt Interface to SMARTy: Surrogate Modeling for Aero-Data Toolbox
# SMARTy: Surrogate Modeling for Aero-Data Toolbox

SMARTy is a surrogate modeling toolbox with optimization capabilities developed by the DLR. For more information refer to:

Bekemeyer, P., Bertram, A., Hines Chaves, D.A., Dias Ribeiro, M., Garbo, A., Kiener, A., Sabater, C., Stradtner, M.,
Wassing, S., Widhalm, M. and Goertz, S., 2022. Data-Driven Aerodynamic Modeling Using the DLR SMARTy Toolbox.
In AIAA Aviation 2022 Forum (p. 3899). https://arc.aiaa.org/doi/abs/10.2514/6.2022-3899
In AIAA Aviation 2022 Forum (p. 3899). DOI: [10.2514/6.2022-3899](https://arc.aiaa.org/doi/abs/10.2514/6.2022-3899)

## Installation

SMARTy is not openly available.

## Usage

[API Reference](../api/smarty.md)

The `get_smarty_optimizer` function can be used to get an interface object for running the optimization.

```python
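# The original example is collapsed in this diff; a hedged sketch, with the
# import path and argument names assumed:
from sb_arch_opt.algo.smarty_interface import get_smarty_optimizer

problem = ...  # Subclass of ArchOptProblemBase

smarty_opt = get_smarty_optimizer(problem, n_init=100, n_infill=50)
smarty_opt.optimize()
```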
6 changes: 3 additions & 3 deletions docs/algo_tpe.md → docs/algo/tpe.md
@@ -4,8 +4,8 @@ A TPE inverts the typical prediction process of a surrogate model: it models x f
very complicated design space structures, making it appropriate for architecture optimization and hyperparameter
optimization, where it was first developed. For more details, refer to:

Bergstra et al., "Algorithms for Hyper-Parameter Optimization", 2011, available at:
https://papers.nips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf
Bergstra et al., "Algorithms for Hyper-Parameter Optimization", 2011, available
[here](https://papers.nips.cc/paper/2011/file/86e8f7ab32cfd12577bc2619bc635690-Paper.pdf).

We use the implementation found [here](https://github.com/nabenabe0928/tpe), which currently supports single-objective
unconstrained optimization problems.
@@ -36,7 +36,7 @@ algo = TPEAlgorithm(n_init=n_init, results_folder=results_folder_path)

# Start from previous results (skipped if no previous results are available)
if initialize_from_previous_results(algo, problem, results_folder_path):
# No need to evaluate any initial points, as they have been previously evaluated
# No need to evaluate any initial points, as they already have been evaluated
n_init = 0

n_infill = 10
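
# The closing lines are collapsed in this diff; presumably (mirroring the
# ArchSBO example) the run finishes with pymoo's minimize:
result = minimize(problem, algo, termination=('n_eval', n_init + n_infill))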
4 changes: 3 additions & 1 deletion docs/algo_trieste.md → docs/algo/trieste.md
@@ -1,4 +1,4 @@
# SBArchOpt Interface to Trieste
# Trieste

[Trieste](https://secondmind-labs.github.io/trieste/1.0.0/index.html) is a Bayesian optimization library built on
[TensorFlow](https://www.tensorflow.org/), Google's machine learning framework. Trieste is an evolution of Spearmint.
@@ -18,6 +18,8 @@ pip install sb-arch-opt[trieste]

## Usage

[API Reference](../api/trieste.md)

The `get_trieste_optimizer` function can be used to get an interface object for creating an
`ArchOptBayesianOptimizer` instance, with correctly configured search space, optimization configuration, and
evaluation function, and with the ability to deal with (stay away from) hidden constraints.
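
The example is collapsed in this diff. A hedged sketch of the pattern (the function name comes from the text above;
the import path, argument names and method name are assumptions):

```python
from sb_arch_opt.algo.trieste_interface import get_trieste_optimizer

problem = ...  # Subclass of ArchOptProblemBase

optimizer = get_trieste_optimizer(problem, n_init=100, n_infill=50)
result = optimizer.run_optimization()
```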
19 changes: 19 additions & 0 deletions docs/api/arch_sbo.md
@@ -0,0 +1,19 @@
# ArchSBO API Reference

[Installation and usage](../algo/arch_sbo.md)

::: sb_arch_opt.algo.arch_sbo.api.get_arch_sbo_gp
handler: python

::: sb_arch_opt.algo.arch_sbo.api.get_arch_sbo_rbf
handler: python

::: sb_arch_opt.algo.arch_sbo.algo.InfillAlgorithm
handler: python
options:
members:
- store_intermediate_results
- initialize_from_previous_results

::: sb_arch_opt.algo.arch_sbo.api.get_sbo
handler: python
14 changes: 14 additions & 0 deletions docs/api/botorch.md
@@ -0,0 +1,14 @@
# BoTorch (Ax) API Reference

[Installation and usage](../algo/botorch.md)

::: sb_arch_opt.algo.botorch_interface.api.get_botorch_interface
handler: python

::: sb_arch_opt.algo.botorch_interface.algo.AxInterface
handler: python
options:
members:
- get_optimization_loop
- get_search_space
- get_population
15 changes: 15 additions & 0 deletions docs/api/hebo.md
@@ -0,0 +1,15 @@
# HEBO API Reference

[Installation and usage](../algo/hebo.md)

::: sb_arch_opt.algo.hebo_interface.get_hebo_optimizer
handler: python

::: sb_arch_opt.algo.hebo_interface.algo.HEBOArchOptInterface
handler: python
options:
members:
- optimize
- ask
- tell
- pop
53 changes: 53 additions & 0 deletions docs/api/problem.md
@@ -0,0 +1,53 @@
# Problem Definition and Sampling

::: sb_arch_opt.problem.ArchOptProblemBase
handler: python
options:
heading_level: 2
members:
- design_space
- correct_x
- load_previous_results
- store_results
- evaluate
- vars
- get_categorical_values
- is_conditionally_active
- all_discrete_x
- print_stats
- get_imputation_ratio
- get_discrete_rates
- get_failure_rate
- get_n_declared_discrete
- get_n_valid_discrete
- _arch_evaluate
- _correct_x

::: sb_arch_opt.sampling.HierarchicalSampling
handler: python
options:
heading_level: 2
members:
- sample_get_x

::: sb_arch_opt.design_space.ArchDesignSpace
handler: python
options:
heading_level: 2
members:
- all_discrete_x
- correct_x
- quick_sample_discrete_x
- des_vars
- imputation_ratio
- is_conditionally_active
- is_cat_mask
- is_cont_mask
- is_discrete_mask
- is_int_mask
- xl
- xu
- get_categorical_values
- get_discrete_rates
- get_n_declared_discrete
- get_n_valid_discrete
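
Based on the members listed above, a minimal problem implementation might look as follows. This is a sketch, not
part of this commit: the constructor signature, the in-place output convention of `_arch_evaluate`, and the pymoo
variable types are assumptions.

```python
from pymoo.core.variable import Real, Integer, Choice
from sb_arch_opt.problem import ArchOptProblemBase


class SketchProblem(ArchOptProblemBase):
    """Hypothetical minimal subclass, for illustration only."""

    def __init__(self):
        des_vars = [
            Real(bounds=(0., 1.)),
            Integer(bounds=(0, 5)),
            Choice(options=['A', 'B', 'C']),
        ]
        super().__init__(des_vars, n_obj=1)

    def _arch_evaluate(self, x, is_active_out, f_out, g_out, h_out, *args, **kwargs):
        # Objective values are assumed to be written in-place into f_out
        f_out[:, 0] = x[:, 0]**2 + x[:, 1]

    def _correct_x(self, x, is_active):
        pass  # no conditionally-active design variables in this sketch
```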