Releases: SimonBlanke/Hyperactive

v4.8.0

14 Aug 15:06
  • add support for numpy v2
  • add support for pandas v2
  • add support for python 3.12
  • migrate from setup.py to pyproject.toml
  • change project structure to src-layout

v4.7.0

29 Jul 13:03
  • add Genetic algorithm optimizer
  • add Differential evolution optimizer
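
The idea behind the new differential evolution optimizer can be sketched as follows. This is a minimal, self-contained rand/1/bin variant for a continuous maximization problem, not Hyperactive's actual implementation (which comes from Gradient-Free-Optimizers and operates on discrete search spaces):

```python
import random

def differential_evolution(objective, bounds, pop_size=20, n_gen=50,
                           F=0.8, CR=0.9, seed=0):
    """Maximize objective over box bounds with DE (rand/1/bin)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [objective(ind) for ind in pop]
    for _ in range(n_gen):
        for i in range(pop_size):
            # pick three distinct members other than i for the mutant vector
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to the search bounds
                else:
                    v = pop[i][j]
                trial.append(v)
            trial_score = objective(trial)
            if trial_score >= scores[i]:  # greedy selection (maximize)
                pop[i], scores[i] = trial, trial_score
    best = max(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]

# maximize the negated sphere function; optimum at (0, 0)
best_x, best_score = differential_evolution(
    lambda x: -(x[0] ** 2 + x[1] ** 2), [(-5, 5), (-5, 5)]
)
```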

v4.6.0

01 Nov 10:40

add support for constrained optimization
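
Constraints are typically expressed as functions that receive the candidate parameter dictionary and return whether it is feasible; the exact keyword may differ by version. A self-contained sketch of the underlying rejection idea, using an illustrative random-search loop rather than Hyperactive's API:

```python
import random

def objective(para):
    # maximize the negated sphere function; unconstrained optimum at (0, 0)
    return -(para["x"] ** 2 + para["y"] ** 2)

def constraint(para):
    # feasible region: only candidates with x + y >= 1 are allowed
    return para["x"] + para["y"] >= 1

search_space = {
    "x": list(range(-5, 6)),
    "y": list(range(-5, 6)),
}

def random_search_constrained(objective, search_space, constraints,
                              n_iter=500, seed=0):
    rng = random.Random(seed)
    best_para, best_score = None, float("-inf")
    for _ in range(n_iter):
        para = {k: rng.choice(v) for k, v in search_space.items()}
        if not all(c(para) for c in constraints):
            continue  # infeasible candidate: rejected before evaluation
        score = objective(para)
        if score > best_score:
            best_para, best_score = para, score
    return best_para, best_score

best_para, best_score = random_search_constrained(
    objective, search_space, [constraint]
)
```

The best feasible points are (0, 1) and (1, 0) with score -1, rather than the unconstrained optimum (0, 0).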

v4.5.0

27 Aug 10:59
  • add early stopping feature to custom optimization strategies
  • display additional outputs from the objective function in the command-line results
  • add type hints to hyperactive-api
  • add tests for new features
  • add test for verbosity=False
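
The early-stopping rule can be sketched as: stop once the best score has not improved for a given number of iterations. Hyperactive exposes this through an early-stopping setting; the parameter names below are illustrative, and the loop is a self-contained stand-in for an optimization run:

```python
def run_with_early_stopping(objective, candidates, n_iter_no_change=10,
                            tol_abs=0.0):
    """Evaluate candidates in order; stop after n_iter_no_change
    iterations without an improvement larger than tol_abs."""
    best_score = float("-inf")
    stale = 0
    evaluated = 0
    for para in candidates:
        score = objective(para)
        evaluated += 1
        if score > best_score + tol_abs:
            best_score, stale = score, 0  # improvement: reset the counter
        else:
            stale += 1
            if stale >= n_iter_no_change:
                break  # converged: no improvement for too long
    return best_score, evaluated

# scores plateau at 5, so the run stops after 6 evaluations
candidates = [{"x": x} for x in (1, 3, 5, 5, 5, 5, 5, 5, 5, 5)]
best_score, evaluated = run_with_early_stopping(
    lambda para: para["x"], candidates, n_iter_no_change=3
)
```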

v4.4.0

01 Mar 11:44
  • add new feature: "optimization strategies"
  • redesign progress-bar
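
An "optimization strategy" chains several optimizers over one search, each picking up where the previous one left off. The sketch below illustrates the concept with two toy optimizers sharing an iteration budget and a warm start; names and signatures are illustrative, not Hyperactive's API:

```python
import random

def objective(para):
    return -((para["x"] - 3) ** 2)  # maximize; optimum at x = 3

search_space = {"x": list(range(-50, 51))}

def random_search(objective, search_space, n_iter, rng, warm_start=None):
    """Global exploration: sample uniformly at random."""
    best_para = warm_start or {k: rng.choice(v) for k, v in search_space.items()}
    best_score = objective(best_para)
    for _ in range(n_iter):
        para = {k: rng.choice(v) for k, v in search_space.items()}
        score = objective(para)
        if score > best_score:
            best_para, best_score = para, score
    return best_para, best_score

def hill_climb(objective, search_space, n_iter, rng, warm_start=None):
    """Local refinement: step to a neighbor if it is at least as good."""
    values = search_space["x"]
    idx = values.index(warm_start["x"]) if warm_start else rng.randrange(len(values))
    best_score = objective({"x": values[idx]})
    for _ in range(n_iter):
        new_idx = min(max(idx + rng.choice([-1, 1]), 0), len(values) - 1)
        score = objective({"x": values[new_idx]})
        if score >= best_score:
            idx, best_score = new_idx, score
    return {"x": values[idx]}, best_score

def run_strategy(objective, search_space, strategy, n_iter_total, seed=0):
    """Run each optimizer for its fraction of the budget, warm-started
    from the best parameters found so far."""
    rng = random.Random(seed)
    best_para = None
    for optimizer, fraction in strategy:
        n_iter = int(n_iter_total * fraction)
        best_para, best_score = optimizer(
            objective, search_space, n_iter, rng, warm_start=best_para
        )
    return best_para, best_score

strategy = [(random_search, 0.5), (hill_climb, 0.5)]
best_para, best_score = run_strategy(objective, search_space, strategy,
                                     n_iter_total=200)
```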

v4.3.0

18 Nov 15:49
  • add new features from GFO
    • add Spiral Optimization
    • add Lipschitz Optimizer
    • add DIRECT Optimizer
    • print the random seed for reproducibility

v4.0.0

01 Dec 14:54

v3.2.4

07 Jul 12:40

Changes from v3.0.0 -> v3.2.4:

  • Decouple the number of runs from the number of active processes (Thanks to PartiallyTyped). This reduces memory load if the number of jobs is huge
  • New feature: The progress board enables the user to monitor the optimization progress during the run.
    • Display trend of best score
    • Plot parameters and score in parallel coordinates
    • Generate filter file to define an upper and/or lower bound for all parameters and the score in the parallel coordinate plot
    • List parameters of 5 best scores
  • add Python 3.8 to tests
  • add warnings if search-space values are not lists
  • improve stability of result-methods
  • add tests for hyperactive-memory + search spaces
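
The search-space warning can be sketched as a small validation step: a Hyperactive search space maps each parameter name to a list of candidate values, so a warning is raised when a value has some other type. This is an illustrative check, not the library's internal code:

```python
import warnings

def validate_search_space(search_space):
    """Warn for every search-space value that is not a list
    (e.g. a range or a numpy array passed by mistake)."""
    for name, values in search_space.items():
        if not isinstance(values, list):
            warnings.warn(
                f"search-space value for '{name}' has type "
                f"{type(values).__name__}, expected a list"
            )

# "x" triggers a warning (range, not list); "y" is fine
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    validate_search_space({"x": range(3), "y": [1, 2]})
```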

v2.3.0

16 Jul 10:23
  • add Tree-structured optimization algorithm (idea from Hyperopt)
  • add Decision-tree optimization algorithm (idea from sklearn)
  • enable new optimization parameters for bayes-opt:
    • max_sample_size: maximum number of samples the gaussian-process-regressor trains on; samples are drawn by random choice.
    • skip_retrain: occasionally skips retraining the gaussian-process-regressor during the optimization run, effectively returning multiple suggested positions from one fitted model (which should be spread apart from one another)
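
The max_sample_size idea amounts to capping the surrogate's training set by random subsampling before each refit, which bounds the cubic fitting cost of the gaussian process. A self-contained sketch with illustrative function names, not Hyperactive's API:

```python
import random

def subsample_training_data(X, y, max_sample_size, rng):
    """Cap the (parameters, score) pairs the surrogate model is fit on
    by drawing a random subset, keeping X and y aligned."""
    if len(X) <= max_sample_size:
        return X, y
    indices = rng.sample(range(len(X)), max_sample_size)
    return [X[i] for i in indices], [y[i] for i in indices]

# e.g. keep at most 25 of 100 observed samples for the next refit
X = [[i] for i in range(100)]
y = [float(i) for i in range(100)]
X_sub, y_sub = subsample_training_data(X, y, 25, random.Random(0))
```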

v2.1.0

16 Jul 10:15
  • first stable implementation of "long-term-memory" to save/load search positions/parameters and results.
  • enable warm start of sequence-based optimizers (bayesian opt, ...) with results from "long-term-memory"
  • enable the usage of gaussian-process-regressors other than sklearn's. A GPR class (from GPy, GPflow, ...) can be passed to the "optimizer" kwarg
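
The long-term-memory concept can be sketched as persisting evaluated parameters and scores so a later run can warm-start instead of re-evaluating them. Hyperactive's actual storage format may differ; this sketch uses plain CSV via the standard library:

```python
import csv
import io

def save_search_data(fileobj, search_data):
    """Persist evaluated (parameters, score) rows as CSV."""
    writer = csv.DictWriter(fileobj, fieldnames=["x", "y", "score"])
    writer.writeheader()
    writer.writerows(search_data)

def load_search_data(fileobj):
    """Read persisted rows back, converting values to floats."""
    return [
        {k: float(v) for k, v in row.items()}
        for row in csv.DictReader(fileobj)
    ]

# one run writes its search data ...
search_data = [
    {"x": 1, "y": 2, "score": -5.0},
    {"x": 0, "y": 0, "score": 0.0},
]
buf = io.StringIO()  # stands in for a file on disk
save_search_data(buf, search_data)

# ... a later run loads it and warm-starts from the best known point
buf.seek(0)
memory = load_search_data(buf)
warm_start = max(memory, key=lambda row: row["score"])
```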