Adaptive Divergence for Rapid Adversarial Optimization

This repository contains the experiments for the study Adaptive Divergence for Rapid Adversarial Optimization.

Installation

This repository depends on several third-party libraries.

Among the non-default packages, the PythiaMill library requires manual installation; please follow the instructions in its repository.

The remaining packages are available from the default pip repository, and the required versions are specified in setup.py.
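A typical setup might look like the following sketch. The URL and directory are placeholders, and the exact steps depend on your environment; PythiaMill still has to be installed separately per its own instructions.

```shell
# Clone this repository (placeholder URL) and enter it.
git clone <repository-url>
cd <repository-directory>

# Install the pip-resolvable dependencies at the versions
# pinned in setup.py; -e makes the package editable in place.
pip install -e .
```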

Experiments

Jupyter notebooks with the experiments described in the paper can be found in notebooks/ directory:

  • AD-<task name>-<method name>.ipynb --- notebooks for profiling adaptive divergences on the synthetic tasks;
  • BO-XOR-GBDT.ipynb --- the experiment with Bayesian Optimization over GBDT-based adaptive divergences on one of the synthetic tasks;
  • BO-PythiaTuneMC-Cat.ipynb --- tuning Pythia hyperparameters with Bayesian Optimization and CatBoost-based adaptive divergences;
  • plot-AVO.ipynb --- visualization of the AVO results.

Code for the experiments involving AVO can be found in experiments/AVO.py.

Note: inside the package, adaptive divergences may be referred to as 'pseudo-Jensen-Shannon divergences' or 'pJSD'.
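For reference, the classical Jensen-Shannon divergence that the 'pJSD' naming alludes to can be sketched for discrete distributions as below. This is an illustrative sketch only, not the repository's implementation; the function names `kl` and `jsd` are our own.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p_i = 0 contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def jsd(p, q):
    """Jensen-Shannon divergence: symmetric in p, q and bounded by log(2)."""
    m = 0.5 * (np.asarray(p, dtype=float) + np.asarray(q, dtype=float))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Identical distributions give 0; distinct ones give a positive value < log(2).
print(jsd([0.5, 0.5], [0.5, 0.5]))
print(jsd([0.5, 0.5], [0.9, 0.1]))
```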