The repository consists of:
- Notebook (`mab_problem/notebook/multi_armed_bandit.ipynb`) with explanations of how to deal with multi-armed bandit problems through four different approaches (a minimal sketch of one approach follows this list):
  - Random Selection
  - Epsilon Greedy
  - Thompson Sampling
  - Upper Confidence Bound (UCB1)

  The notebook should be opened with Jupyter NBViewer in order to see the plots.
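For a flavor of what the notebook covers, here is a minimal sketch of the Epsilon Greedy strategy on a two-armed Bernoulli bandit. The success rates, epsilon, and trial count below are illustrative assumptions, not values taken from the notebook:

```python
import random

# Illustrative two-armed Bernoulli bandit; these success rates are
# assumptions for this sketch, not values from the notebook.
TRUE_RATES = [0.04, 0.05]
N_TRIALS = 1000
EPSILON = 0.1  # exploration probability (assumed)

counts = [0, 0]   # number of pulls per arm
rewards = [0, 0]  # total reward collected per arm

def epsilon_greedy():
    # With probability EPSILON explore a random arm; otherwise
    # exploit the arm with the best observed mean reward so far.
    if random.random() < EPSILON:
        return random.randrange(len(TRUE_RATES))
    means = [rewards[a] / counts[a] if counts[a] else 0.0
             for a in range(len(TRUE_RATES))]
    return max(range(len(TRUE_RATES)), key=means.__getitem__)

for _ in range(N_TRIALS):
    arm = epsilon_greedy()
    reward = 1 if random.random() < TRUE_RATES[arm] else 0
    counts[arm] += 1
    rewards[arm] += reward

print("pulls per arm:", counts)
print("observed rates:",
      [r / c if c else 0.0 for r, c in zip(rewards, counts)])
```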
- Flask app (`mab_problem/flask_app`) for an interactive experience with 2 variants and 1000 trials.
To run the app on your machine, clone/download the repo and run the following commands:
```shell
$ cd mab_problem/flask_app
$ export FLASK_APP=app.py
$ flask run
```
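By default, `flask run` serves the app at http://127.0.0.1:5000/. On Windows, use `set FLASK_APP=app.py` instead of `export`.

As a rough picture of what a 2-variant, 1000-trial experiment looks like, here is a minimal Thompson Sampling simulation. The variant success rates are illustrative assumptions, and this sketch is not the app's actual implementation:

```python
import random

# Assumed success rates for the 2 variants (not the app's real values);
# 1000 trials mirrors the app's setup described above.
TRUE_RATES = [0.04, 0.05]
N_TRIALS = 1000

# Beta(1, 1) priors: alpha tracks successes + 1, beta tracks failures + 1.
alphas = [1, 1]
betas = [1, 1]

for _ in range(N_TRIALS):
    # Draw a plausible rate for each variant from its posterior
    # and play the variant whose draw is highest.
    samples = [random.betavariate(a, b) for a, b in zip(alphas, betas)]
    arm = max(range(len(samples)), key=samples.__getitem__)
    reward = 1 if random.random() < TRUE_RATES[arm] else 0
    alphas[arm] += reward
    betas[arm] += 1 - reward

print("posterior mean per variant:",
      [a / (a + b) for a, b in zip(alphas, betas)])
```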