# On the Quantitative Analysis of Decoder-Based Models
Dependencies: Theano, Lasagne
## Evaluate the models
To evaluate a decoder-based generative model:
- Provide your decoder `gen` as a Python function that takes a Theano tensor variable `Z` (the latent) and returns a Theano tensor variable `X` (the sample); an example can be found at lines 84-85 of ./lib/utils.py:

  ```python
  def gen(Z):
      ...
      return X
  ```

  Then modify the model loading procedure in the `load_model` function in ./lib/utils.py.
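As a rough sketch of the expected interface, here is a hypothetical one-layer sigmoid decoder. It is written with NumPy so it runs standalone; a real `gen` would build the same expression from Theano ops (`theano.tensor` functions and `theano.shared` weights), which largely mirror NumPy syntax. The dimensions and weights below are placeholders, not values from this repository.

```python
import numpy as np

# Hypothetical dimensions: 10-d latent, 784-d output (flattened 28x28 images).
LATENT_DIM, DATA_DIM = 10, 784

rng = np.random.RandomState(0)
# Small random weights; in the real model these would be trained
# parameters stored as theano.shared variables.
W = 0.1 * rng.randn(LATENT_DIM, DATA_DIM).astype("float32")
b = np.zeros(DATA_DIM, dtype="float32")

def gen(Z):
    # Z: (batch, LATENT_DIM) -> X: (batch, DATA_DIM), values in (0, 1)
    return 1.0 / (1.0 + np.exp(-(Z.dot(W) + b)))

X = gen(rng.randn(4, LATENT_DIM).astype("float32"))
```

The only contract the sampler relies on is that `gen` maps a batch of latents to a batch of samples of the data dimension.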
- Modify the data loading procedure in ./lib/utils.py (e.g. `load_mnist`), including the validation/training split, or add a loader for another dataset.
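For reference, a loader in the style of `load_mnist` might look like the sketch below; the actual function in ./lib/utils.py will differ. The point is simply to return flat float arrays with a held-out validation split. The random data and sizes here are placeholders.

```python
import numpy as np

def load_dataset(n=1000, dim=784, n_val=100, seed=0):
    # Stand-in for reading real data from disk; replace with your dataset.
    rng = np.random.RandomState(seed)
    data = rng.rand(n, dim).astype("float32")
    rng.shuffle(data)
    # Hold out the last n_val examples for validation.
    return data[:-n_val], data[-n_val:]

train_x, val_x = load_dataset()
```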
- If you are evaluating your model on a dataset other than MNIST, modify the data dimensions in ./sampling/sampler.py at lines 19-20, and the data-likelihood calculation at lines 113 and 117. An example can be found in ./sampling/svhn_sampler.py.
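The likelihood change usually amounts to swapping the observation model, e.g. from a Bernoulli density (binarized MNIST) to a diagonal Gaussian (continuous data such as SVHN). A NumPy sketch of the two per-example log-likelihoods, assuming the decoder outputs the mean parameters (this is an illustration, not the exact code in sampler.py):

```python
import numpy as np

def bernoulli_loglik(x, mean, eps=1e-7):
    # log p(x | mean) summed over pixels; x in {0, 1}, mean in (0, 1)
    mean = np.clip(mean, eps, 1.0 - eps)
    return (x * np.log(mean) + (1.0 - x) * np.log(1.0 - mean)).sum(axis=1)

def gaussian_loglik(x, mean, log_var):
    # Diagonal Gaussian log density summed over pixels
    return (-0.5 * (np.log(2.0 * np.pi) + log_var
                    + (x - mean) ** 2 / np.exp(log_var))).sum(axis=1)

x = np.array([[0.0, 1.0]])
m = np.array([[0.5, 0.5]])
bern_ll = bernoulli_loglik(x, m)
gauss_ll = gaussian_loglik(x, x, np.zeros_like(x))
```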
- Instructions for each command can be found in the comment following it.
- Run the function `main(*args)` in ./experiment/run.py.
## Visualize posterior samples
- Create the output directory: `mkdir ./vis`
- Run the function `main(*args)` in ./experiment/run.py with `plot_posterior` set to 1.
- To visualize the posterior samples of a particular digit X, set `exps` to "postXtra" for a digit in the training set or "postXval" for a digit in the validation set.
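The visualizations are grids of decoded samples. As a hypothetical illustration of the tiling step (not the code in this repository), the following arranges a batch of flattened samples into a single image array; the actual plotting is omitted:

```python
import numpy as np

def tile_images(samples, rows, cols, side):
    # samples: (rows*cols, side*side) flattened images
    # returns a (rows*side, cols*side) grid of the images
    grid = samples.reshape(rows, cols, side, side)
    return grid.transpose(0, 2, 1, 3).reshape(rows * side, cols * side)

# Four tiny 2x2 "images" with distinct pixel values, tiled into a 2x2 grid.
batch = np.arange(16, dtype="float32").reshape(4, 4)
tiled = tile_images(batch, rows=2, cols=2, side=2)
```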
See also a TensorFlow implementation: https://github.com/jiamings/ais