A 'Gaussian Process Playground' for tinkering with sparse Gaussian process regression implementations in PyTorch.
The figure below depicts the optimization of inducing-point locations within the variational free energy (VFE) sparse Gaussian process framework, made straightforward by PyTorch's automatic differentiation engine. A basic example is provided in the script example.py.
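As a rough sketch of what the VFE objective can look like in PyTorch, the snippet below computes the negative Titsias (2009) bound for regression with an RBF kernel. The names (`rbf_kernel`, `vfe_bound`, `z` for inducing inputs) and the 1-D setup are illustrative assumptions, not the API of example.py.

```python
import torch

def rbf_kernel(x1, x2, lengthscale, variance):
    """Squared-exponential kernel matrix between two sets of 1-D inputs."""
    d = (x1.unsqueeze(-1) - x2.unsqueeze(-2)) / lengthscale
    return variance * torch.exp(-0.5 * d ** 2)

def vfe_bound(x, y, z, lengthscale, variance, noise):
    """Negative variational free energy (Titsias, 2009) for sparse GP regression.

    x, y: training inputs/targets of shape (n,); z: inducing inputs of shape (m,).
    If z and the hyperparameters are created with requires_grad=True, autograd
    supplies the gradients needed to optimize the inducing-point locations.
    """
    n, m = x.shape[0], z.shape[0]
    Kmm = rbf_kernel(z, z, lengthscale, variance) + 1e-6 * torch.eye(m)  # jitter
    Kmn = rbf_kernel(z, x, lengthscale, variance)

    L = torch.linalg.cholesky(Kmm)                          # K_mm = L L^T
    A = torch.linalg.solve_triangular(L, Kmn, upper=False)  # A = L^{-1} K_mn
    Qnn = A.T @ A                                           # Nystrom term K_nm K_mm^{-1} K_mn

    # log N(y | 0, Q_nn + noise * I) minus the trace correction of the VFE bound
    cov = Qnn + noise * torch.eye(n)
    log_marginal = torch.distributions.MultivariateNormal(
        torch.zeros(n), covariance_matrix=cov).log_prob(y)
    trace_term = 0.5 * (n * variance - torch.diagonal(Qnn).sum()) / noise
    return -(log_marginal - trace_term)  # negated so it can be minimized
```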
- Quiñonero-Candela, J. and Rasmussen, C.E., 2005. A unifying view of sparse approximate Gaussian process regression. Journal of Machine Learning Research, 6, pp. 1939-1959.
- Titsias, M., 2009. Variational learning of inducing variables in sparse Gaussian processes. In Proceedings of the Twelfth International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR 5, pp. 567-574.
- Bauer, M., van der Wilk, M. and Rasmussen, C.E., 2016. Understanding probabilistic sparse Gaussian process approximations. Advances in Neural Information Processing Systems, 29.
- Foster, L., Waagen, A., Aijaz, N., Hurley, M., Luis, A., Rinsky, J., Satyavolu, C., Way, M.J., Gazis, P. and Srivastava, A., 2009. Stable and efficient Gaussian process calculations. Journal of Machine Learning Research, 10(4).
The modified LBFGS PyTorch optimizer of Yatawatta, Spreeuw and Diblen is used:
- Yatawatta, S., Spreeuw, H. and Diblen, F., 2018. Improving LBFGS optimizer in PyTorch: Knowledge transfer from radio interferometric calibration to machine learning. In 2018 IEEE 14th International Conference on e-Science (e-Science), pp. 386-387. IEEE.
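To illustrate how an optimizer with a closure interface can drive the objective above, the loop below uses the stock torch.optim.LBFGS as a stand-in for the modified optimizer (whose exact API is not reproduced here), reusing the hypothetical `vfe_bound` sketch from earlier.

```python
import torch

# Toy data; vfe_bound is the sketch above, and torch.optim.LBFGS stands in
# for the modified optimizer of Yatawatta et al.
x = torch.linspace(-3.0, 3.0, 100)
y = torch.sin(x) + 0.1 * torch.randn(100)

z = torch.linspace(-3.0, 3.0, 10).requires_grad_()   # inducing-point locations
log_lengthscale = torch.zeros(()).requires_grad_()   # hyperparameters in log space
log_variance = torch.zeros(()).requires_grad_()
log_noise = torch.log(torch.tensor(0.1)).requires_grad_()

optimizer = torch.optim.LBFGS(
    [z, log_lengthscale, log_variance, log_noise], lr=0.1, max_iter=20)

def closure():
    # L-BFGS re-evaluates the objective several times per step, so the
    # closure must zero the gradients and recompute the loss each call.
    optimizer.zero_grad()
    loss = vfe_bound(x, y, z, log_lengthscale.exp(),
                     log_variance.exp(), log_noise.exp())
    loss.backward()
    return loss

for _ in range(50):
    optimizer.step(closure)
```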