Feature Request: Guidelines on implementing Sparse GP in Numpyro
Hi!
I would like to request some guidelines on how to implement a sparse GP (with SVI) in NumPyro, as is already available in Pyro. Currently there is only a simple Gaussian process implementation in NumPyro.
Thank you very much in advance :)
The Pyro module pyro.contrib.gp is pretty stable, and there are already a couple of Pyro examples (deep kernel learning, GPLVM) of SVI inference for sparse GP models. I would like to extend this feature request to an example of a deep sparse GP in NumPyro, one that does not duplicate the existing Pyro examples and that illustrates how to use the flax_module primitive. The following references may serve as a guideline for an implementation:
- GP conditional: this utility implements the core math of a GP and is used across all GP models.
- RBF kernel: a popular GP kernel
- Variational Sparse Gaussian Process model: the most Pyro-friendly implementation of sparse GP.
- See flax_module and random_flax_module for how to use the neural network library Flax in NumPyro.
- Some Pyro examples: deep kernel learning, gplvm.
- The deep GP tutorial that I wrote two years ago: once you have a sparse GP implemented, extending it to a deep GP or a deep kernel GP is pretty straightforward.
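For anyone picking this up: the first two bullets (the RBF kernel and the GP conditional) are small enough to sketch directly in JAX. Here is a minimal, hedged sketch of what those two utilities compute; the function names and the `noise`/`jitter` defaults are my own choices for illustration, not NumPyro or Pyro API:

```python
import jax.numpy as jnp
from jax.scipy.linalg import solve_triangular


def rbf_kernel(X, Z, variance=1.0, lengthscale=1.0):
    # Squared-exponential kernel between row vectors:
    #   k(x, z) = variance * exp(-||x - z||^2 / (2 * lengthscale^2))
    d = (X[:, None, :] - Z[None, :, :]) / lengthscale
    return variance * jnp.exp(-0.5 * jnp.sum(d**2, axis=-1))


def gp_conditional(kernel, X_new, X_train, y_train, noise=1e-2, jitter=1e-6):
    # Standard GP predictive equations, conditioning on (X_train, y_train):
    #   mean = K_*f (K_ff + noise I)^{-1} y
    #   cov  = K_** - K_*f (K_ff + noise I)^{-1} K_f*
    Kff = kernel(X_train, X_train) + (noise + jitter) * jnp.eye(X_train.shape[0])
    Ksf = kernel(X_new, X_train)
    Kss = kernel(X_new, X_new)
    # Cholesky-based solves for numerical stability
    L = jnp.linalg.cholesky(Kff)
    A = solve_triangular(L, Ksf.T, lower=True)    # L^{-1} K_f*
    v = solve_triangular(L, y_train, lower=True)  # L^{-1} y
    mean = A.T @ v
    cov = Kss - A.T @ A + jitter * jnp.eye(X_new.shape[0])
    return mean, cov
```

Evaluating `gp_conditional` at the training inputs themselves should return a predictive mean close to `y_train` (up to the noise level), which is a quick sanity check for the math.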
Hi @LysSanzMoreta, I have an example of how one could implement sparse GPs (the classic variational free energy version).
You can find the demo script here, as well as the Jupyter notebook here (Colab-able). I tried to copy the Pyro implementation faithfully. I hope it helps someone!
I'm currently working on the stochastic variational GP (SVGP) implementation from Pyro. It's a bit more involved, so it will take me some time.
@jejjohnson Thank you very much! I also made a small implementation in Pyro (I switch back and forth, heheh); your example will be very useful for checking whether I am on the right track :)