
Approximate Power-EP for Deep GPs with intermediate variables for density models/latent variable models

Open · thangbui opened this issue 8 years ago · 2 comments

Related to #7; we need to figure out a good initialisation scheme.

thangbui · Mar 26 '17 22:03

Hi thangbui, firstly, thanks a lot for sharing this code; it's great. Does the repo contain code for a Deep GP latent variable model, or is that not fully implemented yet?

wil-j-wil · May 10 '18 10:05

Hi @wil-j-wil,

Unfortunately, there are still many models missing or not fully implemented. My original goal was to have all of the following models and inference schemes under one package [the aim is not to be another GPflow or GPy, but something more research-oriented and pedagogical]:

  • various GP models: GP regression/classification, deep GPs for regression/classification, GP/deep GP latent variable models, GP state-space models, ...
  • inference methods using inducing points for all of the models above: variational methods (in the style of Titsias), (power) EP, and approximate power EP, a.k.a. black-box alpha (these share the sparse predictive form sketched after this list),
  • different posterior approximations for deep GPs: one with explicit representations of the intermediate hidden variables (as in the original deep GP paper by Damianou and Lawrence) and one with somewhat implicit hidden representations (as in the nested compression paper by Hensman et al., or in Bui et al. (2016) and Salimbeni and Deisenroth (2017)),
  • different propagation techniques for deep and state-space architectures: probabilistic backpropagation (moment matching), simple Monte Carlo (see the second sketch after this list), linearisation, etc., similar to the extended and unscented methods in the system-identification literature,
  • by using EP-style representations, inference in distributed/online settings should be easy to handle.
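For context on the inducing-point bullet: all of those schemes keep a Gaussian q(u) = N(m_u, S_u) over the function values at inducing inputs and differ only in how (m_u, S_u) is fitted; prediction then uses the standard sparse conditional. Here is a minimal numpy sketch of that shared predictive step (the kernel and all function names are hypothetical, for illustration, not geepee's actual API):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel; a stand-in for whatever kernel the model uses.
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_predict(Xs, Zu, m_u, S_u, kern=rbf_kernel, jitter=1e-6):
    # Predictive mean/variance of q(f*) at test inputs Xs, given a Gaussian
    # q(u) = N(m_u, S_u) over inducing outputs at inducing inputs Zu.
    Kuu = kern(Zu, Zu) + jitter * np.eye(len(Zu))
    Ksu = kern(Xs, Zu)
    Kss_diag = np.diag(kern(Xs, Xs))
    A = np.linalg.solve(Kuu, Ksu.T).T               # Ksu @ inv(Kuu)
    mean = A @ m_u
    var = (Kss_diag
           - np.sum(A * Ksu, axis=1)                # - diag(Ksu inv(Kuu) Kus)
           + np.sum((A @ S_u) * A, axis=1))         # + diag(A S_u A^T)
    return mean, var
```

The variational free energy, power-EP and BB-alpha objectives would each sit on top of this same q(u) parameterisation.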
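And for the propagation bullet: simple Monte Carlo just pushes samples through the layers, whereas probabilistic backpropagation matches a Gaussian to each layer's output moments analytically. A rough sketch of the Monte Carlo variant, built on the hypothetical sparse_gp_predict above and assuming 1-d hidden layers:

```python
def propagate_mc(Xs, layers, n_samples=20, rng=None):
    # `layers` is a list of (Zu, m_u, S_u) triples, one per GP layer.
    # Each of the n_samples trajectories is pushed through the stack,
    # drawing from the layer's predictive Gaussian at every step.
    rng = np.random.default_rng() if rng is None else rng
    samples = np.repeat(Xs[None, :, :], n_samples, axis=0)   # (S, N, D)
    for Zu, m_u, S_u in layers:
        out = []
        for s in samples:
            mean, var = sparse_gp_predict(s, Zu, m_u, S_u)
            eps = rng.standard_normal(len(mean))
            out.append((mean + np.sqrt(np.maximum(var, 0.0)) * eps)[:, None])
        samples = np.stack(out)                              # (S, N, 1)
    return samples[..., 0]  # per-sample outputs; average over axis 0 for a mean
```

Moment matching gives deterministic gradients at the cost of a Gaussian approximation at every layer; the Monte Carlo route is simpler and unbiased, but noisier.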

I've been busy with other projects, but I hope to get back to this very soon.

Best, Thang.

thangbui · May 21 '18 02:05