Jesper Nielsen
Gaussian processes are a probabilistic framework: predictions always come as a mean together with an uncertainty around that mean. Generally `predict_f` and `predict_y` are the correct methods to use....
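For reference, the mean-plus-uncertainty idea can be sketched in plain numpy (a minimal sketch of textbook GP regression, not GPflow's actual implementation): the latent-function mean and variance below correspond to what `predict_f` returns, and `predict_y` would additionally add the noise variance to the returned variance.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel between row-vector inputs A (n,d) and B (m,d).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, kernel, noise_var=0.1):
    # Standard GP regression posterior (Rasmussen & Williams, ch. 2):
    # returns the latent mean and marginal variance at the test points.
    K = kernel(X_train, X_train) + noise_var * np.eye(len(X_train))
    K_s = kernel(X_train, X_test)
    K_ss = kernel(X_test, X_test)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var
```

With a tiny noise variance the posterior mean interpolates the training targets, and the variance collapses at the training inputs and grows away from them.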
I cannot copy and paste from your document, so I cannot run your code, and that makes it very hard for me to debug. Also, your document sometimes cuts stuff...
Try something like this: https://colab.research.google.com/drive/1IkZgY7q-t52WuNWOzSFCofnUjE5kvLbs?usp=sharing By playing with the strength of the prior, or disabling training of the noise variance completely you can certainly make the model "go through" the...
> Do you have an update on this PR? It would be really helpful for some of my projects.. Don't hold your breath. I do work on it on and...
Can I ask: Why do we think there would be any performance impact of changing a type definition?
Oh, it's not actually a type annotation, is it? I'm just used to `FooLike` being a `Union` of stuff that can (easily) be converted to a `Foo`. Like [PathLike](https://docs.python.org/3.7/library/os.html?highlight=pathlike#os.PathLike).
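To illustrate the convention I had in mind (all names here are hypothetical, not from the PR): a `FooLike` alias is a `Union` of things that can easily be converted to a `Foo`, analogous to `os.PathLike` for paths.

```python
from typing import Union

class Foo:
    def __init__(self, value: int) -> None:
        self.value = value

# Hypothetical alias: anything that can (easily) be converted to a Foo.
FooLike = Union[Foo, int]

def as_foo(x: FooLike) -> Foo:
    # Normalise a FooLike into an actual Foo at the API boundary.
    return x if isinstance(x, Foo) else Foo(x)
```

Functions then accept `FooLike` in their signatures and convert once, internally, so callers can pass either form.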
I'm sorry, but I have no experience with TFServing whatsoever, so I don't know how I'd debug this. However, GPflow is an open-source project, and if you send me...
I'm not sure what you mean by "symbols for shape components". I understand if this is a lot of work, and outside the scope of `einops`, but it is definitely...
In my case the `value` really is discrete, but I have a continuous `count`. What I really want is something that looks like a `Poisson` distribution, but with a tunable...
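One way to sketch the "Poisson-like, but with a continuous `count`" part (my own workaround, not an existing TFP distribution): replace the factorial in the Poisson log-pmf with the gamma function, so the log-density is defined for any non-negative real count.

```python
import numpy as np
from scipy.special import gammaln

def continuous_poisson_log_prob(count, rate):
    # Poisson log-pmf with log(count!) generalised to gammaln(count + 1),
    # so `count` may be any non-negative real, not just an integer.
    return count * np.log(rate) - rate - gammaln(count + 1.0)
```

At integer counts this agrees exactly with the ordinary Poisson log-pmf; note that as a density over the reals it is unnormalised.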
Also, I would like to point out that `NegativeBinomial.prob` does have gradients, so `cdf` not having them is a bit surprising:

```python
count = tf.compat.v1.get_variable("count", shape=())
logit = tf.compat.v1.get_variable("logit", shape=())
...
```
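For what it's worth, the negative-binomial CDF can be written as a regularized incomplete beta function, which does have well-defined derivatives in the probability parameter. A scipy sketch (using scipy's parametrization of `n` successes with probability `p`, counting failures; this is not necessarily TFP's parametrization):

```python
import numpy as np
from scipy.special import betainc

def nb_cdf(k, total_count, probs):
    # Negative-binomial CDF via the regularized incomplete beta function:
    #   P(K <= k) = I_p(r, floor(k) + 1)
    # betainc is smooth in `probs`, which is what a gradient would need.
    return betainc(total_count, np.floor(k) + 1.0, probs)
```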