tf.math.betainc is missing gradients
I'm trying to fit a NegativeBinomial, but its cdf seems to be missing some gradients:
```python
import tensorflow as tf
import tensorflow_probability as tfp

tfpd = tfp.distributions
tf.compat.v1.disable_eager_execution()  # tf.gradients requires graph mode

count = tf.compat.v1.get_variable("count", shape=())
logit = tf.compat.v1.get_variable("logit", shape=())
value = tf.compat.v1.get_variable("value", shape=())
cdf = tfpd.NegativeBinomial(total_count=count, logits=logit).cdf(value)
print(tf.gradients(cdf, [count]))  # Prints `[None]`.
print(tf.gradients(cdf, [logit]))  # Works.
print(tf.gradients(cdf, [value]))  # Prints `[None]`.
```
I think this is working as intended, in that the gradient w.r.t. discrete parameters/values is not defined. Is there a use case you're working with wherein one or both of these discrete quantities is relaxed to be continuous? Maybe we can help find an alternative implementation?
In my case the value really is discrete, but I have a continuous count. What I really want is something that looks like a Poisson distribution, but with a tunable variance.
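For what it's worth, the negative binomial is exactly such a "Poisson with tunable variance": under the common (total_count = r, success-probability = p) parameterization, the mean is r·p/(1−p) and the variance is mean/(1−p), which always exceeds the mean and collapses to it (the Poisson limit) as r → ∞. A minimal pure-Python sketch of that relationship (the helper name `nb_mean_variance` is mine, and a given library's parameterization convention should be checked against its docs):

```python
# Sketch: the negative binomial as a Poisson with tunable variance.
# Under the common (total_count = r, success-prob = p) parameterization:
#   mean     = r * p / (1 - p)
#   variance = mean / (1 - p)   # always >= mean; -> mean as r -> infinity
# (Whether this matches a particular library's convention should be checked.)

def nb_mean_variance(total_count, p):
    mean = total_count * p / (1.0 - p)
    variance = mean / (1.0 - p)
    return mean, variance

# Hold the mean fixed at 10 and grow total_count: the variance
# shrinks toward the Poisson value (variance == mean).
for r in (5.0, 50.0, 500.0):
    p = 10.0 / (r + 10.0)  # solves r * p / (1 - p) == 10
    mean, var = nb_mean_variance(r, p)
    print(r, mean, var)
```

So a continuous `total_count` acts as an overdispersion knob while the mean is held fixed.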
Also, I would like to point out that NegativeBinomial.prob does have gradients, so cdf not having them is a bit surprising:
```python
count = tf.compat.v1.get_variable("count", shape=())
logit = tf.compat.v1.get_variable("logit", shape=())
value = tf.compat.v1.get_variable("value", shape=())
prob = tfpd.NegativeBinomial(total_count=count, logits=logit).prob(value)
print(tf.gradients(prob, [count]))  # Works.
print(tf.gradients(prob, [logit]))  # Works.
print(tf.gradients(prob, [value]))  # Works.
```
Ah, looks like the issue is just that `betainc` doesn't have gradients defined w.r.t. its first two arguments:
https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/math_grad.py#L922
I'm sure TF would welcome this upstream contribution, if you were so inclined!
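There's a reason the x-gradient was the easy one to define: by the fundamental theorem of calculus, d/dx I_x(a, b) is just the integrand x^(a−1)·(1−x)^(b−1)/B(a, b), whereas the derivatives w.r.t. a and b have no elementary closed form. A pure-Python sketch checking that closed form against a finite difference (the helper names `betainc`, `betainc_dx`, and `log_beta` are mine, not any library's API):

```python
import math

def log_beta(a, b):
    # log B(a, b) via log-gamma, for numerical stability.
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def betainc(a, b, x, n=20000):
    # Regularized incomplete beta I_x(a, b): midpoint-rule integration
    # of t**(a-1) * (1-t)**(b-1) / B(a, b) over [0, x].
    h = x / n
    s = sum(((i + 0.5) * h) ** (a - 1) * (1.0 - (i + 0.5) * h) ** (b - 1)
            for i in range(n))
    return s * h / math.exp(log_beta(a, b))

def betainc_dx(a, b, x):
    # Closed-form d/dx I_x(a, b): the integrand evaluated at t = x.
    return x ** (a - 1) * (1.0 - x) ** (b - 1) / math.exp(log_beta(a, b))

# The closed form agrees with a central finite difference:
a, b, x, eps = 2.5, 3.0, 0.4, 1e-4
fd = (betainc(a, b, x + eps) - betainc(a, b, x - eps)) / (2 * eps)
print(betainc_dx(a, b, x), fd)
```

Gradients w.r.t. a and b need something like differentiating under the integral sign (pulling in digamma terms) or a series expansion, which is presumably why they were left unimplemented upstream.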
Any news or solution to this? @csuter @jesnie
Hi @jesnie and @nicocheh!
I hope this message finds you well.
Since the last release, tfp.math.betainc has gradients w.r.t. all parameters.
All the best,