Tracing adjoint gradient
Calling some tf.function-wrapped functions under even a single GradientTape results in:
LookupError: No gradient defined for operation 'TfqAdjointGradient' (op type: TfqAdjointGradient)
One possible cause is mentioned at this link:
It should be noted that tf.GradientTape is still watching the forward pass of a tf.custom_gradient, and will use the ops it watches. As a consequence, calling tf.function while the tape is still watching leads to a gradient graph being built. If an op is used in tf.function without a registered gradient, a LookupError will be raised.
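A small pure-TF sketch of the behavior quoted above (the op names here are hypothetical stand-ins, not TFQ code): a tape watching a call into a tf.function still sees the tf.custom_gradient op inside it, so the gradient graph built for that call uses the op's registered custom gradient.

```python
import tensorflow as tf

# Toy custom-gradient op standing in for an op like TfqAdjointGradient.
@tf.custom_gradient
def clip_grad(x):
    def grad(dy):
        # backward pass: clip incoming gradients to [-0.1, 0.1]
        return tf.clip_by_value(dy, -0.1, 0.1)
    return tf.identity(x), grad

@tf.function
def model(x):
    # the tape watching this tf.function call still sees clip_grad, so the
    # gradient graph built here relies on clip_grad's registered gradient
    return clip_grad(x) ** 2

x = tf.constant(3.0)
with tf.GradientTape() as tape:
    tape.watch(x)
    y = model(x)
g = tape.gradient(y, x)  # d(x**2)/dx = 6.0, clipped to 0.1 by the custom grad
```

If the op inside the tf.function had no registered gradient at all, this same lookup is what would raise the LookupError.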
So solving this seems like it would involve removing the tf.function decoration from the library, leaving it up to users to decorate as desired.
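A minimal sketch of that proposal (the function name here is hypothetical): the library would expose an undecorated eager function, and users who want graph compilation would wrap it themselves.

```python
import tensorflow as tf

# Library-side: plain eager function, no tf.function decorator.
def expectation(x):
    return tf.reduce_sum(tf.sin(x))

# User-side: opt into graph compilation explicitly.
compiled = tf.function(expectation)

x = tf.constant([0.0, 1.5707963])  # sin(0) + sin(pi/2) = 1.0
eager_val = expectation(x)
graph_val = compiled(x)
```

Users who hit the LookupError could then simply skip the tf.function wrapping in the affected code path.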
Can you provide a minimal snippet that reproduces this issue? It looks like this lookup comes from trying to get gradient info for the gradient op itself, which could be caused by second-order or multi-tape usage.
Still need to get a more minimal example. The most easily toggleable place it has turned up is line 144 of the tests here, where I found that adding persistent=True to the gradient tape causes the error LookupError: No gradient defined for operation 'TfqAdjointGradient'.
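A pure-TF sketch of why persistent=True surfaces the missing gradient: with a persistent tape, a gradient computed inside the tape's context is itself recorded, so a later tape.gradient call differentiates the backward-pass ops. For TFQ the backward pass contains TfqAdjointGradient, which has no gradient of its own, hence the LookupError; here an ordinary differentiable op is used instead so the second-order gradient goes through.

```python
import tensorflow as tf

x = tf.constant(2.0)
with tf.GradientTape(persistent=True) as tape:
    tape.watch(x)
    y = x ** 3
    # computing a gradient while the persistent tape is still active records
    # the backward ops on the tape as well
    g = tape.gradient(y, x)   # 3x^2 = 12.0
# this call differentiates the recorded backward pass: d(3x^2)/dx = 6x = 12.0
h = tape.gradient(g, x)
```

Swap x ** 3 for a TFQ expectation layer and the second tape.gradient would need a gradient for TfqAdjointGradient, reproducing the error.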