Derivatives of kernels
Is your feature request related to a problem? Please describe. For both the optimization of parameters and the calculation itself, it would be great to have functions that analytically calculate derivatives. Sometimes the target values relate only to derivatives of the kernel with respect to the inputs, not to the kernel itself, so a dedicated function for that would be very useful.
Describe the solution you'd like In FidelityKernel and ProjectedQuantumKernel I'd like to have something like evaluate_derivative, ideally even a full gradient over all the hyperparameters.
Describe alternatives you've considered Performing numerical differentiation with numdifftools, but that is quite slow.
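For context, the numerical-differentiation workaround looks roughly like the sketch below. It uses a plain classical Gaussian kernel and a hand-rolled central-difference rule as stand-ins (the function names here are illustrative, not the squlearn or numdifftools API), and shows why an analytic derivative is preferable: the finite-difference version needs two kernel evaluations per feature dimension, which is costly when each evaluation is a quantum circuit execution.

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    """Plain Gaussian (RBF) kernel, a classical stand-in for a quantum kernel."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def numerical_grad_x(x, y, gamma=1.0, h=1e-6):
    """Central-difference gradient of k(x, y) w.r.t. the features x.

    Costs 2 * len(x) kernel evaluations, which is what makes the
    numerical approach slow when evaluations are expensive.
    """
    grad = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (gaussian_kernel(x + e, y, gamma)
                   - gaussian_kernel(x - e, y, gamma)) / (2 * h)
    return grad

def analytic_grad_x(x, y, gamma=1.0):
    """Closed-form gradient: dk/dx = -2 * gamma * (x - y) * k(x, y)."""
    return -2.0 * gamma * (x - y) * gaussian_kernel(x, y, gamma)

x = np.array([0.3, -0.7])
y = np.array([0.1, 0.4])
# The two gradients agree to finite-difference accuracy.
print(np.max(np.abs(numerical_grad_x(x, y) - analytic_grad_x(x, y))))
```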
Additional context The one place where I expected derivatives to be implemented was the optimization of hyperparameters, but looking at squlearn.optimizers it seems that only numerical differentiation is implemented there. Thanks for your work on this!
Hello, and thank you for your feature request. We currently have a project working on derivatives of the kernel matrices, so I'm optimistic that this is indeed a feature we can deliver soon.
Great! Looking forward to it.
We implemented a function for evaluating derivatives of the Projected Quantum Kernel (see pull request #277). Currently it is available only in the development version, but a new release, version 0.8, will follow soon.
The function is called ProjectedQuantumKernel.evaluate_derivatives and it computes derivatives w.r.t. features or parameters.
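To illustrate what a derivative with respect to a trainable (hyper)parameter looks like, here is a minimal classical sketch. It is not the squlearn interface; the Gaussian kernel and its width parameter gamma are hypothetical stand-ins for a parametrized quantum kernel, and the derivative of the full Gram matrix is computed analytically and checked against finite differences.

```python
import numpy as np

def gaussian_gram(X, Y, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||X[i] - Y[j]||^2) and the
    squared-distance matrix it is built from."""
    sq = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq), sq

def dgram_dgamma(X, Y, gamma=1.0):
    """Analytic derivative of the Gram matrix w.r.t. the parameter gamma:
    dK/dgamma = -||x - y||^2 * K."""
    K, sq = gaussian_gram(X, Y, gamma)
    return -sq * K

X = np.array([[0.0, 1.0], [0.5, -0.2]])
Y = np.array([[0.3, 0.3]])

# Finite-difference check of the analytic parameter derivative.
h = 1e-6
K_plus, _ = gaussian_gram(X, Y, 1.0 + h)
K_minus, _ = gaussian_gram(X, Y, 1.0 - h)
fd = (K_plus - K_minus) / (2 * h)
print(np.max(np.abs(fd - dgram_dgamma(X, Y, 1.0))))
```

A derivative of this form is exactly what a gradient-based hyperparameter optimizer would consume in place of numerical differentiation.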
Thank you very much for your quick work. Since this was primarily a feature request, I'll close the issue now.