
Derivatives of kernels

yannick-couzinie opened this issue on Apr 17 '24 · 2 comments

Is your feature request related to a problem? Please describe.
For both the optimization of parameters and the computation itself, it would be great to have functions that analytically calculate derivatives of the kernels. Sometimes the target values relate only to the derivatives of the kernel with respect to the inputs, not to the kernel itself, so it would be nice to have a function for that.

Describe the solution you'd like
In FidelityKernel and ProjectedQuantumKernel, I would like to have something like an evaluate_derivative function, ideally even returning the gradient with respect to all hyperparameters.

Describe alternatives you've considered
Performing numerical differentiation with numdifftools, but that is quite slow. (A sketch of that workaround follows below.)
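For reference, the workaround looks roughly like the sketch below, with a classical RBF kernel standing in as a cheap, purely illustrative substitute for a single quantum kernel entry (the kernel, gamma, and points are not from squlearn). The cost comes from numdifftools needing several kernel evaluations per matrix entry, whereas an analytic gradient needs only one.

```python
# Sketch of the numerical-differentiation workaround, with a classical RBF
# kernel as an illustrative stand-in for a single quantum kernel entry.
import numpy as np
import numdifftools as nd

gamma = 0.5  # illustrative kernel width


def kernel_entry(x, y):
    """Scalar kernel value k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))


x = np.array([0.3, 0.7])
y = np.array([0.1, 0.4])

# Numerical gradient w.r.t. x: requires several kernel evaluations per entry.
num_grad = nd.Gradient(lambda x_: kernel_entry(x_, y))(x)

# Analytic gradient of the RBF kernel: a single kernel evaluation.
# d k / d x = -2 * gamma * (x - y) * k(x, y)
ana_grad = -2.0 * gamma * (x - y) * kernel_entry(x, y)

print(num_grad, ana_grad)  # the two agree up to numerical error
```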

Additional context
The one place where I expected derivatives to already be implemented was the optimization of hyperparameters, but looking at squlearn.optimizers it seems that numerical differentiation is what has been implemented there. Thanks for your work on this!

yannick-couzinie · Apr 17 '24

Hello, and thank you for your feature request. We currently have a project working on derivatives of the kernel matrices, so I'm optimistic that this is indeed a feature we can deliver soon.

David-Kreplin · Apr 17 '24

Great! Looking forward to it.

yannick-couzinie · Apr 17 '24

We implemented a function for evaluating derivatives of the Projected Quantum Kernel (see pull request #277). Currently it is available only in the development version, but a new release, version 0.8, will follow soon.

The function is called ProjectedQuantumKernel.evaluate_derivatives and it computes derivatives w.r.t. features or parameters.
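A rough usage sketch is below; note that the encoding circuit, the constructor arguments, and the "dKdx" selector string are only assumptions about the current development version and may differ in the released 0.8 API, so please check the documentation and pull request #277 for the actual signature.

```python
# Hypothetical usage sketch: the import path, encoding circuit, constructor
# arguments, and the "dKdx" selector string below are assumptions about the
# development version and may not match the released API.
import numpy as np
from squlearn import Executor
from squlearn.encoding_circuit import ChebyshevPQC
from squlearn.kernel.matrix import ProjectedQuantumKernel

# Small encoding circuit and a default simulator executor (assumed defaults).
encoding_circuit = ChebyshevPQC(num_qubits=2, num_features=1, num_layers=2)
kernel = ProjectedQuantumKernel(encoding_circuit, Executor())

x_train = np.linspace(0.0, 1.0, 5).reshape(-1, 1)

K = kernel.evaluate(x_train)  # the kernel matrix itself
# Derivative of the kernel matrix w.r.t. the features (assumed selector "dKdx").
dK_dx = kernel.evaluate_derivatives(x_train, x_train, values="dKdx")
```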

David-Kreplin · Sep 06 '24

Thank you very much for your quick work. As this was more of a feature request issue, I'll close it now.

yannick-couzinie · Sep 29 '24