antalszava
Thank you for the suggestion! That will likely require a change to be made in Torch. I can get back to you on whether or not the update solves the issue.
It seems like the circular dependency could be resolved by using module-level imports: https://stackoverflow.com/a/22210807/12899253 Edit: locally I also had to separate the logging-options parsing into a `load_logging_config()` function and used this in...
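To make that concrete, here is a minimal, self-contained sketch of what a separate `load_logging_config()` helper could look like; the actual configuration contents are placeholders, not what I used locally. The idea is that other modules call this helper (or import the module itself, per the linked answer) instead of importing option-parsing names directly, which is what introduced the cycle.

```python
import logging
import logging.config


def load_logging_config(level="INFO"):
    """Build and apply the logging configuration in one place, so that other
    modules can import this helper without pulling in the rest of the
    option-parsing machinery (breaking the circular import)."""
    logging.config.dictConfig(
        {
            "version": 1,
            "disable_existing_loggers": False,
            "formatters": {
                "default": {"format": "%(asctime)s %(name)s %(levelname)s: %(message)s"}
            },
            "handlers": {
                "console": {"class": "logging.StreamHandler", "formatter": "default"}
            },
            "root": {"level": level, "handlers": ["console"]},
        }
    )


if __name__ == "__main__":
    load_logging_config("DEBUG")
    logging.getLogger(__name__).debug("logging configured")
```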
@albi3ro has this been resolved?
@josh146 this seems to be available by now, right?
@josh146 oh that's a great point! I think so :thinking: Should the `QubitDevice` caching ability be deprecated then? We could have it on the roadmap.
Cool, added it :+1:
Hi @vbelis, thanks for the report! Indeed, this is something that the plugin does under the hood; it's historical behaviour. Have you come across any downsides...
Thanks for the details @vbelis! Indeed, the result that we get will be a larger bitstring that then has to be post-processed. The change you suggest could indeed tidy up...
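For illustration, here is a small numpy-only sketch of that post-processing step; the wire labels, sample shape, and the random array standing in for the device output are all made up for the example:

```python
import numpy as np

# Suppose the device returns shots over more wires than the circuit uses
# (e.g. 5 physical wires, while we only care about wires 0 and 2). The raw
# samples then carry extra columns that need to be trimmed away.
raw_samples = np.random.randint(0, 2, size=(1000, 5))  # placeholder for device output

wires_of_interest = [0, 2]                   # hypothetical subset of wires
trimmed = raw_samples[:, wires_of_interest]  # keep only the relevant bits

# e.g. estimate <Z> on wire 0 from the trimmed bitstrings (0 -> +1, 1 -> -1)
expval_z0 = np.mean(1 - 2 * trimmed[:, 0])
print(expval_z0)
```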
Hi @vbelis, that occurs in the [`QubitDevice.statistics`](https://github.com/PennyLaneAI/pennylane/blob/master/pennylane/_qubit_device.py#L355) method in the PennyLane repository and is handled by more specific methods like [`QubitDevice.expval`](https://github.com/PennyLaneAI/pennylane/blob/master/pennylane/_qubit_device.py#L763). Note that [`QubitDevice.sample`](https://github.com/PennyLaneAI/pennylane/blob/d201dd52def0dfa44efd485e06ea06defda22dc0/pennylane/_qubit_device.py#L815) will use the underlying samples stored in...
The PR was closed automatically as the `dev` branch was recreated. I re-opened the PR and it now targets `master`. If the demo does use any post-v0.26.0 features from PennyLane, the target...