Igor Shilov

20 comments by Igor Shilov

Is there any benefit other than code clarity? Like memory or support for higher dimensions? If yes, I'd say it's fine to leave it as an alternative grad_sampler - not wrap...
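For reference, this is roughly what registering an alternative grad sampler looks like in Opacus; the layer class and the gradient formula below are illustrative assumptions (they mirror the `nn.Linear` sampler), not the actual code under discussion:

```python
import torch
import torch.nn as nn
from opacus.grad_sample import register_grad_sampler


class MyCustomLinear(nn.Module):
    """Hypothetical stand-in for the layer under discussion."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x @ self.weight.T + self.bias


@register_grad_sampler(MyCustomLinear)
def compute_my_custom_linear_grad_sample(layer, activations, backprops):
    # Depending on the Opacus version, `activations` may arrive as a list
    # of tensors rather than a single tensor.
    if isinstance(activations, list):
        activations = activations[0]
    # Per-sample gradients via an einsum over the batch dimension,
    # the same pattern Opacus uses for nn.Linear.
    return {
        layer.weight: torch.einsum("n...i,n...j->nij", backprops, activations),
        layer.bias: torch.einsum("n...i->ni", backprops),
    }
```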

Hi! As of this moment, Opacus doesn't support fp16. This issue was created to track progress on adding that support. As of today it's not something planned for the near...

Note to the hero who'll be working on this: as part of the pull request for this issue, please re-enable the test for `DPMultiheadAttention` (`tests/dp_layers/dp_multihead_attention_test.py:test_attn`). We've temporarily disabled it until this...

Whoops, thanks for pointing out the old colab in the bug report template - fixed now. With full backward hooks, unfortunately, it's not as simple as just replacing the deprecated...
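For context, both hook APIs exist on `nn.Module`; the sketch below (with a deliberately simplified, hypothetical hook) only shows the surface-level swap, which by itself is not enough for the migration described above:

```python
import torch
import torch.nn as nn


def capture_backprops(module, grad_input, grad_output):
    # Hypothetical hook body; Opacus's real hooks do considerably more bookkeeping.
    module._captured_grad_output = [g.detach() for g in grad_output if g is not None]


# Deprecated API (what the old code relied on):
old_layer = nn.Linear(4, 2)
old_layer.register_backward_hook(capture_backprops)

# Replacement API: well-defined grad_input/grad_output semantics, but the two
# hook types cannot be mixed on one module and they fire under different rules,
# so the migration is not a mechanical find-and-replace.
new_layer = nn.Linear(4, 2)
new_layer.register_full_backward_hook(capture_backprops)

for layer in (old_layer, new_layer):
    layer(torch.randn(3, 4)).sum().backward()
```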

One idea on how to approach this: functional testing. Static analysis might prove very challenging, but running a batch and then comparing the results with individually run examples looks easy. We...
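To make the idea concrete, here is a rough sketch of such a functional check; the model, data, and tolerance are placeholders, and a real test would need to cover many more layer types:

```python
import copy

import torch
import torch.nn as nn
from opacus import GradSampleModule

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 4))  # placeholder model
criterion = nn.CrossEntropyLoss(reduction="sum")  # "sum" so per-sample grads match individual runs
x = torch.randn(5, 8)
y = torch.randint(0, 4, (5,))

# Batched pass: Opacus populates p.grad_sample with per-sample gradients.
gsm = GradSampleModule(copy.deepcopy(model), loss_reduction="sum")
criterion(gsm(x), y).backward()
batched = [p.grad_sample.detach().clone() for p in gsm.parameters()]

# Reference pass: run every example individually through an unwrapped copy
# and compare against the corresponding slice of grad_sample.
for i in range(x.shape[0]):
    single = copy.deepcopy(model)
    criterion(single(x[i : i + 1]), y[i : i + 1]).backward()
    for grad_sample, p in zip(batched, single.parameters()):
        assert torch.allclose(grad_sample[i], p.grad, atol=1e-6)  # tolerance is illustrative
```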

Small update: now that we have benchmarks available, there's an easy way to evaluate the impact of this optimization. See [this guide](https://github.com/pytorch/opacus/tree/main/benchmarks#usage) on how to run benchmarks.

It could - looking at the code, `_weight` and `_bias` are only used in `F.linear` calls, which indicates that `ExpandedWeight` should be able to handle it. Please note that ExpandedWeight...
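To illustrate the observation, here is a toy layer of that shape, where the custom `_weight` and `_bias` parameters only ever flow through `F.linear` (the class name and initialization are made up for the example):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProjectedLayer(nn.Module):
    """Toy stand-in for the layer discussed above: _weight and _bias are
    ordinary parameters that are only ever consumed by F.linear, so
    per-sample gradient machinery that understands F.linear (such as
    ExpandedWeight) should in principle be able to cover it."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self._weight = nn.Parameter(torch.empty(out_features, in_features))
        self._bias = nn.Parameter(torch.zeros(out_features))
        nn.init.xavier_uniform_(self._weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The only place the parameters are used: a plain F.linear call.
        return F.linear(x, self._weight, self._bias)
```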

Huh, this should not happen - it's an error in our validation code; thanks for helping us uncover that. A fix is coming shortly.

So the reason for the failing tests is that `padding="same"` introduces uneven padding, which previously wasn't supported by either PyTorch or Opacus. All numerical values of padding in convolution layers...
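To see why `padding="same"` forces uneven padding, consider an even kernel size; the sketch below (with arbitrary shapes) spells out the asymmetric `F.pad` that the convolution effectively performs:

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 10)       # (N, C, L), arbitrary shapes for illustration
weight = torch.randn(4, 3, 4)   # even kernel_size=4

# "same" output length requires total padding of kernel_size - 1 = 3,
# which cannot be split evenly: PyTorch pads 1 on the left and 2 on the right.
out_same = F.conv1d(x, weight, padding="same")

# Equivalent explicit form: asymmetric F.pad followed by an unpadded conv.
out_manual = F.conv1d(F.pad(x, (1, 2)), weight, padding=0)

assert out_same.shape[-1] == x.shape[-1]
assert torch.allclose(out_same, out_manual, atol=1e-6)
```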

@ashkan-software do you know why the `commit` CircleCI tests weren't triggered? LGTM overall; I can accept once all the tests and linters pass.