Difference in accuracy between setting epsilon to 10^6 and training the model without Opacus
🐛 Bug
I trained my model in two variations: one with epsilon set to 10^6, and one trained plainly without Opacus. I find that the results of model 1 are much better than those of model 2 in terms of AUROC. I have the following questions:
- What is the reason for this, given that setting epsilon to 10^6 is almost equivalent to infinity?
- Does the privacy engine have an option to set epsilon to an infinite value (in other words, no DP), rather than setting a large value manually?
- If not, will such an option be introduced in the future? And what parameters do I need to set in the privacy engine to achieve no DP (e.g., gradient clipping, noise)?
Thank you
Please reproduce using our template Colab and post here the link
To Reproduce
:warning: We cannot help you without you sharing reproducible code. Do not ignore this part :) Steps to reproduce the behavior:
Expected behavior
Environment
Please copy and paste the output from our environment collection script (or fill out the checklist below manually).
You can get the script and run it with:
```shell
wget https://raw.githubusercontent.com/pytorch/pytorch/master/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py
```
- PyTorch Version (e.g., 1.0):
- OS (e.g., Linux):
- How you installed PyTorch (`conda`, `pip`, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
Additional context
Thanks for raising this issue. There are two differences between non-private and private training: clipping and noising. If you want epsilon to be effectively infinite, you can set noise_multiplier=0 in the call to make_private.
I believe the difference you observe is due to clipping (max_grad_norm). You can test this by also setting the clipping threshold to a very high value, or by checking how several values of the clipping threshold affect accuracy/AUROC.
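To illustrate why noise_multiplier=0 alone does not recover non-private training, here is a minimal, self-contained sketch of DP-SGD-style per-sample gradient aggregation (plain Python, not the actual Opacus internals): each per-sample gradient is clipped to max_grad_norm, the clipped gradients are summed, and Gaussian noise scaled by noise_multiplier * max_grad_norm is added. With noise_multiplier=0 the noise vanishes, but clipping still alters any gradient whose norm exceeds the threshold; only a very large max_grad_norm makes the result match the plain sum.

```python
import math
import random

def clip_and_noise(per_sample_grads, max_grad_norm, noise_multiplier, seed=0):
    """DP-SGD-style aggregation sketch: clip each per-sample gradient
    to max_grad_norm, sum the clipped gradients, then add Gaussian
    noise with std = noise_multiplier * max_grad_norm per coordinate.
    noise_multiplier=0 disables noising but NOT clipping."""
    rng = random.Random(seed)
    clipped = []
    for g in per_sample_grads:
        norm = math.sqrt(sum(x * x for x in g))
        # Scale down only if the gradient norm exceeds the threshold.
        scale = min(1.0, max_grad_norm / (norm + 1e-12))
        clipped.append([x * scale for x in g])
    summed = [sum(col) for col in zip(*clipped)]
    sigma = noise_multiplier * max_grad_norm
    return [s + rng.gauss(0.0, sigma) for s in summed]

grads = [[3.0, 4.0], [0.6, 0.8]]  # per-sample gradients (norms 5.0 and 1.0)

# noise_multiplier=0 and a huge max_grad_norm: matches the plain sum.
print(clip_and_noise(grads, max_grad_norm=1e6, noise_multiplier=0.0))

# noise_multiplier=0 but max_grad_norm=1: the first gradient is still
# clipped from norm 5 down to norm 1, so the update differs.
print(clip_and_noise(grads, max_grad_norm=1.0, noise_multiplier=0.0))
```

This is why sweeping max_grad_norm (not just epsilon) is the right experiment: with no noise, the entire gap between the two runs comes from how aggressively per-sample gradients are clipped.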