make_private with clipping parameter
🐛 Bug
Hi everyone, I was trying to use the make_private function to wrap my PyTorch training objects. I wanted to use the AdaClipDPOptimizer, and I found that make_private has a "clipping" parameter for selecting it. However, passing "adaptive" as the clipping parameter raises a TypeError. Link to Colab: https://colab.research.google.com/drive/1VivVsyU31onR1EAePuQQUMRzGGRgAi94?usp=sharing
To Reproduce
Steps to reproduce the behavior:
- Call the make_private function with the clipping parameter set to "adaptive" (see the sketch after the error description below)
The error is: TypeError: __init__() missing 5 required keyword-only arguments: 'target_unclipped_quantile', 'clipbound_learning_rate', 'max_clipbound', 'min_clipbound', and 'unclipped_num_std'. This happens because AdaClipDPOptimizer expects these additional parameters, but when the optimizer is instantiated in the _prepare_optimizer function, only the common DPOptimizer parameters are passed down.
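A minimal sketch that reproduces the failure, assuming a recent Opacus release (the model, optimizer, and data loader are toy placeholders):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy model, optimizer, and data: just enough to exercise make_private.
model = nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
dataset = TensorDataset(torch.randn(32, 10), torch.randint(0, 2, (32,)))
data_loader = DataLoader(dataset, batch_size=8)

privacy_engine = PrivacyEngine()

# Raises TypeError: __init__() missing 5 required keyword-only arguments,
# because _prepare_optimizer instantiates AdaClipDPOptimizer without them.
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    clipping="adaptive",
)
```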
Expected behavior
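make_private should instantiate AdaClipDPOptimizer and forward the adaptive-clipping parameters to it. Assuming the fix exposes them as extra keyword arguments on make_private, a call along these lines would be expected to succeed (the values are illustrative, not recommended settings):

```python
# Hypothetical post-fix call: the five AdaClipDPOptimizer keyword arguments
# are forwarded through make_private (values are illustrative only).
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
    clipping="adaptive",
    target_unclipped_quantile=0.5,
    clipbound_learning_rate=0.2,
    max_clipbound=10.0,
    min_clipbound=0.1,
    unclipped_num_std=1.0,
)
```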
Environment
Please copy and paste the output from our environment collection script (or fill out the checklist below manually).
You can get the script and run it with:
```
wget https://raw.githubusercontent.com/pytorch/pytorch/master/torch/utils/collect_env.py
# For security purposes, please check the contents of collect_env.py before running it.
python collect_env.py
```
- PyTorch Version (e.g., 1.0):
- OS (e.g., Linux):
- How you installed PyTorch (conda, pip, source):
- Build command you used (if compiling from source):
- Python version:
- CUDA/cuDNN version:
- GPU models and configuration:
- Any other relevant information:
Additional context
Thanks for raising this issue! It looks like it's just a matter of passing the parameters down to the optimizer? I can take a look during the week; if you want to make a PR, I'm happy to look at it as well.
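For concreteness, a rough sketch of the kind of change being suggested, using a simplified stand-in for Opacus's _prepare_optimizer (the real signature has more parameters; get_optimizer_class is the helper that maps "adaptive" to AdaClipDPOptimizer):

```python
# Simplified sketch of the suggested fix, not the actual Opacus source:
# accept extra keyword arguments and forward them to the optimizer class,
# so AdaClipDPOptimizer receives its five required adaptive-clipping kwargs.
def _prepare_optimizer(
    self,
    optimizer,
    *,
    noise_multiplier,
    max_grad_norm,
    expected_batch_size,
    clipping="flat",
    distributed=False,
    **kwargs,  # e.g. target_unclipped_quantile, clipbound_learning_rate, ...
):
    optim_class = get_optimizer_class(clipping=clipping, distributed=distributed)
    return optim_class(
        optimizer=optimizer,
        noise_multiplier=noise_multiplier,
        max_grad_norm=max_grad_norm,
        expected_batch_size=expected_batch_size,
        **kwargs,  # previously dropped; forwarding them fixes the TypeError
    )
```

make_private would need a matching **kwargs pass-through so the adaptive-clipping parameters can travel from the public API down to _prepare_optimizer.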
Hi @alexandresablayrolles, I'd like to fix this and make a PR. Am I still in time?
Absolutely!
Hi! I'm also trying to use adaptive clipping and getting the same error. Has this issue been fixed?
Is there any update on this issue? I would also like to apply adaptive clipping!
Hi @HuanyuZhang, I have opened a PR with the fix. Please let me know if any further changes are required.