Low accuracy when using DPAdamGaussianOptimizer
When I train on MNIST data using DPAdamGaussianOptimizer, I get very low accuracy, but it works fine when using DP-SGD.
Code:
import tensorflow as tf
from tensorflow_privacy.privacy.optimizers.dp_optimizer import DPAdamGaussianOptimizer

optimizer = DPAdamGaussianOptimizer(
    l2_norm_clip=1.5,
    noise_multiplier=1.3,
    num_microbatches=250,
    learning_rate=0.25)
loss = tf.keras.losses.CategoricalCrossentropy(
    from_logits=True, reduction=tf.losses.Reduction.NONE)
model.compile(optimizer=optimizer, loss=loss, metrics=['accuracy'])
model.fit(train_data, train_labels,
          epochs=10,
          validation_data=(test_data, test_labels),
          batch_size=250)
Result:
Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 80s 1ms/sample - loss: 2.3770 - acc: 0.0837 - val_loss: 2.3485 - val_acc: 0.1126
Epoch 2/10
60000/60000 [==============================] - 79s 1ms/sample - loss: 2.3523 - acc: 0.1088 - val_loss: 2.3477 - val_acc: 0.1134
Epoch 3/10
60000/60000 [==============================] - 77s 1ms/sample - loss: 2.3502 - acc: 0.1110 - val_loss: 2.3854 - val_acc: 0.0757
Epoch 4/10
60000/60000 [==============================] - 77s 1ms/sample - loss: 2.4002 - acc: 0.0610 - val_loss: 2.3855 - val_acc: 0.0756
Epoch 5/10
60000/60000 [==============================] - 76s 1ms/sample - loss: 2.3667 - acc: 0.0945 - val_loss: 2.3713 - val_acc: 0.0898
Epoch 6/10
60000/60000 [==============================] - 78s 1ms/sample - loss: 2.3687 - acc: 0.0925 - val_loss: 2.3631 - val_acc: 0.0980
Accuracy hovering around 10% on MNIST is random-guessing level across the 10 classes, which suggests the optimizer is diverging rather than learning. Your learning rate for the Adam optimizer seems large; you could try 0.001, which I have often seen recommended as Adam's default.
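For example, a minimal sketch of the same setup with only the learning rate changed (the import path assumes the tensorflow_privacy package; every other value is taken from your snippet):

from tensorflow_privacy.privacy.optimizers.dp_optimizer import DPAdamGaussianOptimizer

# Same clipping/noise/microbatch settings as before; only the
# learning rate is lowered to 0.001, Adam's commonly used default.
optimizer = DPAdamGaussianOptimizer(
    l2_norm_clip=1.5,
    noise_multiplier=1.3,
    num_microbatches=250,
    learning_rate=0.001)

Intuitively, Adam's adaptive scaling makes its per-parameter step size roughly proportional to the learning rate itself, so 0.25 is a far more aggressive setting for Adam than for plain SGD, where the step also shrinks with the magnitude of the clipped, noised gradient.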