
low accuracy when using DPAdamGaussianOptimizer

carlodavid012 opened this issue 5 years ago · 1 comment

When I train on MNIST data using DPAdamGaussianOptimizer, I get very low accuracy, but it works fine when using DP-SGD.

Code:

import tensorflow as tf
from tensorflow_privacy.privacy.optimizers.dp_optimizer import DPAdamGaussianOptimizer

optimizer = DPAdamGaussianOptimizer(
    l2_norm_clip=1.5,
    noise_multiplier=1.3,
    num_microbatches=250,  # must evenly divide batch_size
    learning_rate=0.25)

# Per-example losses (reduction=NONE) are required so the optimizer
# can clip each microbatch's gradient before adding noise.
loss = tf.keras.losses.CategoricalCrossentropy(
    from_logits=True, reduction=tf.losses.Reduction.NONE)

model.compile(optimizer=optimizer, loss=loss, metrics=['accuracy'])

model.fit(train_data, train_labels,
          epochs=10,
          validation_data=(test_data, test_labels),
          batch_size=250)

Result:

Train on 60000 samples, validate on 10000 samples
Epoch 1/10
60000/60000 [==============================] - 80s 1ms/sample - loss: 2.3770 - acc: 0.0837 - val_loss: 2.3485 - val_acc: 0.1126
Epoch 2/10
60000/60000 [==============================] - 79s 1ms/sample - loss: 2.3523 - acc: 0.1088 - val_loss: 2.3477 - val_acc: 0.1134
Epoch 3/10
60000/60000 [==============================] - 77s 1ms/sample - loss: 2.3502 - acc: 0.1110 - val_loss: 2.3854 - val_acc: 0.0757
Epoch 4/10
60000/60000 [==============================] - 77s 1ms/sample - loss: 2.4002 - acc: 0.0610 - val_loss: 2.3855 - val_acc: 0.0756
Epoch 5/10
60000/60000 [==============================] - 76s 1ms/sample - loss: 2.3667 - acc: 0.0945 - val_loss: 2.3713 - val_acc: 0.0898
Epoch 6/10
60000/60000 [==============================] - 78s 1ms/sample - loss: 2.3687 - acc: 0.0925 - val_loss: 2.3631 - val_acc: 0.0980

carlodavid012 avatar Mar 04 '20 01:03 carlodavid012

Your learning rate for the Adam optimizer seems large. You could try 0.001, which is the commonly recommended default for Adam.
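To illustrate why the learning rate matters so much here, below is a minimal pure-Python sketch (not TF Privacy code) of the Adam update rule on a toy quadratic `f(x) = x^2`, with Gaussian noise added to each gradient as a stand-in for the noise DP training injects. The function name and all constants besides the two learning rates are made up for this sketch; the point is that with noisy gradients, the size of the region Adam ends up wandering around the optimum scales with the learning rate:

```python
import math
import random

def adam_on_noisy_quadratic(lr, steps=5000, noise_std=1.3, seed=0):
    """Run Adam on f(x) = x^2 with Gaussian gradient noise and
    return the mean |x| over the last 500 steps (distance from
    the optimum at x = 0)."""
    rng = random.Random(seed)
    x, m, v = 1.0, 0.0, 0.0
    b1, b2, eps = 0.9, 0.999, 1e-8  # standard Adam hyperparameters
    tail = []
    for t in range(1, steps + 1):
        g = 2 * x + rng.gauss(0.0, noise_std)  # noisy gradient of x^2
        m = b1 * m + (1 - b1) * g              # first-moment estimate
        v = b2 * v + (1 - b2) * g * g          # second-moment estimate
        mhat = m / (1 - b1 ** t)               # bias correction
        vhat = v / (1 - b2 ** t)
        x -= lr * mhat / (math.sqrt(vhat) + eps)
        if t > steps - 500:
            tail.append(abs(x))
    return sum(tail) / len(tail)

err_large = adam_on_noisy_quadratic(lr=0.25)   # the learning rate from the issue
err_small = adam_on_noisy_quadratic(lr=0.001)  # Adam's usual default
```

Because Adam normalizes updates by the gradient's running magnitude, each step has size roughly `lr` once the noise dominates, so with `lr=0.25` the iterate never settles near the optimum, while `lr=0.001` does.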

anon767 avatar May 26 '20 09:05 anon767