
What happens when we clip batch_size ?

Open RomainGoussault opened this issue 5 years ago • 2 comments

The batch size is clipped here --> https://github.com/SubstraFoundation/distributed-learning-contributivity/blob/b72fa98c0b4db45d368f577d0f6d1a861b1610c2/scenario.py#L584

When the batch size is clipped, we sometimes end up not using all the data in the dataset, yet the user gets no feedback about it. We should detect when this happens and inform the user.

RomainGoussault avatar Sep 11 '20 11:09 RomainGoussault

  • [ ] Update MAX_BATCH_SIZE (value to be determined)
  • [ ] Add a log when the clipping is triggered (log level INFO)

bowni avatar Sep 22 '20 10:09 bowni
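A minimal sketch of the logging behavior proposed in the checklist above. The `MAX_BATCH_SIZE` value and the function name `clip_batch_size` are placeholders for illustration, not the project's actual code:

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical ceiling; the actual value is still to be determined (see checklist).
MAX_BATCH_SIZE = 1024


def clip_batch_size(requested_batch_size: int) -> int:
    """Clip the batch size to MAX_BATCH_SIZE, logging at INFO level when clipping occurs."""
    if requested_batch_size > MAX_BATCH_SIZE:
        logger.info(
            "Requested batch size %d exceeds MAX_BATCH_SIZE %d; clipping.",
            requested_batch_size,
            MAX_BATCH_SIZE,
        )
        return MAX_BATCH_SIZE
    return requested_batch_size
```

With this, a run that would silently drop data instead leaves an INFO-level trace the user can act on.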

I think that MAX_BATCH_SIZE should be related to the available GPU memory, so it will be dataset dependent

arthurPignet avatar Feb 06 '21 15:02 arthurPignet
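One way to make the ceiling memory-aware, as suggested in the comment above, is to derive it from the GPU memory budget and the per-sample footprint. Everything here is an illustrative sketch (function name, `safety_factor` default, and the assumption that a fixed fraction of memory is reserved for activations, gradients and optimizer state are all hypothetical):

```python
def max_batch_size_from_memory(
    gpu_memory_bytes: int,
    bytes_per_sample: int,
    safety_factor: float = 0.5,
) -> int:
    """Estimate a memory-aware batch-size ceiling.

    Keeps only a fraction (safety_factor) of GPU memory for input batches,
    leaving the rest for activations, gradients and optimizer state, then
    divides that budget by the per-sample memory footprint.
    """
    usable_bytes = int(gpu_memory_bytes * safety_factor)
    return max(1, usable_bytes // bytes_per_sample)
```

For example, with 8 GiB of GPU memory, 4 MiB per sample, and half the memory reserved, the ceiling comes out to 1024 samples per batch, which makes the limit dataset dependent as noted above.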