
Numerical stability is better when array sizes are powers of two

Open syncrostone opened this issue 3 years ago • 0 comments

Problem: when running the same code multiple times with the same seeds, small numerical differences arise over the course of training, so runs are not exactly reproducible. The differences disappear if all array sizes are powers of two.

Suggestion: use array sizes that are powers of two for now.
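For instance, here is a minimal sketch based on the PsychRNN quickstart, choosing power-of-two sizes for `N_batch` and `N_rec` (the specific task, values, and training call are illustrative, not prescriptive):

```python
from psychrnn.tasks.perceptual_discrimination import PerceptualDiscrimination
from psychrnn.backend.models.basic import Basic

# Pick power-of-two sizes (128, 64) rather than arbitrary ones
# (e.g. 100, 50) so runs with the same seed reproduce exactly.
task = PerceptualDiscrimination(dt=10, tau=100, T=2000, N_batch=128)

network_params = task.get_task_params()
network_params['name'] = 'model'
network_params['N_rec'] = 64  # power of two

model = Basic(network_params)
model.train(task)
```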

Eventually I would like to implement a workaround (if TensorFlow doesn't have a built-in one that can be activated): when an array size is not a power of two, build an array with power-of-two dimensions in the background and set the unneeded entries to 0. If this is relevant to you and you want to work on the workaround, please do (and drop a comment here so people don't duplicate work).
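A minimal NumPy sketch of that padding idea (the helpers `next_pow2` and `pad_to_pow2` are hypothetical names, not existing PsychRNN or TensorFlow API):

```python
import numpy as np

def next_pow2(n):
    """Smallest power of two >= n (for n >= 1)."""
    return 2 ** int(np.ceil(np.log2(n)))

def pad_to_pow2(arr):
    """Pad every dimension of `arr` up to the next power of two,
    filling the new entries with 0."""
    pad_widths = [(0, next_pow2(d) - d) for d in arr.shape]
    return np.pad(arr, pad_widths, mode="constant", constant_values=0)

# A (3, 100) weight matrix becomes (4, 128); the original values sit
# in the top-left block and the padded entries are all zero.
w = np.random.randn(3, 100)
w_padded = pad_to_pow2(w)
assert w_padded.shape == (4, 128)
assert np.array_equal(w_padded[:3, :100], w)
```

The TensorFlow equivalent would use `tf.pad`, slicing the padded result back to the original shape wherever the unpadded values are needed.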

syncrostone · Jul 13 '22 01:07