Results: 8 issues by Jasmine Stone

The batch number is incremented once at the beginning of the training loop and then once within each loop, or twice when a curriculum is being used. Fix this by using different batch...

get_trial_batch() creates a new generator on each call, so the batch number is not incremented across successive calls. A local generator is used within the training loop so...
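A minimal sketch of the generator fix described above (class, method, and attribute names are assumptions for illustration, not the actual codebase): keep one persistent NumPy generator on the task object so successive calls advance the same random stream and the same batch counter, rather than re-seeding a fresh generator each time.

```python
import numpy as np

class Task:
    """Toy task illustrating a persistent RNG plus batch counter (hypothetical names)."""

    def __init__(self, seed=0):
        # One generator created once and reused, instead of a new
        # np.random.default_rng(seed) inside every get_trial_batch() call.
        self.rng = np.random.default_rng(seed)
        self.batch = 0

    def get_trial_batch(self, size=4):
        # Drawing from the shared generator advances its state, so
        # successive calls yield different batches and increment the counter.
        self.batch += 1
        return self.rng.standard_normal(size)

task = Task(seed=42)
a = task.get_trial_batch()
b = task.get_trial_batch()
assert not np.allclose(a, b)  # state advanced between calls
assert task.batch == 2        # counter incremented once per call
```

With a per-call generator seeded identically, `a` and `b` would be equal and the counter logic would never see the second call's state.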

Previously, there was a hard-coded delay between the end of the stimulus and the beginning of the output, but the mask started having the network match the output when...

Problem: when running the same code multiple times with the same seeds, there are small numerical differences that arise over the course of training. This is fixed if array sizes...

Implicitly differentiate through CG, as JAX does and as Rutten et al.'s GPFADS paper suggests.
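For reference, the implicit-function trick here: if x solves Kx = b, the backward pass can be computed as another CG solve against K (K symmetric), instead of backpropagating through the CG iterations. JAX's `jax.scipy.sparse.linalg.cg` already does this; a minimal sketch on a small SPD system (the matrix and loss are made up for illustration):

```python
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

# A small symmetric positive-definite system K x = b.
K = jnp.array([[4.0, 1.0],
               [1.0, 3.0]])

def loss(b):
    # cg is differentiated implicitly: the VJP is another CG solve,
    # not an unrolled backward pass through the iterations.
    x, _ = cg(lambda v: K @ v, b)
    return jnp.sum(x ** 2)

b = jnp.array([1.0, 2.0])
g = jax.grad(loss)(b)

# Closed-form check: x = K^{-1} b, so dL/db = 2 K^{-1} x.
x_exact = jnp.linalg.solve(K, b)
g_exact = 2.0 * jnp.linalg.solve(K, x_exact)
```

The payoff is memory: the unrolled version stores every CG iterate for reverse mode, while the implicit version only needs one extra solve.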

K_half is built to allow different time points for different trials, but we should probably build in functionality to avoid redundant calculations when the time points are in fact identical. _Originally...
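One way the redundancy could be avoided (a sketch under assumed names; `rbf_kernel` and `k_half_per_trial` are hypothetical, not the codebase's API): key a cache on each trial's time grid, so the matrix-square-root computation runs once per unique grid and identical trials share the result.

```python
import numpy as np

def rbf_kernel(ts, lengthscale=1.0):
    """Squared-exponential kernel on a 1-D grid of time points."""
    d = ts[:, None] - ts[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def k_half_per_trial(trial_times, jitter=1e-6):
    """Return one Cholesky factor per trial, computing each unique
    time grid only once (cache keyed on the grid's raw bytes)."""
    cache = {}
    out = []
    for ts in trial_times:
        ts = np.asarray(ts, dtype=float)
        key = ts.tobytes()
        if key not in cache:
            K = rbf_kernel(ts)
            cache[key] = np.linalg.cholesky(K + jitter * np.eye(len(ts)))
        out.append(cache[key])
    return out

ts = np.linspace(0.0, 1.0, 5)
factors = k_half_per_trial([ts, ts, ts + 0.1])
assert factors[0] is factors[1]      # identical grids share one factor
assert factors[0] is not factors[2]  # a different grid is recomputed
```

For B trials on a shared grid of T points this turns B Cholesky factorizations (O(B T^3)) into one.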

This PR adds GPFA kernels (tested) and an associated example notebook. I made my best guesses to keep things stylistically in line with the rest of the codebase -- please...