Huizerd

8 comments by Huizerd

I think this can be implemented rather easily by adding a `detach` field to e.g. `LIFParameters`, where `detach=False` would mean BPTT and `detach=True` would mean e-prop or local learning. As...

@skiwan I don't follow you, could you elaborate/give an example? :) How could you specify when to block/not block gradients with the same call to the threshold?

e-prop only blocks gradients between timesteps (for the reset for instance), not between layers. Think of it as a 1-step truncated BPTT. So @Jegp I think this shouldn't be a...

> I see, that makes a lot of sense, actually. And if it's only related to the previous spikes, wouldn't it be possible to even create a wrapper module that...

> @smalltimer Thanks for the input. I think I'd agree with the separation, but I would also enjoy some form of hierarchy to ensure that it's still accessible for new...

@Jegp Sorry, I agree. The current `LIFCell` etc. indeed resemble the RNN modules in PyTorch. If someone wanted a recurrent layer with custom synapses, they could make one themselves...
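One way such a do-it-yourself recurrent layer could look, as a hedged sketch (illustrative names, not Norse's API): feed the previous output back through an extra synapse, much like PyTorch's `RNNCell` combines input-to-hidden and hidden-to-hidden weights.

```python
import torch

# Hypothetical wrapper: turn a feedforward (input, state) -> (output, state)
# cell into a recurrent layer with a custom recurrent synapse.
class RecurrentWrapper(torch.nn.Module):
    def __init__(self, cell, features):
        super().__init__()
        self.cell = cell
        # the custom synapse: swap in any module of matching shape
        self.recurrent = torch.nn.Linear(features, features, bias=False)

    def forward(self, x, state=None):
        prev_out, cell_state = state if state is not None else (None, None)
        if prev_out is not None:
            x = x + self.recurrent(prev_out)  # add recurrent input
        out, cell_state = self.cell(x, cell_state)
        # carry the output along so the next step can feed it back
        return out, (out, cell_state)
```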

@ChauhanT OK, why would you want to bundle synapses with their postsynaptic neuron? Why not just use something `SequentialState`-like, with only the basic building blocks in Norse...
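To make the composition idea concrete, a minimal `SequentialState`-like container might look as follows. This is a simplified sketch, not Norse's actual implementation; the stateless-module dispatch via `TypeError` is a deliberate shortcut for illustration.

```python
import torch

# Sketch of a SequentialState-like container: stateful modules take
# (input, state) and return (output, state); plain modules (e.g. Linear,
# acting as synapses) are applied directly.
class SimpleSequentialState(torch.nn.Module):
    def __init__(self, *modules):
        super().__init__()
        self.layers = torch.nn.ModuleList(modules)

    def forward(self, x, states=None):
        if states is None:
            states = [None] * len(self.layers)
        new_states = []
        for layer, s in zip(self.layers, states):
            try:
                x, s = layer(x, s)
            except TypeError:
                # stateless module: simplistic dispatch for this sketch
                x = layer(x)
                s = None
            new_states.append(s)
        return x, new_states
```

This keeps synapses and neurons as separate, freely combinable building blocks instead of bundling them into one module.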

@djsaunde any progress on this? I'll start looking into it, because I'm working with pretty small networks and GPUs won't give you much of an advantage there. This seems to...