
handle single input neuron returned by layer with single output

jonasrauber opened this issue 4 years ago · 0 comments

The Layer class special cases single output neurons:

return out[0] if len(out) == 1 else out

This is nice for loss functions but currently breaks for intermediate layers with a single neuron.
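To see why, here is a minimal self-contained sketch (toy stand-ins, not micrograd itself; each "neuron" just sums its inputs in place of w·x + b) of how the special case makes a 1-neuron layer hand a bare Value to the next layer, which then fails when it tries to iterate over it:

```python
class Value:
    def __init__(self, data):
        self.data = data

class Layer:
    def __init__(self, nin, nout):
        self.nout = nout

    def __call__(self, x):
        # toy neuron: sum the inputs (stand-in for w·x + b)
        out = [Value(sum(v.data for v in x)) for _ in range(self.nout)]
        return out[0] if len(out) == 1 else out  # the special case

layer1 = Layer(2, 1)   # single-output layer: returns a bare Value
layer2 = Layer(1, 3)

h = layer1([Value(1.0), Value(-2.0)])
# layer2(h) raises TypeError: h is a Value, not a list,
# so the generator `for v in x` cannot iterate over it
```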

from micrograd.nn import MLP
from micrograd.engine import Value

# works
model = MLP(2, [16, 16, 1])
x = [Value(1.0), Value(-2.0)]
print(model(x))

# fails: intermediate layer with 1 neuron
model = MLP(2, [16, 1, 1])
x = [Value(1.0), Value(-2.0)]
print(model(x))

This PR fixes that by handling a single input Value correctly.

Note that for now I only check explicitly for Value instances, which is sufficient for intermediate layers.
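A hedged sketch of that fix, using the same toy classes as above (not the actual micrograd code): wrap a bare Value back into a list at the top of `Layer.__call__`, so a 1-neuron layer can feed the next one.

```python
class Value:
    def __init__(self, data):
        self.data = data

class Layer:
    def __init__(self, nin, nout):
        self.nout = nout

    def __call__(self, x):
        if isinstance(x, Value):  # bare Value from a preceding 1-neuron layer
            x = [x]
        # toy neuron: sum the inputs (stand-in for w·x + b)
        out = [Value(sum(v.data for v in x)) for _ in range(self.nout)]
        return out[0] if len(out) == 1 else out

h = Layer(2, 1)([Value(1.0), Value(-2.0)])  # bare Value, as before
y = Layer(1, 1)(h)                          # now works instead of raising
```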

To also handle a single input neuron at the network level, we would need to check for float and int scalars as well. I kept the change as minimal as possible for now, but I can add that if you want. (Technically, the most robust approach would be to check whether the input is iterable at all, but doing that correctly in modern Python requires a try statement.)
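For reference, the iterability check alluded to above could be sketched like this (a hypothetical helper, not part of the PR): calling `iter()` inside try/except is the robust route, since `isinstance(x, collections.abc.Iterable)` misses classes that are iterable only via `__getitem__`.

```python
def ensure_list(x):
    """Return x as a list, wrapping non-iterable inputs (scalars, a bare
    Value) into a one-element list."""
    try:
        return list(iter(x))
    except TypeError:
        return [x]  # not iterable: treat it as a single element
```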

jonasrauber · Jun 24 '21