
A tiny scalar-valued autograd engine and a neural net library on top of it with a PyTorch-like API

Results: 74 micrograd issues, sorted by recently updated

The `Layer` class special-cases single-output neurons: `return out[0] if len(out) == 1 else out`. This is nice for loss functions but currently breaks for intermediate layers...
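A minimal sketch of the failure mode the issue describes (plain floats stand in for micrograd `Value`s; the class bodies below are simplified, not the library's exact code): a single-neuron hidden layer returns a bare scalar instead of a list, so the next layer's `zip` over its inputs fails.

```python
import random

class Neuron:
    def __init__(self, nin):
        self.w = [random.uniform(-1, 1) for _ in range(nin)]
        self.b = 0.0
    def __call__(self, x):
        # zip(self.w, x) requires x to be a list of inputs
        return sum(wi * xi for wi, xi in zip(self.w, x)) + self.b

class Layer:
    def __init__(self, nin, nout):
        self.neurons = [Neuron(nin) for _ in range(nout)]
    def __call__(self, x):
        out = [n(x) for n in self.neurons]
        # the special case in question: single-output layers return a scalar
        return out[0] if len(out) == 1 else out

hidden = Layer(2, 1)   # single-neuron hidden layer
output = Layer(1, 3)

h = hidden([1.0, 2.0])  # a bare scalar, not a list, due to the special case
try:
    output(h)
except TypeError as e:
    print("broken:", e)  # zip cannot iterate over a scalar
```

This is convenient when the single output is the final loss, but an intermediate single-neuron layer produces a value the downstream layer cannot consume.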

I've added a `requires_grad` flag to the `Value` class that defaults to `False`, mimicking PyTorch's behavior. The topological sort of the computational graph can then stop expanding nodes that don't...
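A hedged sketch of the idea (hypothetical names, not the PR's actual diff): a `requires_grad` flag propagates forward from flagged leaves, and the topological sort skips subgraphs with no trainable leaves.

```python
class Value:
    """Minimal Value with a requires_grad flag; a sketch, not micrograd's code."""
    def __init__(self, data, _children=(), requires_grad=False):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)
        # a node needs gradients if it is flagged or any ancestor is
        self.requires_grad = requires_grad or any(c.requires_grad for c in _children)
        self._backward = lambda: None

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological sort that stops expanding nodes that don't require grad
        topo, visited = [], set()
        def build(v):
            if v not in visited and v.requires_grad:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

w = Value(3.0, requires_grad=True)  # a parameter
x = Value(2.0)                      # an input: its subgraph is never traversed
y = w * x
y.backward()
print(w.grad)  # dy/dw = x.data = 2.0
```

Skipping non-`requires_grad` subgraphs saves work on large inputs while still computing gradients for the parameters, which mirrors PyTorch's default of `requires_grad=False` for tensors.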

@karpathy Hi, great work... simple and clean. I was inspired and made my own mini deep learning library :P I'm thinking: can we extend this with GPU support? I will...

Adding higher-order gradients through small changes.

I have added an `exp` function using `math.exp` in `micrograd/engine.py`. I will add tests once the changes are approved. @karpathy, please review
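An `exp` op in micrograd's style might look like the sketch below (a minimal stand-in `Value` class is included so the snippet runs on its own; the PR's actual code may differ). The key point is the local gradient: d/dx exp(x) = exp(x), i.e. the output itself.

```python
import math

class Value:
    """Minimal stand-in for micrograd's Value, just enough to show an exp op."""
    def __init__(self, data, _children=(), _op=''):
        self.data = data
        self.grad = 0.0
        self._prev = set(_children)
        self._op = _op
        self._backward = lambda: None

    def exp(self):
        out = Value(math.exp(self.data), (self,), 'exp')
        def _backward():
            # chain rule: d/dx exp(x) = exp(x), so reuse out.data
            self.grad += out.data * out.grad
        out._backward = _backward
        return out

x = Value(0.0)
y = x.exp()
y.grad = 1.0
y._backward()
print(x.grad)  # exp(0) * 1 = 1.0
```

Following the same closure pattern as the existing ops keeps `backward()` unchanged: each new op only needs to define its local `_backward`.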

Added some functions to the `Value` class inside `engine.py`. Trained MNIST using the modified `engine.py`.

**Adds continuous integration ([on macOS/Linux/Windows with Python 3.6-8](https://github.com/fcakyon/micrograd/actions/runs/81750564)), PyPI publish and conda publish workflows using GitHub Actions.** Adds features: - automatically performs style checks and unit tests at every push/merge to...

Changed the multiplication-sign label to visualize better on the graph (centered alignment)

Hi Andrej, thanks for this excellent library! It may be useful not only for Python developers but also for C#, F#, Pascal, etc. developers, so I wrote a C# port...