Ultra simplified "MNIST" in 60 lines of Python with NumPy (github.com/tonio-m)
37 points by tonio on July 11, 2024 | hide | past | favorite | 6 comments


That reminds me of the 2015 article "A Neural Network in 11 Lines of Python" (although it wasn't trained on MNIST, rather on a kind of XOR problem):

https://iamtrask.github.io/2015/07/12/basic-python-network/
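For reference, a minimal sketch in the spirit of that post (not the article's exact code): a two-layer network trained on XOR with the backward pass written out by hand. The hidden-layer size and iteration count here are my own choices.

```python
import numpy as np

# Two-layer network on XOR, iamtrask-style: explicit backprop, no autograd.
np.random.seed(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

W0 = 2 * np.random.random((2, 4)) - 1   # input -> hidden weights
W1 = 2 * np.random.random((4, 1)) - 1   # hidden -> output weights

for _ in range(10000):
    l1 = sigmoid(X @ W0)                           # forward: hidden layer
    l2 = sigmoid(l1 @ W1)                          # forward: output layer
    l2_delta = (y - l2) * l2 * (1 - l2)            # error * sigmoid'
    l1_delta = (l2_delta @ W1.T) * l1 * (1 - l1)   # backprop through W1
    W1 += l1.T @ l2_delta                          # full-batch update
    W0 += X.T @ l1_delta
```

The key trick (as in the article) is that the weight updates fall out of two matrix products per layer, so the whole backward pass fits in four lines.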


Beautiful! Same in spirit as what I was going for, but even better. I really like the way the weights are updated in this one. Thanks for showing me this!


Tinygrad offers a superior MNIST implementation with minimal dependencies[0].

[0] https://github.com/tinygrad/tinygrad/blob/master/docs/mnist....


Autograd hides the backward pass in that implementation, but this code spells it out explicitly.
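To illustrate the difference (this is not the repo's actual code): with softmax plus cross-entropy, the hand-derived gradient with respect to the logits is just `probs - one_hot`, which an autograd framework would compute for you behind the scenes. The toy data and step count below are assumptions for the sketch.

```python
import numpy as np

# Softmax regression with the backward pass written out explicitly.
np.random.seed(1)
X = np.random.randn(8, 4)              # 8 samples, 4 features (toy data)
y = np.random.randint(0, 3, size=8)    # 3 classes
W = np.zeros((4, 3))

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

one_hot = np.eye(3)[y]
for _ in range(200):
    probs = softmax(X @ W)                      # forward pass
    grad = X.T @ (probs - one_hot) / len(X)     # backward pass, by hand
    W -= 0.5 * grad                             # plain gradient descent

loss = -np.mean(np.log(probs[np.arange(8), y]))
```

With autograd you'd only write the forward pass and the loss; the `grad` line is the part that gets hidden.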


The description: "no dependencies."

The first line: "import numpy as np"


You got me, lol. To be fair, the only reason NumPy is there is that truly pure Python doesn't have matmuls, and I didn't think reimplementing them would serve a didactic purpose.

A cool idea for a v2 without NumPy would be implementing matmuls with lists, matrix transpose with zip(), and swapping np.exp for math.exp. And getting all that in as few lines as possible.
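A rough sketch of those building blocks (my own take on the suggestion, not code from the repo):

```python
import math

# Pure-Python matmul: zip(*B) transposes B so each column becomes a row,
# then each output entry is a dot product of a row of A with a column of B.
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# math.exp in place of np.exp (works elementwise only, one float at a time).
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))   # [[19, 22], [43, 50]]
```

The catch is that without NumPy's vectorization, everything becomes per-element loops, so the sigmoid and its derivative would need to be mapped over nested lists as well.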



