
> Gradient descent and backpropagation don't take place in the brain.

Not exactly, no, but Hebbian learning (the "neurons that fire together wire together" rule) has a pretty similar effect.
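
The classic rate-based version of that rule is just dw = eta * pre * post: a weight grows whenever the neurons on both ends are active together. A minimal sketch in NumPy (the rule is textbook; the names and the normalization choice are mine, and none of this claims the brain literally computes it this way):

    import numpy as np

    rng = np.random.default_rng(0)
    eta = 0.1                     # learning rate
    w = rng.normal(size=3)        # weights from 3 presynaptic neurons

    for _ in range(100):
        pre = rng.random(3)       # presynaptic firing rates
        post = float(w @ pre)     # postsynaptic activity
        w += eta * post * pre     # Hebb: co-active units strengthen
        w /= np.linalg.norm(w)    # renormalize; plain Hebb grows without bound

Note the update uses only quantities available locally at the synapse.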

> LLMs "learn" in the same way that Excel sheets "learn".

I've never seen an Excel sheet do anything like backpropagation.



> I've never seen an Excel sheet do anything like backpropagation.

Not strictly in the sense you mentioned (assuming you mean "by themselves"), but people may find [1] and [2] interesting.

[1] https://pub.towardsai.net/building-a-neural-network-with-bac...

[2] https://towardsdatascience.com/demystifying-feed-forward-and...
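
For anyone who doesn't want to click through: the whole algorithm for a toy case fits on a screen. Here's a generic NumPy sketch (not the code from [1] or [2]) that trains a 2-4-1 sigmoid net on XOR with hand-written backprop:

    import numpy as np

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(10_000):
        # forward pass
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # backward pass: chain rule applied layer by layer
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
        W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

    print(out.round(2))  # should end up near [0, 1, 1, 0]

Each update is a handful of spreadsheet-expressible formulas, which is presumably why Excel versions like the ones linked above are possible at all.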


Sadly, I have seen one. It was a VBA script from the late 90s that used a simple dense multilayer network to do some unsupervised pattern classification. The linear algebra tools and solvers in VBA/Excel are native DLL code, and the VBA itself is all AOT-compiled to native, so it typically runs very fast; for small matrices it beats NumPy by an order of magnitude due to NumPy's FFI overhead. Was it the wrong tool? That depends on your constraints, but probably. It did work, though.
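
The overhead point is easy to see from Python's side. A quick sketch (generic; absolute numbers will vary by machine): the arithmetic in a 2x2 matmul is about a million times less than in a 200x200 one, but the measured times are nowhere near that far apart, because for tiny matrices the fixed per-call cost dominates. An in-process native path doesn't pay that per-call toll.

    import timeit
    import numpy as np

    for n in (2, 200):
        a = np.random.rand(n, n)
        b = np.random.rand(n, n)
        # time per call; for n=2 this is almost entirely dispatch/FFI
        # overhead rather than arithmetic
        t = timeit.timeit(lambda: a @ b, number=10_000)
        print(f"{n}x{n}: {t / 10_000 * 1e6:.2f} us/call")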


Hebbian learning and backprop are not comparable, and they don't have a similar effect in any meaningful sense. A Hebbian update is local and unsupervised: it uses only the activity of the two neurons the synapse connects. Backprop adjusts each weight using an error signal propagated backwards from a global loss, which is exactly the information a local rule doesn't have.
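
To make that concrete for a single weight w with post = w * pre: the Hebbian update uses only the two activities the synapse connects, while the gradient update needs an error term carried back from the loss. A sketch (toy numbers, squared-error loss):

    pre, w, target, eta = 0.8, 0.5, 1.0, 0.1
    post = w * pre

    # Hebbian: purely local; no target, no error signal
    dw_hebb = eta * pre * post

    # Gradient step on L = 0.5 * (post - target)**2: the update depends on
    # (post - target), information a local Hebbian rule never sees
    dw_grad = -eta * (post - target) * pre

    print(dw_hebb, dw_grad)

In a deep network that (post - target) term has to be propagated backwards through every layer, which is the machinery with no straightforward biological analogue.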



