Roman Pogodin
@rmnpogodin.bsky.social
Postdoc at Mila/McGill
We’ve done some cool work looking for learning algorithms in the brain! TL;DR: if you apply credit signals multiplicatively to each synapse, you satisfy Dale’s law, get log-normal weights, and also prune better and learn faster in very noisy setups
Why does #compneuro need new learning methods? ANN models are usually trained with Gradient Descent (GD), which violates biological realities like Dale’s law and log-normal weight distributions. Here we describe a superior learning algorithm for comp neuro: Exponentiated Gradient (EG)! 1/12 #neuroscience 🧪
October 29, 2024 at 8:32 PM
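To make "applying credit signals multiplicatively" concrete, here is a minimal sketch contrasting an additive GD step with an EG step; this is not the authors' code, and the noisy credit signal, learning rate, and weight initialization are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of exponentiated gradient (EG) vs gradient descent (GD).
# The "credit signal" g, learning rate, and initialization below are
# illustrative assumptions, not the paper's actual setup.
rng = np.random.default_rng(0)
w = np.abs(rng.normal(size=10_000))   # excitatory weights: all positive, per Dale's law
lr = 0.5
g = rng.normal(size=w.shape)          # noisy per-synapse credit signal

w_gd = w - lr * g                     # additive update: small weights can flip sign
w_eg = w * np.exp(-lr * g)            # multiplicative update: exp(.) > 0 preserves signs

print("GD sign flips:", np.sum(w_gd < 0))   # nonzero: Dale's law violated
print("EG sign flips:", np.sum(w_eg < 0))   # zero by construction

# Repeated multiplicative updates make log(w) a sum of increments, so the
# weight distribution drifts toward log-normal (a CLT argument in log-space).
for _ in range(200):
    w *= np.exp(-0.05 * rng.normal(size=w.shape))
```

Because the multiplier exp(-lr * g) is always positive, every synapse keeps its sign under EG, which is the Dale's-law-compatible behavior the thread highlights.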