Spiros Chavlis
@spiroschav.bsky.social
Postdoc @dendritesgr.bsky.social
Passionate about math, neuro & AI

dendrites.gr
🌟 Overall, we show that implementing dendritic properties can significantly enhance the learning capabilities of ANNs, boosting accuracy and efficiency. These findings hold great promise for optimizing the sustainability and effectiveness of ML algorithms! 🧠✨ #AI #MachineLearning #Dendrites (14/14)
January 31, 2025 at 9:27 AM
🔍 Finally, we crafted challenging scenarios for traditional ANNs, starting with added noise and sequentially feeding batches of the same class. Our findings show that dANNs with RFs exhibit greater robustness, accuracy, and efficiency, especially as task difficulty increases. (13/14)
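The noise sweep described in the post above can be sketched in a few lines — a minimal NumPy stand-in, not the paper's actual evaluation code (the function name `add_noise` and the sigma values are illustrative assumptions):

```python
import numpy as np

def add_noise(images, sigma, rng):
    """Corrupt images with additive Gaussian noise, clipped back to [0, 1]."""
    noisy = images + rng.normal(0.0, sigma, size=images.shape)
    return np.clip(noisy, 0.0, 1.0)

rng = np.random.default_rng(0)
images = rng.random((4, 28 * 28))  # stand-in for Fashion-MNIST inputs

# Sweep increasing noise levels, as in a robustness test.
for sigma in (0.0, 0.1, 0.3, 0.5):
    noisy = add_noise(images, sigma, rng)
    # evaluate model accuracy on `noisy` here
```

In the paper's setting, accuracy would be measured at each noise level and compared between dANNs and vanilla ANNs.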
🔍 Rather than becoming class-specific early, dANNs show mixed selectivity in both layers. This yields more trustworthy representations, achieving high accuracy with less overfitting and fewer, fully utilized parameters. (12/14)
🔍 To understand dANNs' edge over vANNs, we analyzed weight distributions after training on Fashion MNIST. dANNs fully utilize their parameters, especially the dendrosomatic weights. Entropy and selectivity distributions also indicate different strategies for tackling the same classification task. (11/14)
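One simple way to quantify whether a network "fully utilizes" its parameters is the entropy of its weight histogram — a hedged sketch assuming a binned Shannon entropy (the paper's exact entropy analysis may differ):

```python
import numpy as np

def weight_entropy(weights, bins=30):
    """Shannon entropy (bits) of a layer's weight histogram.
    Higher entropy ~ weights spread over many values, i.e. fully utilized."""
    hist, _ = np.histogram(weights, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
spread = rng.normal(0.0, 1.0, 10_000)  # weights spread across values
peaked = np.zeros(10_000)              # weights collapsed onto one value
print(weight_entropy(spread) > weight_entropy(peaked))  # True
```

A collapsed weight distribution (many near-identical weights) has low entropy, signalling redundant, under-used parameters.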
🔍 Our findings highlight that structured connectivity and restricted input sampling in dANNs yield significant efficiency gains in image classification over classical vANNs! When comparing dANNs and pdANN to vANN, we found that RFs boost efficiency, but not to the extent of dANNs. (10/14)
📈 To validate the benefits of dendritic features, we tested dANN models on five benchmark datasets. Results showed that top dANN models matched or even outperformed the best vANNs in accuracy and loss! Additionally, dANNs proved significantly more efficient across all datasets. (9/14)
Our dANN and pdANN models show improved learning as network size grows, with lower loss and higher accuracy! More importantly, they maintain performance and stability as the number of layers increases. This reveals their potential for deeper architectures! 🧠💪 (8/14)
🔍 We explored three input sampling methods for dendritic ANN models (dANN): a) random (R), b) local receptive fields (LRF), and c) global receptive fields (GRF). We also included a fully connected sampling (F), calling it a partly-dendritic ANN (pdANN) 🧠✨. (7/14)
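The sampling schemes above can be sketched as boolean input masks — a toy 1-D illustration, not the paper's implementation (here "RF" stands in for both receptive-field variants; the paper's LRF/GRF distinction concerns where patches are placed):

```python
import numpy as np

def sampling_mask(method, n_dendrites, n_inputs, rf_size, rng):
    """Boolean mask (n_dendrites x n_inputs): which inputs each dendrite samples.
    'R' = random pixels, 'RF' = contiguous patch, 'F' = fully connected (pdANN)."""
    mask = np.zeros((n_dendrites, n_inputs), dtype=bool)
    for d in range(n_dendrites):
        if method == "R":
            idx = rng.choice(n_inputs, size=rf_size, replace=False)
        elif method == "RF":
            start = rng.integers(0, n_inputs - rf_size + 1)
            idx = np.arange(start, start + rf_size)
        elif method == "F":
            idx = np.arange(n_inputs)
        else:
            raise ValueError(f"unknown method: {method}")
        mask[d, idx] = True
    return mask

rng = np.random.default_rng(0)
mask = sampling_mask("RF", 4, 28, 7, rng)
print(mask.sum(axis=1))  # each dendrite sees exactly rf_size = 7 inputs
```

The key difference is how much of the input each dendrite sees: a small subset under R/RF versus everything under F.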
🌱 Our proposed architecture features partially sampled inputs fed into a structured dendritic layer, which connects sparsely to the somatic layer! 🧠✨

Inspired by the receptive fields of visual cortex neurons, this approach mimics locally connected networks. (6/14)
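Under the simplifying assumption of a single ReLU dendritic layer, the forward pass — restricted input sampling into dendrites, then sparse dendrosomatic pooling where each soma sees only its own dendrites — can be sketched like this (all sizes and names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n_inputs, n_soma, dends_per_soma, rf_size = 64, 4, 3, 8
n_dend = n_soma * dends_per_soma

# Each dendrite samples a restricted subset of inputs.
w_dend = np.zeros((n_dend, n_inputs))
for d in range(n_dend):
    idx = rng.choice(n_inputs, size=rf_size, replace=False)
    w_dend[d, idx] = rng.normal(size=rf_size)

# Sparse dendrosomatic weights: soma s pools only its own dendrites.
w_soma = np.zeros((n_soma, n_dend))
for s in range(n_soma):
    w_soma[s, s * dends_per_soma:(s + 1) * dends_per_soma] = rng.normal(size=dends_per_soma)

x = rng.random(n_inputs)
dend_out = np.maximum(w_dend @ x, 0.0)       # nonlinear dendritic integration
soma_out = np.maximum(w_soma @ dend_out, 0.0)  # somatic activation
```

Compared with a fully connected layer of the same width, most weights here are structurally zero — which is where the parameter-efficiency gains come from.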
Inspired by the work of @ilennaj.bsky.social and @kordinglab.bsky.social, doi.org/10.1162/neco..., we propose a bio-inspired dendritic architecture to enhance learning in ANNs using backpropagation! (5/14)
Might a Single Neuron Solve Interesting Machine Learning Problems Through Successive Computations on Its Dendritic Tree?
Physiological experiments have highlighted how the dendrites of biological neurons can nonlinearly process distributed synaptic inputs. However, it is unclear how aspects of a dendritic tree, such as ...
doi.org
🌿 Dendrites generate local regenerative events and mimic the spiking profile of a soma, acting like multi-layer ANNs! 🤯 doi.org/10.1016/s089...

Dendrites enable complex computations, like logical operations, signal amplification, and more 🧠💡
doi.org/10.1016/j.co...
www.nature.com/articles/s41... (4/14)
🧠 The biological brain processes, stores, and retrieves vast info quickly and efficiently, using minimal energy! ⚡️ Meanwhile, ML/AI systems are energy-hungry! 🤖💡 Our solution? Dendrites! 🌱✨(3/14)
Before we dive deeper into it, I would like to thank my amazing supervisor, @yiotapoirazi.bsky.social, for all her support and kindness during my 12 years at the @dendritesgr.bsky.social lab. The paper can be found @naturecomms.bsky.social www.nature.com/articles/s41... (2/14)
Dendrites endow artificial neural networks with accurate, robust and parameter-efficient learning - Nature Communications
Artificial neural networks, central to deep learning, are powerful but energy-consuming and prone to overfitting. The authors propose a network design inspired by biological dendrites, which offe...
www.nature.com