Eva Yi Xie
@evayixie.bsky.social
Comp Neuro PhD student @ Princeton. Visiting Scientist @ Allen Institute. MIT’24
https://minzsiure.github.io
Thank you 😊 That means a lot to hear!
October 30, 2025 at 8:24 PM
9/9 Lastly, we thank colleagues at @alleninstitute.org and @cosynemeeting.bsky.social for their insightful feedback on an earlier version of this work! Happy to chat: [email protected]; [email protected].
October 30, 2025 at 3:01 PM
8/ @tyrellturing.bsky.social 's group recently showed that brain-like learning with exponentiated gradients naturally gives rise to log-normal connectivity distributions. Our results offer a theoretical perspective on the dynamical consequences of such heavy-tailed structures.
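(The mechanism is easy to see in a toy simulation; this is my own illustration, not the cited group's experiments. Exponentiated-gradient updates are multiplicative, so log-weights accumulate additive increments and the weight distribution drifts toward log-normal.)

```python
# Toy illustration (a sketch, not the cited group's setup): exponentiated-
# gradient updates are multiplicative, w <- w * exp(-eta * grad), so log(w)
# accumulates additive increments and, by the central limit theorem, the
# weights drift toward a log-normal distribution.
import numpy as np

rng = np.random.default_rng(0)
w = np.ones(100_000)                     # positive synaptic weights
eta = 0.05
for _ in range(500):
    grad = rng.standard_normal(w.size)   # stand-in for per-synapse gradients
    w *= np.exp(-eta * grad)

log_w = np.log(w)
# log(w) is approximately Gaussian, i.e. w is approximately log-normal.
print(f"log-weight mean {log_w.mean():.3f}, std {log_w.std():.3f}")
```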
October 30, 2025 at 2:59 PM
7/ For more details, the implications of our results for neuroscience 🧠 and machine learning 🤖, and exciting future directions, please check out our full paper or visit our poster at #NeurIPS2025:

🔗OpenReview: openreview.net/forum?id=J0S...
📍Code: github.com/AllenInstitu...
Slow Transition to Low-Dimensional Chaos in Heavy-Tailed Recurrent Neural Networks
Growing evidence suggests that synaptic weights in the brain follow heavy-tailed distributions, yet most theoretical analyses of recurrent neural networks (RNNs) assume Gaussian connectivity.
openreview.net
October 30, 2025 at 2:57 PM
6/ Conclusion: Our results reveal a biologically aligned tradeoff between the robustness of dynamics and the richness of neural activity, and they provide a tractable framework for understanding dynamics in realistically sized, heavy-tailed neural circuits.
October 30, 2025 at 2:57 PM
5/ ‼️Result 3: However, the robustness afforded by the slow transition comes with a tradeoff ↔️: heavier tails reduce the Lyapunov dimension of the network attractor, indicating lower effective dimensionality.
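(For reference, the Lyapunov dimension is standardly the Kaplan-Yorke estimate computed from the Lyapunov spectrum; a minimal sketch, with an example spectrum invented purely for illustration:)

```python
# The Lyapunov (Kaplan-Yorke) dimension from a Lyapunov spectrum.
# The example spectrum below is made up for illustration.
import numpy as np

def kaplan_yorke_dimension(lyap):
    """D = j + (sum of the j largest exponents) / |lyap[j]|, where j is the
    largest count whose cumulative sum is still non-negative."""
    lyap = np.sort(np.asarray(lyap))[::-1]   # descending order
    csum = np.cumsum(lyap)
    if csum[0] < 0:                # no expanding direction: dimension 0
        return 0.0
    if csum[-1] >= 0:              # cumulative sum never turns negative
        return float(len(lyap))
    j = int(np.argmax(csum < 0))   # first index where the sum goes negative
    return j + csum[j - 1] / abs(lyap[j])

spectrum = [0.12, 0.03, -0.05, -0.20, -0.60]
print(kaplan_yorke_dimension(spectrum))      # 3 + 0.10/0.20 = 3.5
```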
October 30, 2025 at 2:57 PM
4/ (Side note: The computational benefit of operating near the edge of chaos is well established for both feedforward and recurrent neural networks. In Appendix L we validate that this indeed translates into improved information processing in simple reservoir-computing tasks. 🤖🧠)
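(For intuition, here is a generic echo-state-style delayed-recall task, not the paper's Appendix L protocol; the network size, delay, and ridge regularization are placeholder choices:)

```python
# Generic reservoir-computing sketch (not the paper's Appendix L protocol):
# drive the heavy-tailed RNN with a random signal and train a linear readout
# to recall the input `delay` steps in the past. Readout quality typically
# peaks when the gain g places the network near the edge of chaos.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(3)
N, T, delay, alpha, g = 300, 3000, 5, 1.5, 1.0
J = g * levy_stable.rvs(alpha, 0.0, size=(N, N),
                        random_state=rng) / N ** (1.0 / alpha)
w_in = rng.standard_normal(N)

u = rng.uniform(-1.0, 1.0, T)        # random input stream
X = np.zeros((T, N))                 # reservoir state history
x = np.zeros(N)
for t in range(T):
    x = np.tanh(J @ x + w_in * u[t])
    X[t] = x

states, targets = X[delay:], u[:-delay]   # pair state at t with input at t-delay
n_train = 2000
A, y = states[:n_train], targets[:n_train]
# Ridge-regression readout.
w_out = np.linalg.solve(A.T @ A + 1e-4 * np.eye(N), A.T @ y)
pred = states[n_train:] @ w_out
print("test MSE:", np.mean((pred - targets[n_train:]) ** 2))
```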
October 30, 2025 at 2:56 PM
3/ 🔎Result 2: Compared to Gaussian networks, we find that finite heavy-tailed RNNs exhibit a broader gain regime near the edge of chaos: a *slow* transition to chaos. 🐢
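(One simple way to visualize this, under the same illustrative assumptions as the setup sketch in post 1/ below: sweep the gain and track a crude order parameter such as the stationary activity variance. For Gaussian weights it rises sharply with gain; for heavier tails the rise is more gradual.)

```python
# Illustrative sweep (a sketch, not the paper's code): compare how sharply
# the stationary activity variance rises with gain for Gaussian (alpha=2)
# versus heavy-tailed (alpha=1.5) weights.
import numpy as np
from scipy.stats import levy_stable

def activity_variance(J, g, dt=0.05, steps=3000, burn=1000, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.1 * rng.standard_normal(J.shape[0])
    acc = 0.0
    for t in range(steps):
        x += dt * (-x + g * (J @ np.tanh(x)))
        if t >= burn:
            acc += np.mean(x ** 2)
    return acc / (steps - burn)

N = 300
rng = np.random.default_rng(2)
for alpha in (2.0, 1.5):
    J = levy_stable.rvs(alpha, 0.0, size=(N, N),
                        random_state=rng) / N ** (1.0 / alpha)
    variances = [activity_variance(J, g) for g in np.linspace(0.5, 2.5, 9)]
    print(f"alpha={alpha}: " + " ".join(f"{v:6.2f}" for v in variances))
```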
October 30, 2025 at 2:56 PM
2/ 🔎Result 1: While mean-field theory for the infinite system predicts ubiquitous chaos, our analysis reveals that *finite-size* RNNs have a sharp transition between quiescent & chaotic dynamics.

We theoretically predict the transition gain and validate it through simulations.
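(A standard way to locate such a transition numerically is a two-trajectory, Benettin-style estimate of the largest Lyapunov exponent: chase two nearby states, renormalize their separation each step, and average the log expansion rate. A minimal sketch with illustrative parameters, not the paper's code:)

```python
# Benettin-style estimate of the largest Lyapunov exponent (illustrative
# parameters). The sign change of lambda_max marks the quiescent-to-chaos
# transition as the gain g increases.
import numpy as np
from scipy.stats import levy_stable

def largest_lyapunov(J, dt=0.05, steps=4000, eps=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(J.shape[0])
    y = x + eps * rng.standard_normal(J.shape[0])
    total = 0.0
    for _ in range(steps):
        x += dt * (-x + J @ np.tanh(x))
        y += dt * (-y + J @ np.tanh(y))
        d = np.linalg.norm(y - x)
        total += np.log(d / eps)
        y = x + (eps / d) * (y - x)   # renormalize the separation
    return total / (steps * dt)

N, alpha = 400, 1.5
rng = np.random.default_rng(1)
J0 = levy_stable.rvs(alpha, 0.0, size=(N, N),
                     random_state=rng) / N ** (1.0 / alpha)
for g in (0.5, 1.0, 1.5, 2.0):
    print(f"g={g:.1f}  lambda_max ~ {largest_lyapunov(g * J0):+.3f}")
```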
October 30, 2025 at 2:56 PM
1/ Setup: With @mihalas.bsky.social and Lukasz Kusmierz, we study RNNs with weights drawn from biologically plausible Lévy alpha-stable distributions, which generalize the Gaussian distribution to heavy tails.
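(For concreteness, a minimal simulation of this kind of setup; the 1/N^(1/alpha) weight scaling and all parameter values here are my assumptions, not necessarily the paper's conventions:)

```python
# Minimal sketch: a rate RNN with i.i.d. symmetric Levy alpha-stable weights
# (alpha=2 recovers the Gaussian case; smaller alpha means heavier tails).
import numpy as np
from scipy.stats import levy_stable

N, alpha, g = 500, 1.5, 1.0              # size, stability index, gain
rng = np.random.default_rng(0)
J = g * levy_stable.rvs(alpha, 0.0, size=(N, N),
                        random_state=rng) / N ** (1.0 / alpha)

# Standard rate dynamics dx/dt = -x + J tanh(x), forward-Euler integrated.
dt, steps = 0.05, 2000
x = 0.1 * rng.standard_normal(N)
for _ in range(steps):
    x += dt * (-x + J @ np.tanh(x))
print("final activity norm:", np.linalg.norm(x))
```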
October 30, 2025 at 2:55 PM