David G. Clark
@david-g-clark.bsky.social
Theoretical neuroscientist
Research fellow @ Kempner Institute, Harvard
dclark.io
Pinned
Now in PRX: Theory linking connectivity structure to collective activity in nonlinear RNNs!
For neuro fans: conn. structure can be invisible in single neurons but shape pop. activity
For low-rank RNN fans: a theory of rank=O(N)
For physics fans: fluctuations around DMFT saddle⇒dimension of activity
Connectivity Structure and Dynamics of Nonlinear Recurrent Neural Networks
The structure of brain connectivity predicts collective neural activity, with a small number of connectivity features determining activity dimensionality, linking circuit architecture to network-level...
journals.aps.org
🆒
Our next paper on comparing dynamical systems (with particular interest in artificial and biological neural networks) is out!! Joint work with @annhuang42.bsky.social, as well as @satpreetsingh.bsky.social, @leokoz8.bsky.social, Ila Fiete, and @kanakarajanphd.bsky.social: arxiv.org/pdf/2510.25943
November 10, 2025 at 6:30 PM
Reposted by David G. Clark
The term “manifold” comes from “Mannigfaltigkeit,” which is German for “variety” or “multiplicity.” www.quantamagazine.org/what-is-a-ma...
November 8, 2025 at 9:04 PM
Reposted by David G. Clark
This was a lot of fun! From my side, it started with a technical Q: what's the relation between two-side cavity and path integrals? Turns out it's a fluctuation correction - and amazingly, this also enables the "O(N) rank" theory by @david-g-clark.bsky.social and @omarschall.bsky.social. 🤯
November 5, 2025 at 9:15 AM
Reposted by David G. Clark
Can confirm this was a fun project! My favorite takeaway is that the (low-but-extensive) rank of a network can be used as a knob for controlling dimensionality while leaving single-neuron properties unchanged.
November 4, 2025 at 3:53 PM
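A minimal numerical sketch of the "rank as a knob for dimensionality" takeaway above, using the participation ratio, a standard dimensionality measure in this literature. This is not the paper's model: the rank-r covariance construction (C = WᵀW with a random r×N projection W) and all parameter values are illustrative assumptions, chosen only to show that rank caps the participation ratio by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def participation_ratio(C):
    """PR = (tr C)^2 / tr(C^2): effective number of dominant covariance directions."""
    lam = np.linalg.eigvalsh(C)
    return lam.sum() ** 2 / (lam ** 2).sum()

N = 200  # number of neurons (arbitrary)
prs = {}
for r in (5, 50):
    # Toy rank-r population covariance: C = W^T W has rank <= r by construction,
    # so the participation ratio can never exceed r.
    W = rng.standard_normal((r, N)) / np.sqrt(N)
    prs[r] = participation_ratio(W.T @ W)
    print(f"rank {r}: PR = {prs[r]:.1f}")
```

Turning the rank up raises the ceiling on dimensionality while leaving each neuron's marginal statistics (i.i.d. Gaussian projections) unchanged, which is the flavor of the result described above.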
Reposted by David G. Clark
Simultaneous detection and estimation in olfactory sensing https://www.biorxiv.org/content/10.1101/2025.11.01.686013v1
November 3, 2025 at 11:15 PM
Reposted by David G. Clark
I did a Q&A with Quanta about interpretability and training dynamics! I got to talk about a bunch of research hobby horses and how I got into them.
September 24, 2025 at 1:57 PM
Reposted by David G. Clark
I’m super excited to finally put my recent work with @behrenstimb.bsky.social on bioRxiv, where we develop a new mechanistic theory of how PFC structures adaptive behaviour using attractor dynamics in space and time!

www.biorxiv.org/content/10.1...
September 24, 2025 at 12:04 PM
Reposted by David G. Clark
🎉 "High-dimensional neuronal activity from low-dimensional latent dynamics: a solvable model" will be presented as an oral at #NeurIPS2025 🎉

Feeling very grateful that reviewers and chairs appreciated concise mathematical explanations, in this age of big models.

www.biorxiv.org/content/10.1...
1/2
September 19, 2025 at 8:01 AM
A great reading list for historical+recent RNN theory
Just got back from a great summer school at Sapienza University sites.google.com/view/math-hi... where I gave a short course on Dynamics and Learning in RNNs. I compiled a (very biased) list of recommended readings on the subject, for anyone interested: aleingrosso.github.io/_pages/2025_...
Home
The school will open the thematic period on Data Science and will be dedicated to the mathematical foundations and methods for high-dimensional data analysis. It will provide an in-depth introduction ...
sites.google.com
September 15, 2025 at 12:25 PM
One for the books
Motor cortex flexibly deploys a high-dimensional repertoire of subskills https://www.biorxiv.org/content/10.1101/2025.09.07.674717v1
September 8, 2025 at 2:12 PM
Reposted by David G. Clark
I'm excited to share that my new postdoctoral position is going so well that I submitted a new paper at the end of my first week! www.biorxiv.org/content/10.1... A thread below
Sensory Compression as a Unifying Principle for Action Chunking and Time Coding in the Brain
The brain seamlessly transforms sensory information into precisely-timed movements, enabling us to type familiar words, play musical instruments, or perform complex motor routines with millisecond pre...
www.biorxiv.org
September 6, 2025 at 2:36 PM
(1/26) Excited to share a new preprint led by grad student Albert Wakhloo, with me and Larry Abbott: "Associative synaptic plasticity creates dynamic persistent activity."
www.biorxiv.org/content/10.1...
Associative synaptic plasticity creates dynamic persistent activity
In biological neural circuits, the dynamics of neurons and synapses are tightly coupled. We study the consequences of this coupling and show that it enables a novel form of working memory. In recurren...
www.biorxiv.org
August 25, 2025 at 5:17 PM
Wanted to share a new version (much cleaner!) of a preprint on how connectivity structure shapes collective dynamics in nonlinear RNNs. Neural circuits have highly non-iid connectivity (e.g., rapidly decaying singular values, structured singular-vector overlaps), unlike classical random RNN models.
Connectivity structure and dynamics of nonlinear recurrent neural networks
Studies of the dynamics of nonlinear recurrent neural networks often assume independent and identically distributed couplings, but large-scale connectomics data indicate that biological neural circuit...
arxiv.org
August 19, 2025 at 3:42 PM
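The contrast drawn above between classical i.i.d. couplings and structured connectivity can be made concrete with a small sketch. This is not the paper's construction: the power-law singular-value decay, the exponent, and the matrix sizes are illustrative assumptions, used only to show how "rapidly decaying singular values" differ from the flat spectrum of an i.i.d. random matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 300  # network size (arbitrary)

# Classical random-RNN assumption: i.i.d. Gaussian couplings (flat spectrum)
J_iid = rng.standard_normal((N, N)) / np.sqrt(N)

# Structured couplings: rapidly decaying singular values with random
# orthonormal singular vectors (power-law decay is an illustrative choice)
U, _ = np.linalg.qr(rng.standard_normal((N, N)))
V, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = np.arange(1, N + 1) ** -1.0
J_struct = U @ np.diag(s) @ V.T

def top_fraction(J, k=10):
    """Fraction of squared connectivity 'power' carried by the top k singular values."""
    sv = np.linalg.svd(J, compute_uv=False)  # returned in descending order
    return np.sum(sv[:k] ** 2) / np.sum(sv ** 2)

frac_iid, frac_struct = top_fraction(J_iid), top_fraction(J_struct)
print(f"top-10 fraction, iid: {frac_iid:.2f}, structured: {frac_struct:.2f}")
```

In the i.i.d. case the squared singular values are spread across all N modes, while the structured matrix concentrates almost all of its power in a handful of modes, the kind of low-dimensional connectivity structure connectomics data suggest.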
Reposted by David G. Clark
Very happy that my former mentor Sara Solla has received the Valentin Braitenberg Award for her lifelong contributions to computational neuroscience!

Sara will be giving a lecture at the upcoming @bernsteinneuro.bsky.social meeting which you shouldn't miss.

bernstein-network.de/en/newsroom/...
Sara A. Solla receives the Valentin Braitenberg Award for Computational Neuroscience 2025 – Bernstein Network Computational Neuroscience
bernstein-network.de
August 6, 2025 at 2:25 PM
Reposted by David G. Clark
Coming March 17, 2026!
Just got my advance copy of Emergence — a memoir about growing up in group homes and somehow ending up in neuroscience and AI. It’s personal, it’s scientific, and it’s been a wild thing to write. Grateful and excited to share it soon.
August 4, 2025 at 4:21 PM
Cool!
July 31, 2025 at 6:13 PM
Reposted by David G. Clark
#KempnerInstitute research fellow @andykeller.bsky.social and coauthors Yue Song, Max Welling and Nicu Sebe have a new book out that introduces a framework for developing equivariant #AI & #neuroscience models. Read more:

kempnerinstitute.harvard.edu/news/kempner...

#NeuroAI
Kempner Research Fellow Andy Keller Wants to Improve How AI Systems Represent a Dynamic World - Kempner Institute
Humans have a powerful ability to recognize patterns in a dynamic, ever-changing world, allowing for problem-solving and other cognitive abilities that are the hallmark of intelligent behavior. Yet de...
kempnerinstitute.harvard.edu
July 29, 2025 at 5:42 PM
Reposted by David G. Clark
When neurons change, but behavior doesn’t: Excitability changes driving representational drift

New preprint of work with Christian Machens: www.biorxiv.org/content/10.1...
Representational drift without synaptic plasticity
Neural computations support stable behavior despite relying on many dynamically changing biological processes. One such process is representational drift (RD), in which neurons' responses change over ...
www.biorxiv.org
July 29, 2025 at 2:02 PM
Reposted by David G. Clark
The summer schools at Les Houches are a magnificent tradition. I was honored to lecture there in 2023, and my notes now are published as "Ambitions for theory in the physics of life." #physics #physicsoflife scipost.org/SciPostPhysL...
SciPost: SciPost Phys. Lect. Notes 84 (2024) - Ambitions for theory in the physics of life
SciPost Journals Publication Detail SciPost Phys. Lect. Notes 84 (2024) Ambitions for theory in the physics of life
scipost.org
July 25, 2025 at 9:57 PM
Reposted by David G. Clark
Trying to train RNNs in a biologically plausible (local) way? Well, try our new method using predictive alignment. Paper just out in Nature Communications. Toshitake Asabuki deserves all the credit!
www.nature.com/articles/s41...
www.nature.com
July 23, 2025 at 12:10 PM
Reposted by David G. Clark
New in the #DeeperLearningBlog: #KempnerInstitute research fellow @andykeller.bsky.social introduces the first flow equivariant neural networks, which reflect motion symmetries, greatly enhancing generalization and sequence modeling.

bit.ly/451fQ48

#AI #NeuroAI
Flow Equivariant Recurrent Neural Networks - Kempner Institute
Sequence transformations, like visual motion, dominate the world around us, but are poorly handled by current models. We introduce the first flow equivariant models that respect these motion symmetrie...
bit.ly
July 22, 2025 at 1:21 PM
Excited!
July 16, 2025 at 7:05 PM