David G. Clark
@david-g-clark.bsky.social
Theoretical neuroscientist
Research fellow @ Kempner Institute, Harvard
dclark.io
For fans, like me, of working with @omarschall.bsky.social, @avm.bsky.social, and Ashok Litwin-Kumar: a great time!
November 3, 2025 at 9:47 PM
(26/26) Finally, one more HUGE shoutout to Albert Wakhloo for conceiving, calculating, and charting our way through this fascinating project.

(Link, again: www.biorxiv.org/content/10.1...)
August 25, 2025 at 5:17 PM
(25/26) This work emphasizes that understanding memory-related neural activity requires modeling synaptic and neuronal dynamics together. Separating these processes, while convenient, can obscure circuit functions. Coupling enables new forms of computation beyond what either process achieves alone.
August 25, 2025 at 5:17 PM
(24/26) This mechanism is evocative of experimental findings in motor cortex and sensory areas that reveal apparent constraints on neural activity patterns during learning (e.g., from Yu, Batista, Chase, et al.).
August 25, 2025 at 5:17 PM
(23/26) Some concluding thoughts. In many models of synaptic plasticity-based learning, weight updates simply overwrite existing connectivity. Our model shows that plasticity can instead shape dynamics dramatically by harnessing the dynamical reservoir provided by the static backbone connectivity.
August 25, 2025 at 5:17 PM
(22/26) Furthermore, studies report persistent oscillations following periodic stimuli & phase-locking to LFP oscillations during WM tasks, interpreted as evidence for intrinsic oscillatory circuitry. Our results suggest this may arise via ongoing plasticity, without preexisting circuit structure.
August 25, 2025 at 5:17 PM
(21/26) What about experimental links? Many working-memory (WM) studies report complex dynamic activity following stimulus cessation, which does not align neatly with "sustained firing" WM theories. Our results suggest such activity could be generated by Hebbian plasticity, the same mechanism underlying canonical WM models.
August 25, 2025 at 5:17 PM
(20/26) In sum, we have arrived at a conceptual understanding of, and analytical solution to, the behavior of coupled neuronal-synaptic dynamics in a nonlinear, input-driven recurrent network. In particular, we have shown that this behavior enables a useful computational function: dynamic memory.
August 25, 2025 at 5:17 PM
(19/26) Furthermore, we show that, while Ψ is not in general an eigenvector of J, it becomes an increasingly good approximate eigenvector as ν → g⁺. Thus, the mechanism is essentially the same as in the targeted case.
August 25, 2025 at 5:17 PM
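The approximate-eigenvector claim above can be seen from the exact identity JΨ = νΨ − eⁱᶿ (apply (νI − J) to Ψ): the residual is just the input vector, so the relative residual ‖JΨ − νΨ‖/‖Ψ‖ shrinks as the resolvent norm ‖Ψ‖ grows near the spectral edge. A minimal NumPy sketch of this, with illustrative parameters and ν taken real just above g (a simplification; in the paper ν is complex):

```python
import numpy as np

rng = np.random.default_rng(1)
N, g = 400, 1.0
J = rng.normal(0.0, g / np.sqrt(N), (N, N))    # bulk spectrum fills a disk of radius ~g
v = np.exp(1j * rng.uniform(0, 2 * np.pi, N))  # random-phase input vector e^{i*theta}

def rel_residual(nu):
    """Relative eigenvector residual ||J Psi - nu Psi|| / ||Psi||."""
    Psi = np.linalg.solve(nu * np.eye(N) - J, v)  # Psi = (nu*I - J)^{-1} v
    return np.linalg.norm(J @ Psi - nu * Psi) / np.linalg.norm(Psi)

# Residual shrinks as nu approaches the spectral edge g from above:
print([round(rel_residual(nu), 3) for nu in (2.0, 1.5, 1.2, 1.05)])
```

Since JΨ − νΨ = −v exactly, the residual equals √N/‖Ψ‖, and ‖Ψ‖ blows up as ν → g⁺, so the printed values decrease toward zero.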
(18/26) Using this approximation, we derive exact large-N expressions for outlier eigenvalues λ = g²ν/|ν|² + α/(|ν|² - g²). This correctly predicts the full phenomenology of persistent oscillations, including amplitude and frequency dependence, preferred frequency bands, and regime transitions.
August 25, 2025 at 5:17 PM
(17/26) Let us now return to the full, random-phase input case. We approximate A(0) ≈ 2α Re{ΨΨ†} where Ψ = (νI − J)⁻¹eⁱᶿ. Here, eⁱᶿ is a vector containing input phases θᵢ for each neuron; ν is a complex scalar that depends on the system parameters; and α = kI²/4.
August 25, 2025 at 5:17 PM
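The A(0) approximation can be sketched in a few lines of NumPy. Parameter values below are illustrative, and ν is fixed by hand rather than computed self-consistently from the system parameters as in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, g = 400, 0.8          # network size; backbone J has i.i.d. entries of variance g^2/N
nu = 0.9 + 0.3j          # illustrative complex value (in the paper, set by system parameters)
alpha = 0.25             # alpha = k*I^2/4 for plasticity rate k and input amplitude I

J = rng.normal(0.0, g / np.sqrt(N), (N, N))   # static backbone connectivity
theta = rng.uniform(0.0, 2 * np.pi, N)        # random input phase per neuron
Psi = np.linalg.solve(nu * np.eye(N) - J, np.exp(1j * theta))  # Psi = (nu*I - J)^{-1} e^{i*theta}

A0 = 2 * alpha * np.real(np.outer(Psi, Psi.conj()))  # A(0) ~ 2*alpha*Re{Psi Psi^dagger}
print(np.linalg.matrix_rank(A0))  # Re{Psi Psi^dagger} = aa^T + bb^T is real symmetric, rank 2
```

Writing Ψ = a + ib makes the structure explicit: Re{ΨΨ†} = aaᵀ + bbᵀ, a rank-2 symmetric matrix, which is why the plasticity-induced part of the connectivity acts as a low-rank perturbation of J.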