Joao Barbosa
@jbarbosa.org
3.6K followers 820 following 1.2K posts
INSERM group leader @ Neuromodulation Institute and NeuroSpin (Paris) in computational neuroscience. How and why are computations enabling cognition distributed across the brain? Expect neuroscience and ML content. jbarbosa.org
Pinned
jbarbosa.org
A good time to repost this banger from 'The Responsibility of Intellectuals' by Noam Chomsky in '67

Vance is absolutely correct: universities are - and must be - the enemy.

This is why the Right is attacking scholarship globally. They want to end the privilege Chomsky was talking about in '67.
Reposted by Joao Barbosa
j-b-eppler.bsky.social
Loved working with our amazing outreach team on this short video about representational drift! @crmatematica.bsky.social

🧠🧪

In it, I explain the points we make in our recent review in CONEUR:

doi.org/10.1016/j.co...
crmatematica.bsky.social
Memories feel fixed, but the brain never stands still.

@j-b-eppler.bsky.social talks about representational drift: how memories remain stable even as the neurons behind them constantly change.

youtu.be/z63fmYSBcB0

#Neuroscience #Mathematics #Memory #Brain #CognitiveScience #Neurobiology #Research
Why Your Brain Is Never Still: Representational Drift and Statistical Learning
jbarbosa.org
Okok I'll read it :)
jbarbosa.org
I haven't read the paper carefully, I admit, but it seems like a good example of how low-rank connectivity enables a deep understanding of networks with more than a couple of neurons.
jbarbosa.org
What do you mean? This is a great example that we can, even when the data is high-D! :)
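(The point about low-rank connectivity in this thread can be made concrete with a minimal sketch, in the spirit of low-rank RNN theory à la Mastrogiuseppe & Ostojic; the vectors `m`, `n` and the tanh rate dynamics below are illustrative assumptions, not the model from the paper under discussion. With rank-1 weights, the recurrent input to all N neurons is confined to a single direction, so the network reduces to one latent variable no matter how high-dimensional the activity is.)

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                       # many neurons: the state space is high-D
m = rng.normal(size=N)        # output direction of the connectivity
n = rng.normal(size=N)        # input-selection direction
W = np.outer(m, n) / N        # rank-1 recurrent weight matrix

# For ANY activity pattern x, the recurrent input W @ tanh(x) lies
# along m, scaled by a single latent variable kappa:
x = rng.normal(size=N)        # arbitrary high-dimensional state
recurrent = W @ np.tanh(x)
kappa = n @ np.tanh(x) / N    # the one number that matters
print(np.allclose(recurrent, kappa * m))  # True
```

So analysing the network amounts to tracking `kappa` (and its few siblings for rank > 1) rather than N coupled units, which is where the interpretability comes from.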
Reposted by Joao Barbosa
smellosopher.bsky.social
Letters of rec batch done, too.
Reposted by Joao Barbosa
engeltatiana.bsky.social
A study led by Cina Aghamohammadi is now out in ‪@natcomms.nature.com‬! We developed a mathematical framework for partitioning spiking variability, which revealed that spiking irregularity is nearly invariant for each neuron and decreases along the cortical hierarchy.
www.nature.com/articles/s41...
Reposted by Joao Barbosa
maxroser.bsky.social
What we die from vs. what we hear in the news.

Terrorism and homicides account for less than 1% of deaths, but for more than half of all media stories about death in the US — whether in the New York Times, the Washington Post, or Fox News.
Reposted by Joao Barbosa
jamesbriscoe.bsky.social
The @crick.ac.uk is recruiting Early Career Group Leaders

- Lab set-up, research costs, salaries for up to 5 researchers
- Support for up to 12 years
- Access to our core facilities
- Competitive salary
- Fantastic colleagues
- All areas of biology

Deadline 27 Nov

www.crick.ac.uk/careers-stud...
Early career group leaders
We appoint researchers from across biology and biomedicine to set up their first groups at the Crick.
jbarbosa.org
Wonderful news!
jbarbosa.org
Just go across the street and ask him :)
jbarbosa.org
I don't think that's right. In Mongillo's model you can only store in STP what is already in the connectivity/LTP, which in this particular paper was just one of two attractors.
jbarbosa.org
Just to be clear, Mongillo's model — arguably the model on which all synaptic WM models are based — is also an RNN. I don't think the debate you were alluding to is about RNN vs something else.
jbarbosa.org
Yeah, I think that's a reasonable POV. After all, the attractors have to be learned (presumably through LTM, e.g. Hebbian plasticity).

However, it's the same for synaptic WM: you still need the attractors in the connectivity for the model to work (see Mongillo et al., Science).
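(The "synaptic WM" idea being debated can be sketched minimally. What follows simulates only the presynaptic facilitation variable u from a Tsodyks-Markram-style synapse, the mechanism at the heart of Mongillo et al.'s model; the burst timing and parameter values are illustrative assumptions, not the paper's. A spike burst loads the memory into u, which then persists as a silent trace for seconds after all spiking stops.)

```python
import numpy as np

# Facilitation variable u: release probability that jumps at each
# presynaptic spike and decays back to baseline U with time constant tau_f.
U, tau_f = 0.2, 1.5            # baseline, facilitation time constant (s)
dt, T = 0.001, 4.0
t = np.arange(0, T, dt)
# ~40 Hz burst between 0.5 s and 1.0 s, silence afterwards:
spikes = (t > 0.5) & (t < 1.0) & (np.arange(t.size) % 25 == 0)

u = np.full(t.size, U)
for i in range(1, t.size):
    du = (U - u[i - 1]) / tau_f * dt      # slow decay to baseline
    if spikes[i]:
        du += U * (1 - u[i - 1])          # spike-triggered facilitation
    u[i] = u[i - 1] + du

# Half a second after the burst ends, u is still well above baseline,
# even though no spikes have occurred — an activity-silent memory trace:
print(u[int(1.5 / dt)] > 1.5 * U)   # True
```

The point in the thread stands, though: u only holds a scalar trace per synapse; *which* pattern it reactivates into is fixed by the learned connectivity.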
jbarbosa.org
sure, if it's low rank :)
Reposted by Joao Barbosa
daweibai.bsky.social
Happy to share that our BBS target article has been accepted: “Core Perception”: Re-imagining Precocious Reasoning as Sophisticated Perceiving
With Alon Hafri, @veroniqueizard.bsky.social, @chazfirestone.bsky.social & Brent Strickland
Read it here: doi.org/10.1017/S014...
A short thread [1/5]👇
jbarbosa.org
There are tricks to chunk items, but there's no debate that WM is severely limited. Long-term memory is different ofc, virtually infinite.
jbarbosa.org
I think I will miss a meeting at 17.30 Nov 6...🙈
jbarbosa.org
Not sure I follow. In networks, the # of attractors/memories scales with the # of neurons. Regarding WM, networks can do MUCH more than brains.

In fact, that's *the mystery* worth studying: why can we remember just 4-7 items? Truly mysterious IMO, especially given that WM capacity strongly correlates with IQ.
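(The scaling claim can be checked in a few lines with a classical Hopfield network, where capacity grows linearly with N, roughly 0.14N random patterns. A minimal sketch, with illustrative sizes kept well below capacity; this is the textbook model, not any specific paper from the thread.)

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10                    # neurons, stored patterns (P scales with N)
xi = rng.choice([-1, 1], size=(P, N))   # random binary memories
W = (xi.T @ xi) / N               # Hebbian outer-product learning rule
np.fill_diagonal(W, 0)            # no self-connections

# One synchronous update starting from each stored pattern:
recalled = np.sign(xi @ W)        # W is symmetric
overlaps = (recalled * xi).mean(axis=1)
print(overlaps.min())             # close to 1.0: all P memories are attractors
```

Doubling N lets you double P at the same retrieval quality, which is exactly the linear scaling the post appeals to — and which makes the brain's 4-7 item WM limit so puzzling.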