#BionicVision #Blindness #LowVision #VisionScience #CompNeuro #NeuroTech #NeuroAI
👏🙏🙌 Special thanks to MS, YH, JP for daily work behind the scenes (at the expense of their own research). The challenge would not exist without them!
Hear from top competitors and our 3 keynote speakers:
- @sinzlab.bsky.social
- @ninamiolane.bsky.social
- @crisniell.bsky.social
More info: robustforaging.github.io/workshop
#NeurIPS2025 #Neuroscience #AI
🥇 371333_HCMUS_TheFangs (ASR 0.968, MSR 0.940, Score 0.954)
🥈 417856_alluding123 (ASR 0.864, MSR 0.650, Score 0.757)
🥉 366999_pingsheng-li (ASR 0.802, MSR 0.670, Score 0.736)
Full leaderboard: robustforaging.github.io/leaderboard/
#NeurIPS2025 #Neuroscience #AI
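For anyone curious how the ranking works: each posted Score matches the simple mean of that entry's ASR and MSR (e.g. (0.968 + 0.940) / 2 = 0.954). A quick sketch, using the leaderboard numbers as-is:

```python
# Leaderboard entries as (ASR, MSR), copied from the posted results.
entries = {
    "371333_HCMUS_TheFangs": (0.968, 0.940),
    "417856_alluding123": (0.864, 0.650),
    "366999_pingsheng-li": (0.802, 0.670),
}

def composite_score(asr: float, msr: float) -> float:
    """Average the two sub-metrics into a single leaderboard score."""
    return (asr + msr) / 2

for name, (asr, msr) in entries.items():
    print(f"{name}: {composite_score(asr, msr):.3f}")
```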
📡 Record neural responses
⚡ Adapt stimulation in real time
👁️ Optimize for perceptual outcomes
This work was only possible through a tight collaboration between 3 labs across @ethz.ch, @umh.es, and @ucsantabarbara.bsky.social!
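A minimal sketch of the record → adapt → optimize loop above. Everything here (the toy "brain", the update rule, the function names) is a hypothetical stand-in for illustration, not the actual pipeline:

```python
import numpy as np

def closed_loop_step(stim_params, record, target_response, lr=0.1):
    """One iteration: stimulate, record, nudge parameters toward the target."""
    response = record(stim_params)        # 📡 measure evoked neural activity
    error = target_response - response    # compare to the desired percept's response
    return stim_params + lr * error       # ⚡ adapt stimulation in real time

rng = np.random.default_rng(0)
target = np.array([1.0, 0.5])
params = np.zeros(2)
# Toy "brain": the response is a noisy identity map of the stim parameters.
record = lambda p: p + rng.normal(0, 0.01, size=p.shape)
for _ in range(100):
    params = closed_loop_step(params, record, target)
# After repeated iterations, params converge near the target response.
```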
If you try to predict perception from stimulation parameters alone, you’re basically at chance.
But if you use neural responses, suddenly you can decode detection, brightness, and color with high accuracy.
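The contrast can be illustrated with a toy example: a percept label that is (by construction) unrelated to the stimulation parameters but separable in the neural responses. Synthetic data and a nearest-centroid decoder stand in for the real recordings and models:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 400
stim = rng.normal(size=(n, 4))              # stimulation parameters
neural = rng.normal(size=(n, 4))            # recorded neural responses
detected = (neural[:, 0] > 0).astype(int)   # percept tracks the neural state

def centroid_decoder_acc(X, y):
    """Nearest-class-centroid decoder; accuracy on the same data."""
    c0, c1 = X[y == 0].mean(0), X[y == 1].mean(0)
    pred = np.linalg.norm(X - c1, axis=1) < np.linalg.norm(X - c0, axis=1)
    return (pred.astype(int) == y).mean()

acc_stim = centroid_decoder_acc(stim, detected)      # hovers near chance (0.5)
acc_neural = centroid_decoder_acc(neural, detected)  # well above chance
```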
Yes ... but control breaks down the farther you stray from the brain’s natural manifold.
Still, our methods required lower currents and evoked more stable percepts.
1️⃣ Gradient-based optimizer (precise, but slow)
2️⃣ Inverse neural net (fast, real-time)
Both shaped neural responses far better than the conventional 1-to-1 mapping.
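Approach 1 can be sketched as gradient descent on stimulation parameters through a differentiable forward model. The linear forward model below is a toy stand-in, not the learned model from the study:

```python
import numpy as np

def optimize_stim(forward_W, target, steps=500, lr=0.05):
    """Find stim params whose predicted neural response matches the target."""
    params = np.zeros(forward_W.shape[1])
    for _ in range(steps):
        pred = forward_W @ params
        grad = forward_W.T @ (pred - target)  # gradient of 0.5*||pred - target||^2
        params -= lr * grad
    return params

W = np.array([[1.0, 0.2], [0.1, 0.8]])  # toy stim -> response mapping
target = np.array([0.6, 0.4])
params = optimize_stim(W, target)
```

The inverse-network alternative would amortize this loop into a single forward pass: a net g trained so that the forward model applied to g(target) reproduces the target, trading per-trial precision for real-time speed.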
💡 Key insight: accounting for pre-stimulus activity drastically improved predictions across sessions.
This makes the model robust to day-to-day drift.
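The insight in miniature: feed the pre-stimulus baseline into the predictor alongside the stimulation parameters. The linear model and synthetic data below are illustrative stand-ins only:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
stim = rng.normal(size=(n, 1))
baseline = rng.normal(size=(n, 1))       # pre-stimulus activity, drifts day to day
response = 2.0 * stim + 1.5 * baseline   # evoked response depends on both

def fit_r2(X, y):
    """Least-squares fit; return R^2 on the training data."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1 - resid.var() / y.var()

r2_stim_only = fit_r2(stim, response)
r2_with_baseline = fit_r2(np.hstack([stim, baseline]), response)
# Including the baseline recovers the variance that stim params alone miss.
```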
So we turned to 🧠 data: >6,000 stim-response pairs over 4 months in a blind volunteer, letting a model learn the rules from the data.