Lu-Chun Yeh
@luchunyeh.bsky.social
Marie-Curie Postdoctoral Fellow with @dkaiserlab.bsky.social at Justus Liebig University Gießen | studying visual perception and attention using fMRI, M/EEG, and computational models.

https://sites.google.com/view/lu-chun-yeh
Reposted by Lu-Chun Yeh
🚨New Preprint!
How can we model natural scene representations in visual cortex? One solution lies in active vision: predict the features of the next glimpse! arxiv.org/abs/2511.12715

+ @adriendoerig.bsky.social, @alexanderkroner.bsky.social, @carmenamme.bsky.social, @timkietzmann.bsky.social
🧵 1/14
Predicting upcoming visual features during eye movements yields scene representations aligned with human visual cortex
Scenes are complex, yet structured collections of parts, including objects and surfaces, that exhibit spatial and semantic relations to one another. An effective visual system therefore needs unified ...
arxiv.org
November 18, 2025 at 12:37 PM
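A minimal sketch of the kind of next-glimpse prediction objective the post describes, assuming a simple linear readout over toy feature vectors; the variable names, dimensions, and the linear model are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch (not the authors' code): learn to predict the feature
# vector at the next fixation from the current glimpse plus the saccade vector.
import numpy as np

rng = np.random.default_rng(0)
D_FEAT, D_SACC = 64, 2                 # glimpse feature dim, saccade (dx, dy)

# Toy stand-ins for encoder outputs at successive fixations.
curr_feats = rng.normal(size=(1000, D_FEAT))
saccades = rng.normal(size=(1000, D_SACC))
next_feats = rng.normal(size=(1000, D_FEAT))

# Linear predictor: next-glimpse features from [current features, saccade].
X = np.concatenate([curr_feats, saccades], axis=1)
W = np.zeros((D_FEAT + D_SACC, D_FEAT))
lr = 1e-3

for _ in range(200):
    pred = X @ W                               # predicted next-glimpse features
    grad = X.T @ (pred - next_feats) / len(X)  # gradient of mean squared error
    W -= lr * grad

print("prediction MSE:", np.mean((X @ W - next_feats) ** 2))
```

In the actual work the encoder and predictor would be learned networks trained on real fixation sequences; the sketch only illustrates the shape of the objective: predict upcoming visual features from the current glimpse and the planned eye movement.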
Reposted by Lu-Chun Yeh
🚨🧠🚨 POSTDOC POSITION 🚨🧠🚨 in my lab: www.ru.nl/en/working-a... Connectivity in Working Memory. Deadline 12 Nov! Apply via website; please repost.
Postdoc Position: Neural Circuitry Underlying Working Memory | Radboud University
Do you want to work as a postdoc on the neural circuitry underlying working memory at the Faculty of Social Sciences? Check our vacancy!
www.ru.nl
November 9, 2025 at 9:17 AM
Reposted by Lu-Chun Yeh
A few snapshots from this year’s Kaiser Lab retreat 🌿🧠✨
October 14, 2025 at 10:47 AM
Honored and thrilled to receive the GGN Early Career Award @jlugiessen.bsky.social 🏆 Huge thanks to my awesome supervisors Daniel @dkaiserlab.bsky.social & Marius @peelen.bsky.social , all my collaborators, and the amazing colleagues from both labs — couldn’t have done it without you! 💗
June 26, 2025 at 3:57 PM
Reposted by Lu-Chun Yeh
Don’t miss out again! If you are interested in our studies presented at @VSSMtg, you can find our posters here: 👇
drive.google.com/drive/mobile...
May 28, 2025 at 8:48 AM
Reposted by Lu-Chun Yeh
#VSS2025 was a blast - great science, fun people, beach vibes. We’ll be back! @vssmtg.bsky.social
May 22, 2025 at 3:37 PM
How fast do we extract distance cues from scenes 🛤️🏞️🏜️ and integrate them with retinal size 🧸🐻 to infer real object size? Our new EEG study in Cortex has answers! w/ @dkaiserlab.bsky.social @suryagayet.bsky.social @peelen.bsky.social
Check it out: www.sciencedirect.com/science/arti...
The neural time course of size constancy in natural scenes
Accurate real-world size perception relies on size constancy, a mechanism that integrates an object's retinal size with distance information. The neur…
www.sciencedirect.com
May 17, 2025 at 12:21 PM
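As background for the question the post raises (a textbook relation, not a result reported in the paper), size constancy under the small-angle approximation amounts to:

```latex
% Background relation, not taken from the paper: under the small-angle
% approximation, physical size S is recovered from retinal (angular) size
% \theta and viewing distance D.
S \;\approx\; \theta \cdot D
% Example: an object at twice the distance subtends half the visual angle,
% so the inferred real-world size is unchanged.
```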
Reposted by Lu-Chun Yeh
We are at @vssmtg.bsky.social 2025. 🏖️☀️🍹 Come check out our studies!
May 15, 2025 at 9:04 AM
Glad to share that our EEG study is now out in JNP. 🎉 We showed that alpha rhythms automatically interpolate occluded motion based on surrounding contextual cues! 🚶‍♂️🏞️
@dkaiserlab.bsky.social

journals.physiology.org/doi/full/10....
Cortical alpha rhythms interpolate occluded motion from natural scene context | Journal of Neurophysiology | American Physiological Society
Tracking objects as they dynamically move in and out of sight is critical for parsing our ever-changing real-world surroundings. Here, we explored how the interpolation of occluded object motion in nat...
journals.physiology.org
May 7, 2025 at 6:57 AM