Jin Ke
@jinke.bsky.social
PhD student @YalePsychology studying computational cognitive neuroscience.
Previously @UChicago and Peking University
https://jinke828.github.io
I'm also deeply grateful to those who offered insightful feedback along the way: the BLRB community at UChicago, the joint CogNeuro meeting at Yale, @esfinn.bsky.social's lab, and many others. I'd like to thank Emma Megla and Wilma Bainbridge for sharing the aphantasia data! (10/10)
August 20, 2025 at 1:53 PM
HUGE thank you to everyone whose incredible efforts over the past three years made this project possible! @tchamberlain.bsky.social @hayoungsong.bsky.social @annacorriveau.bsky.social @zz112.bsky.social, Taysha Martinez, Laura Sams, Marvin Chun, @ycleong.bsky.social @monicarosenb.bsky.social (9/10)
August 20, 2025 at 1:53 PM
Together, we found that ongoing thoughts at rest are reflected in brain dynamics, and that these network patterns predict everyday cognition and experiences.

Our work underscores the crucial role of subjective in-scanner experiences in understanding functional brain organization and behavior. (8/10)
August 20, 2025 at 1:53 PM
Neuromarkers of these thoughts further generalized to HCP data (N=908), where decoded thought patterns predicted individual differences in positive vs. negative trait measures. This suggests that links between rsFC and behavior may in part reflect differences in ongoing thoughts. (7/10)
August 20, 2025 at 1:53 PM
Moreover, the model predicting whether people are thinking in the form of images distinguished an aphantasic individual—who lacks visual imagery—from their otherwise identical twin. Data from academic.oup.com/cercor/artic.... (6/10)
August 20, 2025 at 1:53 PM
Thought models generalized beyond self-report, predicting non-introspective markers such as pupil size, linguistic sentiment of speech, and the strength of a sustained attention network (Rosenberg et al., 2016, 2020). (5/10)
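As one concrete illustration of the linguistic-sentiment marker: sentiment can be scored from transcribed speech with an off-the-shelf analyzer. Here is a minimal sketch using NLTK's VADER; the choice of tool and the example transcripts are illustrative assumptions, not necessarily what the paper used.

```python
# Sketch: score sentiment of transcribed thought descriptions.
# VADER is an illustrative choice here, not necessarily the paper's tool.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download
sia = SentimentIntensityAnalyzer()

transcripts = [  # made-up examples of verbal thought reports
    "I was thinking about a fun trip with my friends next weekend.",
    "I kept worrying about a deadline I might miss.",
]
for text in transcripts:
    score = sia.polarity_scores(text)["compound"]  # -1 (negative) to +1 (positive)
    print(f"{score:+.2f}  {text}")
```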
August 20, 2025 at 1:53 PM
How are these thoughts related to resting-state functional connectivity (rsFC) patterns? We found that similarity in ongoing thoughts tracks similarity in rsFC patterns within and across individuals, and that both thought ratings and topics could be reliably decoded from rsFC. (4/10)
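For readers curious what decoding a thought rating from rsFC can look like in practice, here is a minimal sketch: vectorized FC edges as features, ridge regression, leave-one-subject-out cross-validation. The atlas size, model, and CV scheme are all illustrative assumptions rather than the paper's exact pipeline.

```python
# Minimal sketch: predict one self-reported thought rating from rsFC edges.
# Features, model, and CV scheme are illustrative assumptions, not the exact pipeline.
import numpy as np
from scipy.stats import spearmanr
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_subjects, n_edges = 60, 35778                 # e.g., upper triangle of a 268x268 FC matrix
X = rng.standard_normal((n_subjects, n_edges))  # stand-in for vectorized rsFC
y = rng.standard_normal(n_subjects)             # stand-in for a thought rating

preds = np.zeros(n_subjects)
for train, test in LeaveOneOut().split(X):      # leave-one-subject-out CV
    preds[test] = Ridge(alpha=1.0).fit(X[train], y[train]).predict(X[test])

rho, p = spearmanr(preds, y)                    # rank correlation of predicted vs. observed
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```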
August 20, 2025 at 1:53 PM
We observed remarkable idiosyncrasy in ongoing thoughts between individuals and over time, both in self-reported ratings and in the content and topics of thoughts. (3/10)
August 20, 2025 at 1:53 PM
In our “annotated rest” task, 60 individuals rested and then verbally described and rated their ongoing thoughts after each 30-second rest period. (2/10)
August 20, 2025 at 1:53 PM
To learn more about this dataset and the neural dynamics of narrative insight, check out our recent work (preprint below) led by the amazing @hayoungsong.bsky.social, and chat with her on Saturday, 1:50–3:00 PM! Poster ID: P3-B-30.

www.biorxiv.org/content/10.1...
April 24, 2025 at 9:00 PM
Many thanks to Janice Chen, @lukejchang.bsky.social and @asieh.bsky.social for open-sourcing the movie datasets! Also wanted to give a shoutout to UChicago MRIRC for helping us collect the North by Northwest data. (9/9)
April 17, 2025 at 3:41 PM
We have made our model and analysis scripts publicly available so that other researchers can decode moment-to-moment emotional arousal in novel datasets, providing a new tool to probe affective experience with fMRI. (8/9)

github.com/jinke828/Aff...
GitHub - jinke828/AffectPrediction
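For the general shape of the approach, here is a hypothetical sketch, not the repo's actual interface (see its README for that): compute dynamic FC in sliding windows, then project each window's edges onto pretrained model weights to get a moment-to-moment arousal time course. All names and numbers below are placeholders.

```python
# Hypothetical sketch of FC-based arousal decoding -- NOT the repo's actual API.
# Recipe: sliding-window FC per time window, then a dot product with pretrained
# edge weights yields a moment-to-moment predicted arousal time course.
import numpy as np

def sliding_window_fc(ts, width):
    """Vectorized upper-triangle FC for each sliding window of a (time x node) series."""
    n_t, n_node = ts.shape
    iu = np.triu_indices(n_node, k=1)
    return np.array([np.corrcoef(ts[t:t + width].T)[iu]
                     for t in range(n_t - width + 1)])

timeseries = np.random.randn(300, 100)         # stand-in for parcellated fMRI (TR x node)
dfc = sliding_window_fc(timeseries, width=30)  # (windows x edges)
weights = np.random.randn(dfc.shape[1])        # stand-in for pretrained model edge weights
arousal = dfc @ weights                        # predicted arousal time course
```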
April 17, 2025 at 3:41 PM
In conclusion, our findings reveal a generalizable representation of emotional arousal embedded in patterns of dynamic functional connectivity, suggesting a common underlying neural signature of emotional arousal across individuals and situational contexts. (7/9)
April 17, 2025 at 3:41 PM
In contrast, using the same computational modeling approach, we were unable to find a generalizable neural representation of valence in functional connectivity. Null results are inherently difficult to interpret, but we discuss several possible explanations in the paper. (6/9)
April 17, 2025 at 3:41 PM
The network also generalized to two additional, novel movies, where model-predicted arousal time courses tracked each movie's plot. This offers researchers a way to obtain continuous measures of arousal without collecting additional human ratings. (5/9)
April 17, 2025 at 3:41 PM
This generalizable arousal network is encoded in interactions among multiple large-scale functional networks, including the default mode, dorsal attention, ventral attention, and frontoparietal networks. (4/9)
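One common way to unpack such network-level interactions is to average a model's edge weights within each pair of canonical networks. A minimal sketch follows; the node-to-network labels and weights are random placeholders.

```python
# Sketch: summarize edge-level model weights by canonical network pair.
# Node labels and weights are random placeholders for illustration.
import numpy as np

rng = np.random.default_rng(0)
n_node = 100
networks = ["DMN", "DAN", "VAN", "FPN"]
labels = rng.choice(networks, size=n_node)         # node -> network assignment
W = rng.standard_normal((n_node, n_node))
W = (W + W.T) / 2                                  # symmetric edge-weight matrix

for i, a in enumerate(networks):
    for b in networks[i:]:
        mask = np.outer(labels == a, labels == b).astype(bool)
        if a == b:
            mask = np.triu(mask, k=1)              # within-network: unique edges only
        print(f"{a}-{b}: mean weight {W[mask].mean():+.3f}")
```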
April 17, 2025 at 3:41 PM
We observed robust out-of-sample generalizability of the arousal models across movie datasets that were distinct in low-level features, characters, narratives, and genre, suggesting a situation-general neural representation of arousal. (3/9)
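For context, out-of-sample generalizability across datasets is typically assessed by training on all but one movie dataset and testing on the held-out one. A schematic sketch, with dataset names, features, and model all as placeholders:

```python
# Schematic leave-one-dataset-out evaluation; datasets, features, and model are placeholders.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
datasets = {name: (rng.standard_normal((200, 500)),  # stand-in FC features (time x edge)
                   rng.standard_normal(200))         # stand-in continuous arousal ratings
            for name in ["movie_a", "movie_b", "movie_c"]}

for held_out in datasets:
    X_tr = np.vstack([X for name, (X, y) in datasets.items() if name != held_out])
    y_tr = np.hstack([y for name, (X, y) in datasets.items() if name != held_out])
    X_te, y_te = datasets[held_out]
    r, _ = pearsonr(Ridge(alpha=1.0).fit(X_tr, y_tr).predict(X_te), y_te)
    print(f"train on the other movies, test on {held_out}: r = {r:.2f}")
```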
April 17, 2025 at 3:41 PM