Han Zhang
hanzhang.bsky.social
Assistant Research Scientist at the University of Michigan Transportation Research Institute. Attention, Visual Perception, Driver Behavior. https://hanzhang.cv/
trained my first CNN today. am I an AI researcher now? 😂
November 27, 2025 at 12:34 AM
Including driving variables improved out-of-sample prediction performance over a demographic-only model, most notably for physical health, life satisfaction, and social role constraints.
October 31, 2025 at 2:22 PM
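The comparison described above (demographic-only vs. demographic-plus-driving prediction, evaluated out of sample) can be sketched as follows. This is a minimal illustration on synthetic data with made-up variable names and effect sizes, not the study's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2658  # sample size matching the post; all values below are synthetic

# Synthetic predictors: one demographic and one driving variable.
age = rng.normal(70, 5, n)
minutes_per_trip = rng.normal(20, 6, n)

# Synthetic outcome influenced by both predictors.
physical_health = 100 - 0.5 * age - 0.8 * minutes_per_trip + rng.normal(0, 5, n)

# Hold out the last ~25% of rows for out-of-sample evaluation.
idx = np.arange(n)
train, test = idx < 2000, idx >= 2000

def oos_r2(X):
    """Fit OLS on the training rows, return R^2 on the held-out rows."""
    Xb = np.column_stack([np.ones(n), X])
    beta, *_ = np.linalg.lstsq(Xb[train], physical_health[train], rcond=None)
    resid = physical_health[test] - Xb[test] @ beta
    ss_tot = np.sum((physical_health[test] - physical_health[test].mean()) ** 2)
    return 1 - np.sum(resid ** 2) / ss_tot

r2_demo = oos_r2(np.column_stack([age]))
r2_full = oos_r2(np.column_stack([age, minutes_per_trip]))
print(f"demographic-only R^2: {r2_demo:.3f}")
print(f"+ driving variable R^2: {r2_full:.3f}")
```

Because the synthetic outcome genuinely depends on the driving variable, the augmented model scores a higher held-out R^2, mirroring the direction of the reported result.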
Some driving behaviors were highly specific in their predictions. E.g., longer trips (more minutes per trip) predicted physical decline more strongly than anything else.
October 31, 2025 at 2:22 PM
We recorded everyday driving patterns from 2,658 older adults across the US. Participants also completed self-report measures of general well-being and health across cognitive, physical, social, and mental domains. We found widespread associations between driving behaviors and health:
October 31, 2025 at 2:22 PM
Personal news: Today is my first day as an Assistant Research Scientist at the University of Michigan Transportation Research Institute. Excited to apply my expertise in attention and visual perception to an important real-world issue -- driving safety! #GoBlue
June 16, 2025 at 3:05 PM
Why do abrupt visual onsets capture our attention? A common assumption is that they are highly salient. Here, we show that this assumption is flawed. Higher physical salience made a distractor easier to ignore, whereas a less salient onset distractor captured attention. doi.org/10.1037/xhp0...
April 17, 2025 at 3:06 PM
An AOI Drawer to flexibly define areas of interest (AOIs) (6/8)
March 11, 2025 at 5:35 PM
A Fixation Viewer to visualize fixation patterns (5/8)
March 11, 2025 at 5:35 PM
In addition to its very intuitive API, PupEyes offers interactive tools such as a Pupil Viewer to inspect pupil traces (4/8)
March 11, 2025 at 5:35 PM
Doing science is hard these days, but here’s our new review on attentional capture by abrupt onsets, published in @jephpp.bsky.social ! We hope to offer some food for thought for newcomers and experienced researchers alike. Hope you enjoy it! psycnet.apa.org/record/2025-...
March 4, 2025 at 4:00 PM
A computational model explains these data as two priority signals with different latencies and impacts on behavior. A distractor signal can be suppressed when the target signal is ready or, alternatively, when the distractor signal itself exerts less direct impact on behavior.
October 4, 2024 at 7:31 PM
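The two-signal account in the post above can be sketched numerically: a fast-but-weaker distractor signal and a slower-but-stronger target signal each rise with their own latency, and a forced saccade goes to whichever is higher at the moment of execution. The latencies, gains, and time constant below are illustrative values I made up, not fits from the paper:

```python
import numpy as np

def priority(t, latency, gain, tau=50.0):
    """Priority signal: zero before its latency, then a saturating rise."""
    return np.where(t > latency, gain * (1 - np.exp(-(t - latency) / tau)), 0.0)

t = np.linspace(0, 400, 401)  # stimulus processing time in ms

distractor = priority(t, latency=60, gain=0.8)   # fast onset signal
target = priority(t, latency=120, gain=1.0)      # slower goal-driven signal

# A forced saccade at each processing time goes to the higher-priority item.
goes_to_distractor = distractor > target

# Early forced responses are captured by the distractor; once the target
# signal is ready, it dominates and the distractor is effectively suppressed.
early_capture = bool(goes_to_distractor[t == 80][0])
late_capture = bool(goes_to_distractor[t == 300][0])
print(early_capture, late_capture)
```

Lowering the distractor's `gain` (less direct impact on behavior) shrinks the early-capture window, which is the model's second route to suppression mentioned in the post.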
This allowed us to track the full spatiotemporal dynamics of visual attention with respect to specific stimuli.
October 4, 2024 at 7:31 PM
To address this, we applied a "forced-response" method to saccadic behavior. This method forces saccade execution while systematically varying stimulus processing time, essentially forcing subjects to express their current attentional priority.
October 4, 2024 at 7:31 PM
Visual attention is often thought to be guided by a latent "priority map". But this priority map evolves over time: the deployment of attention (e.g., an eye movement) reflects only a single snapshot of that evolution across time and space.
October 4, 2024 at 7:30 PM