Chris Chapman
cchapman.bsky.social
UX researcher, psychologist. Author "Quantitative User Experience Research" (w/Rodden), "R | Python for Marketing Research and Analytics" (w/Feit & Schwarz). Previously 24 yrs @ Google, Amazon, Microsoft. Personal account.

Blog at https://quantuxblog.com
Reposted by Chris Chapman
It’s easy to criticize car co’s for programming vehicles to break traffic and parking laws. But human drivers do it constantly.

Two big differences:
🔹 Humans make context-specific decisions (e.g., is it an emergency?). Software doesn’t.
🔹 Humans are liable for their actions. Car co's often aren’t.
Who Gets a Ticket When a Waymo Does Wrong and Nobody Is in the Robotaxi to Cite?
A Waymo robotaxi pulled an illegal maneuver in front of the police, but with no one inside, who gets the ticket?
www.motortrend.com
December 19, 2025 at 1:22 PM
Will add: (3) people are also largely able to understand & mitigate other people's violations because those violations fall into a shared framework.

Those frameworks are often regional, like "speeding +10 is OK" or "everyone here double parks". Algorithms do not share in those predictable frameworks.
December 19, 2025 at 3:18 PM
Colombia, Guyana, and Brazil must be very surprised
December 17, 2025 at 12:27 AM
TBF they clarified that it is only a "subset" of users who have catastrophic problems.

Which then makes me wonder how much they know about sets.
December 13, 2025 at 12:15 AM
Reposted by Chris Chapman
In Japan, South Korea, Taiwan, Hong Kong, Singapore… masking is boring. Etiquette. Courtesy.
A cough? Pollution day? Crowded underground? Mask goes on. End of story.

No identity crisis. No tribal signalling. Just collective care. /2
(Family photo from SE Asia - man in mask=NBD)
December 11, 2025 at 9:56 AM
Reposted by Chris Chapman
This may sound strange, but many students either consciously or subconsciously do not see school (including college) as an opportunity to "learn." The transactional nature of the system is totalizing. Even students who don't want to outsource their learning to LLMs feel like maybe they should...
December 11, 2025 at 1:51 PM
Reposted by Chris Chapman
And so: "Our support for innovation and America’s leadership in AI does not extend to using our residents, especially children, as guinea pigs while AI companies experiment with new applications. Nor is our support for innovation an excuse for noncompliance with our laws." Boom.
December 11, 2025 at 1:09 PM