Katelyn Mei
@katelynxmei.bsky.social
PhD student in Information Science @uwischool.bsky.social | Human-AI interaction & Cognitive Psych | Middlebury College ‘22

katelynmei.com
Reposted by Katelyn Mei
🚀 We turned LabintheWild into a book! 📖
Digital Culture Shock = global insights + LabintheWild tests so you can explore your cultural background.

How does culture shape tech—cars, apps, websites, ChatGPT?

👉 Can’t wait to hear what you think!
Digital Culture Shock
How culture shapes the design and use of technology—and how we can resist the one-size-fits-all approach to technology design
press.princeton.edu
September 5, 2025 at 11:11 PM
Reposted by Katelyn Mei
What happens when a robotaxi from California tries to drive in Cairo? Is that website colorful or chaotic? And when did chatbots get so rude? In a new book, #UW prof @katharinareinecke.bsky.social explores how culture shapes tech use and design. press.princeton.edu/books/hardco... #HCI #AI #BookSky
Digital Culture Shock
How culture shapes the design and use of technology—and how we can resist the one-size-fits-all approach to technology design
press.princeton.edu
July 22, 2025 at 6:41 PM
Reposted by Katelyn Mei
🎙️ NEW WORK! @allisonkoe.bsky.social, Hilke Schellmann, Anna Seo Gyeong Choi, Katelyn Mei and I just released our stakeholder-grounded #AI #audit of speech-to-text transcription systems, examining how well they work for people with #aphasia. More: arxiv.org/abs/2506.08846
Addressing Pitfalls in Auditing Practices of Automatic Speech Recognition Technologies: A Case Study of People with Aphasia
Automatic Speech Recognition (ASR) has transformed daily tasks from video transcription to workplace hiring. ASR systems' growing use warrants robust and standardized auditing approaches to ensure aut...
arxiv.org
June 12, 2025 at 12:43 PM
🚀 What is hallucination in AI systems? @annaseogyeongchoi.bsky.social and I recently explored this topic in depth, and our article has just been published in The Conversation! 🎉 theconversation.com/what-are-ai-...
What are AI hallucinations? Why AIs sometimes make things up
When AI systems try to bridge gaps in their training data, the results can be wildly off the mark: fabrications and non sequiturs researchers call hallucinations.
theconversation.com
March 24, 2025 at 3:30 PM