Yutong Zhang
@yutongzhang.bsky.social
CS master's student at Stanford | previously undergrad at UIUC
Huge thanks to my amazing coauthors (@dorazhao.bsky.social, Jeffrey T. Hancock, Robert Kraut, @diyiyang.bsky.social); I couldn't have done this without you. Grateful to learn from and work with you all :)
June 18, 2025 at 4:33 PM
We need to ask how to weave AI into our social worlds in ways that augment (rather than erode) human connection. For more details, check out our preprint here: arxiv.org/abs/2506.12605
The Rise of AI Companions: How Human-Chatbot Relationships Influence Well-Being
As large language model (LLM)-enhanced chatbots grow increasingly expressive and socially responsive, many users are beginning to form companionship-like bonds with them, particularly with simulated...
June 18, 2025 at 4:27 PM
⚠️ Short-term comfort from a chatbot may cost long-term social health. Users report emotional dependence, withdrawal, and distorted expectations.

The illusion of care comes with real risks. Future designs must ask: How do we prevent over-attachment? How do we build AI chatbots that support healthier relationships?
5️⃣ Chatbot effects depend on your social environment – people with fewer real-life connections get less out of AI companionship, which doesn't make up for missing human support 🫂.
3️⃣ It’s all about how chatbots are used: general interaction links to greater well-being 📈, but seeking companionship is tied to worse outcomes 📉.
4️⃣ More frequent interactions, deeper emotional connections, and more disclosure with AI companions are linked to lower well-being 📉.
1️⃣ People with less human support, like single users, minority groups, and those with smaller social networks, are more likely to seek companionship from chatbots 💬.
2️⃣ The longer people chat, the more intense the relationship with AI 🔁.
People don't always say they use chatbots for companionship, but companionship is the most common use in practice across all three data sources. Users view chatbots as friends or partners and turn to them to discuss emotional and intimate topics.
📊 We surveyed 1,000+ Character.AI users and analyzed 4,363 chat sessions to understand how people really talk to AI. We combined three data sources to reveal how people connect with AI companions and how it impacts their well-being.
This raises an urgent question: Can these “artificial” bonds truly meet human needs, or are we creating new vulnerabilities?