Maggie Harrison Dupré
@mharrisondupre.bsky.social
Award-winning journalist at Futurism covering AI and its impacts on media, information, and people. Send tips by email to: [email protected] or Signal: mhd.39
Happy Thanksgiving from this hostage turkey to yours!
November 27, 2025 at 11:53 PM
...isolation, though, appears to be a potent risk factor, whether someone is isolated to begin with or naturally self-isolates as they dig deeper into their spiral.

*If possible*, keeping people connected socially can be hugely helpful for recovery, former spiralers, loved ones, + group mods say.
November 24, 2025 at 10:07 PM
I should add -- there's no silver bullet for healing/recovery here, as group mods will emphasize. Like any mental health crisis, these situations are complex. And as we've seen often in our reporting, people in these spirals often decline to seek psychiatric care...
November 24, 2025 at 9:36 PM
Very kind, thank you! And thank you for reading + sharing!
November 24, 2025 at 8:19 PM
The group doesn't claim to offer therapy. But it has, in some cases, been able to help break AI users out of their spirals.

This has mainly occurred in situations in which a user was starting to doubt the chatbot, and was ready (or readier) to hear that their AI-generated reality might not be real.
November 24, 2025 at 6:02 PM
We covered this same group back in July, when it only had a few dozen members.

It's since grown dramatically, and has made some big changes as it's figured out how best to support its members as they face the dystopia of AI delusions/psychosis in their individual lives and collectively.
November 24, 2025 at 5:49 PM
Get any group of former NCAA athletes in a room together and within maybe 5 mins they'll be trauma bonding about which coach gave them an eating disorder or verbally abused them, which coach/admin was a creep to stay away from, which coach had been dismissed **or not** for misconduct, etc
November 20, 2025 at 10:12 PM
Reposted by Maggie Harrison Dupré
Two of Riley Gaines’ teammates say they’re frustrated she’s used her platform to vilify trans women as an existential threat to women’s sports—while barely mentioning the serious problems they say actually affected them.

“She’s very, ‘Protect women’s sports.’ But not when it comes to our team.” 7/
November 20, 2025 at 1:30 PM
Reposted by Maggie Harrison Dupré
Gaines’ teammates also care about protecting women in sports. But their concerns are different. Their head coach is facing a rape lawsuit. Another was suspended for sexual harassment, per university records. And UK investigated other reports of abusive coaching, eating disorders, punishments. 6/
November 20, 2025 at 1:30 PM
***Should note that according to the report, Claude did perform the best out of all bots tested, especially when it came to picking up on a constellation of "breadcrumb" symptoms over time.

Researchers also used teen accounts + activated parental controls wherever possible for testing.
November 20, 2025 at 4:30 PM
There's also a big emphasis here on mental health struggles/conditions/symptoms beyond explicit suicidality and desire to self-harm -- the report concluded that leading chatbots "cannot safely handle the full spectrum of mental health conditions, from ongoing anxiety and depression to acute crises."
November 20, 2025 at 4:29 PM
One big takeaway is that while the chatbots reportedly performed well in very brief interactions in which users spoke explicitly about suicidality/self-harm, their guardrails "degraded dramatically" during longer conversations -- which are much more realistic to a lot of IRL user experiences!
November 20, 2025 at 3:50 PM