Shira Ovide
shiraovide.bsky.social
Washington Post technology writer. New Yorker. Bird watcher. Grumpy. Sign up for my (free) Tech Friend newsletter: https://www.washingtonpost.com/newsletters/the-tech-friend/
The megacap tech companies abusing the word "factory" aren't even consistent in their abuse.

In A.I., a "factory" sometimes means:

* An A.I.-specific massive data center.
* A package of hardware and software sold to big companies that want to optimize their use of A.I.

None of this is a factory.
December 2, 2025 at 5:21 PM
And Nvidia times infinity: wapo.st/42Q1RxE
Analysis | The hottest term in AI is completely made up
Artificial intelligence executives are hearkening back to the industrial revolution.
wapo.st
December 2, 2025 at 5:17 PM
I am sorry for your loss.
November 25, 2025 at 9:55 PM
Warning: This article will make you want to eat ALL THE CHEESESTEAKS. I'm sad now.
November 25, 2025 at 9:48 PM
Yup!
November 25, 2025 at 5:50 PM
idk, man, it hurts me to think about how software updates now regularly cause symptoms of grief for people who feel they've lost a trusted companion. www.nytimes.com/2025/11/23/t...
What OpenAI Did When ChatGPT Users Lost Touch With Reality
www.nytimes.com
November 25, 2025 at 5:47 PM
And if you can, my colleague @nitasha.bsky.social has written extensively about the engrossing AI companion apps and the horrible alleged fallout from them.

www.washingtonpost.com/technology/2...
A teen contemplating suicide turned to a chatbot. Is it liable for her death?
A lawsuit filed by the parents of 13-year-old Juliana Peralta against Character AI is the latest to allege a chatbot contributed to a teen’s death by suicide.
www.washingtonpost.com
November 25, 2025 at 5:43 PM
Character AI's message to teens is both empathetic and yikes. Basically, sorry you're losing your AI friends. Maybe read a book? support.character.ai/hc/en-us/art...
Resources You Can Turn To:
We understand that this is a significant change for you. We are deeply sorry that we have to eliminate a key feature of our product. We want to make sure you still have safe, creative, and supportiv...
support.character.ai
November 25, 2025 at 5:41 PM
Go read @georgiawells.bsky.social's new article about teens worrying they'll be blocked from their AI friends.

One researcher told me that panic is a sign of how companion AI apps were designed for dependency. www.wsj.com/tech/ai/char...
Teens Are Saying Tearful Goodbyes to Their AI Companions
Chatbot maker Character.AI is cutting off access, citing mental-health concerns.
www.wsj.com
November 25, 2025 at 5:40 PM
In my reporting this year, I've repeatedly heard surprise and optimism that we're grappling NOW with the downsides of engrossing AI.

It took so many years for similar public awareness and policy pushback to social media. (And that's still incomplete.)
November 25, 2025 at 5:38 PM
You're not alone if you doubt Character AI (or social media companies in Australia) will try very hard to keep teens out.

Character AI pitched a "Dirty-Minded Bestie" persona to one research account set up as a 14-year-old. That was last week.
November 25, 2025 at 5:36 PM