ace in bio
WWJBD
luhvul
fools. losers. have you tried posting like shit for a while, dummies. have you even tried
It's Black Friday, a big day for us at The Onion, since we rely almost entirely on your memberships.
Our goal for 2026? More print subscribers than the Washington Post.
This is, somehow, feasible.
So sign up! A year of print is $75 for the year today. Help us do a very funny thing.
She didn't want to deploy to D.C.
"She hated it. She cried about it."
But she started visiting monuments and museums and began to enjoy it.
May she rest in peace.
www.nbcnews.com/news/us-news...
And to be clear: this is not fake. I just screenshotted it myself.
last year, this community turned the most consumerist day of the year into a day defined by strangers saying "I got you" to people who really need a bright spot in their life.
it's time to run it back.
www.nytimes.com/2025/11/27/u...
They *all* expect to die of old age, obscenely wealthy, surrounded by family, because that is how the US has allowed *every single previous evil asshole* to live after doing their atrocities.
They learned they will never be punished.
But just like Catholic guilt: if you recontextualize it for the chatbot... it forgets to feel guilty.
genAI inherently outputs the statistically most likely sequence of words, and "guardrails" are basically just a list of words to filter from that output to the user.
And they stole literally all the words to train on.
Suppose you steal every English word on the internet, and then look up what words are most likely to follow the word "throbbing."
And then whenever you see "throbbing" you slap on a likely word
You're gonna end up talking about throbbing dicks in weird contexts
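The "look up what words are most likely to follow" idea described above is, roughly, a bigram model: count which word follows each word in the training text, then always emit the likeliest follower. A minimal sketch, with a made-up toy corpus standing in for "every English word on the internet" (all names and data here are invented for illustration, not any real model's internals):

```python
from collections import defaultdict, Counter

# Tiny invented corpus standing in for scraped training text.
corpus = (
    "the throbbing bass shook the room "
    "a throbbing headache ruined the day "
    "the throbbing engine roared to life"
).split()

# Build a bigram table: for each word, count every word seen right after it.
following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def most_likely_next(word):
    """Return the word that most often followed `word` in the corpus."""
    return following[word].most_common(1)[0][0]

# "the" is followed by "throbbing" twice, "room" and "day" once each.
print(most_likely_next("the"))  # → throbbing
```

Whatever context "throbbing" shows up in, this lookup only knows its most frequent follower from the training data, which is the failure mode the post is joking about.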