"I don't want to talk about any of the politics of it, but the thought of leaving your team in the middle of a race for any reason other than a family emergency, really strikes me as weird."
"I don't want to talk about any of the politics of it, but the thought of leaving your team in the middle of a race for any reason other than a family emergency, really strikes me as weird."
This is not a joke. This is not normal.
Donald Trump isn't a strongman, he's a scared man. Illinois won’t be intimidated by a wannabe dictator.
www.washingtonpost.com/health/2025/...
Adding edtech doesn't necessarily save teachers time. A recent study found that the learning management systems (LMSs) sold to schools for more than a decade as time-savers aren't delivering on making teaching easier.
In a time of increasing overreach and hateful rhetoric, it's more important than ever to reaffirm our commitment to the rights and dignity of the LGBTQ+ community.
You have a home here always.
It's 'inclusivity', but only for characteristics people who vote for Trump care about.
www.acs.org/about/inclus...
cdm16274.contentdm.oclc.org/digital/coll...
Just takes a smartphone, curiosity to experiment, and a mindset to learn.
1. What DOGE and the Trump administration did to the American people today
2. What Democrats are doing in response
I knew LLMs hallucinate as a feature, not a bug, but this was the first time I really saw the model's own internal logic for a hallucination.
It wasn't trying to answer my question with true, attributable arguments. It was trying to make it *appear* like it was.
“I think they're being short-sighted, because fear does that to you.”
Silence gives consent. And no one should consent to this.