(Like even if I already follow you, to help others find you too.)
Excited to share 𝙋𝙪𝙛𝙛𝙚𝙧𝘿𝙧𝙞𝙫𝙚 2.0: A fast, friendly driving simulator with RL training via PufferLib at 𝟯𝟬𝟬𝗞 𝘀𝘁𝗲𝗽𝘀/𝘀𝗲𝗰 🐡 + 🚗
youtu.be/LfQ324R-cbE?...
Person B: Did you honestly say <claim>? You idiot. You moron.
Person A: What is incorrect about <claim>?
Person B: I can't tell you, for Secret Reasons.
***
I don't understand why Person B thinks this is effective rhetoric.
- <claim> is upheld by people I don't like, which is very strong evidence it's false.
- <claim> undermines something I think is important or implies a trade-off.
www.nytimes.com/2025/12/16/o...
Gifting the article here - highly encouraging everyone to donate to them, they are an amazing organization staffed by amazing people.
At this point, you can only blame yourself if you fell for this grift.
Praying for his full & speedy recovery.
And so deeply inspired by his example.
www.newyorker.com/magazine/202...
www.courtlistener.com/docket/67087...
Our latest work shows that pretraining ViTs on procedural symbolic data (e.g. sequences of balanced parentheses) makes subsequent standard training (e.g. on ImageNet) more data efficient! How is this possible?! ⬇️🧵
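The post doesn't spell out how the procedural symbolic data is generated, but a "sequence of balanced parentheses" corpus is cheap to produce. Here is a minimal sketch, assuming uniform-ish random sampling with a simple legality check at each position (the paper's actual generator may differ):

```python
import random

def balanced_parens(length, rng=random):
    """Generate one random balanced-parentheses string of even `length`.

    Invariant: `depth` (unclosed opens) never exceeds the slots remaining,
    so the string always closes cleanly by the end.
    """
    assert length % 2 == 0, "balanced strings have even length"
    out, depth = [], 0
    for i in range(length):
        remaining = length - i
        must_close = depth == remaining      # every remaining slot is needed to close
        can_open = depth <= remaining - 2    # room left to close a new open paren
        if must_close:
            ch = ")"
        elif depth == 0:
            ch = "("                          # nothing to close, must open
        elif can_open and rng.random() < 0.5:
            ch = "("                          # both moves legal: fair coin
        else:
            ch = ")"
        out.append(ch)
        depth += 1 if ch == "(" else -1
    return "".join(out)

def is_balanced(s):
    """Check that parens match and never close below depth zero."""
    depth = 0
    for ch in s:
        depth += 1 if ch == "(" else -1
        if depth < 0:
            return False
    return depth == 0
```

A corpus for pretraining would then just be many such strings, e.g. `[balanced_parens(64) for _ in range(100_000)]`, tokenized however the downstream model expects.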
1. Hubris in trying to compete everywhere.
2. Spreading too thin without the talent.
3. Failing to decouple new efforts from core.
4. Ignoring Wall Street investors.
Mistakes a lot of tech companies still make today
www.bloomberg.com/news/article...