All posts public domain under CC0 1.0.
1. LLMs produce meaningful language
2. LLMs are not sentient at all
The problem before us is to understand how both are possible.
Opponents, by contrast, largely talk about the world we live in today and its constraints.
youtu.be/b2F-DItXtZs
via The Economist
We found embeddings like RoPE aid training but bottleneck long-sequence generalization. Our solution’s simple: treat them as a temporary training scaffold, not a permanent necessity.
arxiv.org/abs/2512.12167
pub.sakana.ai/DroPE
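The "scaffold" framing can be sketched in code: apply rotary position embeddings (RoPE) to queries and keys during training, then compute attention logits with the rotations switched off once they are no longer needed. This is a minimal illustrative sketch, not the paper's implementation; the function names and the `use_rope` toggle are assumptions for demonstration.

```python
import math

def rope_rotate(vec, pos, base=10000.0):
    """Rotate consecutive dimension pairs of `vec` by position-dependent
    angles (standard RoPE). Rotation preserves each pair's norm, so only
    the relative angle between query and key positions affects dot products."""
    d = len(vec)
    out = []
    for i in range(0, d, 2):
        theta = pos * base ** (-i / d)
        c, s = math.cos(theta), math.sin(theta)
        x1, x2 = vec[i], vec[i + 1]
        out += [x1 * c - x2 * s, x1 * s + x2 * c]
    return out

def attention_logits(q_seq, k_seq, use_rope=True):
    """Scaled dot-product attention logits. With use_rope=False the
    positional 'scaffold' is removed and attention is position-free."""
    if use_rope:
        q_seq = [rope_rotate(q, p) for p, q in enumerate(q_seq)]
        k_seq = [rope_rotate(k, p) for p, k in enumerate(k_seq)]
    d = len(q_seq[0])
    return [[sum(a * b for a, b in zip(q, k)) / math.sqrt(d)
             for k in k_seq] for q in q_seq]

q = [[1.0, 0.0, 0.5, 0.2], [0.3, 0.7, 0.1, 0.9]]
k = [[0.2, 0.4, 0.6, 0.8], [0.9, 0.1, 0.5, 0.3]]
with_pos = attention_logits(q, k, use_rope=True)     # training-time behavior
without_pos = attention_logits(q, k, use_rope=False)  # scaffold dropped
```

Because RoPE is a pure rotation, dropping it at inference changes only the positional phase of the logits, not the magnitude of the underlying vectors.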
1. CRON is inefficient
2. RLMs (Recursive Language Models) are extraordinarily powerful
3. Every recursive algorithm can be implemented with a queue
4. I gave the agent a queue
alexzhang13.github.io/blog/2025/rlm/
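The recursion-to-queue step above can be sketched concretely: a recursive traversal becomes a loop that pulls independent subtasks off an explicit work queue, which is how an agent can be handed recursive work without nested calls. A minimal sketch with a toy tree; the function names are illustrative and not taken from the linked post.

```python
from collections import deque

def sum_tree_recursive(node):
    """Recursive form: a node is (value, [children])."""
    value, children = node
    return value + sum(sum_tree_recursive(c) for c in children)

def sum_tree_queue(root):
    """Same computation with recursion replaced by an explicit work queue.

    Each queue entry is an independent subtask: the agent pops one,
    does the local work, and enqueues the sub-calls instead of
    descending into them."""
    total = 0
    queue = deque([root])
    while queue:
        value, children = queue.popleft()
        total += value
        queue.extend(children)
    return total

tree = (1, [(2, [(4, []), (5, [])]), (3, [])])
assert sum_tree_recursive(tree) == sum_tree_queue(tree) == 15
```

Using a queue (FIFO) rather than a stack only changes the order subtasks are visited; when the sub-results combine associatively, as here, the answer is identical.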