Nicholas Guttenberg
ngutten.bsky.social
So I think the roads are roughly: either find a group of people to all validate each other with as individuals not as important cogs in a great machine, or overcome that cultural programming and find stable sources of internal meaning.
November 30, 2025 at 8:20 PM
Now I just get annoyed when I do something neat for myself and family members ask e.g. 'Is that novel? Could you sell it?' (even if it'd make 80% less per time than what I'm already paid to do). But it puts into perspective how deep 'you should seek external validation' is, culturally.
November 30, 2025 at 8:18 PM
It helps that pre-modern-AI I already got annoyed enough about e.g. worrying if someone else somewhere had a similar idea as me or scooped me on publication that I stopped worrying implicitly about my utility to the world and started thinking about what I wanted to do or be for myself.
November 30, 2025 at 8:16 PM
The classic one is 'I'll join and reform the system from within!', massively underestimating the inertia of said system or the vested interests in it. Another classic one is 'someone has to make the hard decisions' where it's driven more by ego than actual practicality.
November 29, 2025 at 5:40 PM
So one thing you could use is how a decision at a particular point in time that seemed to make sense or be forced can lead to someone getting stuck to and dragged down by a bunch of lame, cruel, childish morons. The protagonist is the one who finds the third way, while the villain failed to spot it.
November 29, 2025 at 5:38 PM
Which I bring up here because cognition in the presence of communication, coordination, etc with others (including other cognitive architectures) could have some of that kind of weird blind spot.
November 28, 2025 at 9:55 PM
This matters a lot in prebiotic stuff like autocatalytic cycles, because e.g. a catalyst that helps its own replication by x% but helps the replication of a competing cycle by y% with y>x will end up in a kind of selective blind-spot since it drives itself extinct via competition.
November 28, 2025 at 9:54 PM
Well, at least for stuff where the direction of 'makes the organism replicate more successfully' is aligned with 'makes the particular feature continue to occur' - once you get stuff that depends on breaching the compartment of a single individual, the evolutionary dynamics can get funny.
November 28, 2025 at 9:54 PM
Yeah, a better temperature scaling and something that doesn't have the sqrt(L) sublinearity of diffusion would give a lot more room... The 'assume the computational unit is a homogeneous cube' thing I did was pretty harsh actually.
November 28, 2025 at 8:08 PM
I do think it's kind of elegant if fundamentally physically embedded computation is just down to the rate you can get heat out versus the rate you can get information in.
November 28, 2025 at 7:44 PM
So in principle that would give hard bounds no matter how you're doing the (irreversible) computation. But the Landauer bound is much smaller than our actual costs per bit erased right now, so it's resting on a precarious assumption. That said, almost everything depends on T...
November 28, 2025 at 7:41 PM
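For reference, the Landauer bound is k_B·T·ln 2 per bit erased; a quick sketch of its temperature scaling (the ~1e-17 J figure for present-day switching energy below is a rough order-of-magnitude placeholder, not a measured value):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in SI since 2019)

def landauer_bound(T):
    """Minimum energy (J) to erase one bit at temperature T (K)."""
    return K_B * T * math.log(2)

for T in (77.0, 300.0, 400.0):
    print(f"T = {T:5.1f} K -> {landauer_bound(T):.3e} J per bit erased")

# Rough placeholder for present-day per-op switching energy, only to
# show how far above the bound current hardware sits (order of
# magnitude, not a measurement):
actual_per_op = 1e-17
print(f"roughly {actual_per_op / landauer_bound(300.0):.0f}x above the 300 K bound")
```

The bound scales linearly with T, so the same count of irreversible ops costs proportionally more energy the hotter you run, as the post notes.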
So the vague direction I was looking was to take seriously that the Landauer bound has a temperature scaling and assuming that despite being way over it, maybe other issues would still scale similarly. So the more power you generate at fixed cooling budget, the more expensive each op is.
November 28, 2025 at 7:41 PM
It'd be really funny if the absolute fastest computer we could possibly build (given no space constraints) would actually require transistors 10 cm on a side or something like that.
November 28, 2025 at 6:50 PM
Hm, following this rabbit hole, I wonder if there's a sweet spot length scale corresponding to a maximum realizable dependent-computation (e.g. you need bit 1 before you can compute bit 2) clock speed for a given material due to thermal transport and signal transport limits scaling in opposite ways.
November 28, 2025 at 6:47 PM
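A toy version of that sweet-spot argument (every number and both scaling laws here are assumptions for illustration): take the dependent-op clock as signal-limited at f = v/L, and conduction-limited at f = k·ΔT·L/E, since a cube of side L can shed heat at roughly Q ~ k·L·ΔT while dissipating E per cycle. The two limits scale opposite ways in L, so the realizable clock peaks where they cross:

```python
import math

# All parameter values are placeholders, not real material data.
v  = 1e8    # signal speed in the interconnect, m/s
k  = 100.0  # thermal conductivity, W/(m*K)
dT = 50.0   # allowed temperature rise, K
E  = 1e-15  # energy dissipated per dependent op, J

def clock(L):
    """Realizable dependent-op rate at feature size L (m): the tighter
    of the signal-transport and heat-transport limits."""
    return min(v / L, k * dT * L / E)

# The two limits cross at L* = sqrt(v*E / (k*dT)), the toy sweet spot.
L_star = math.sqrt(v * E / (k * dT))
print(f"sweet-spot length   L* = {L_star:.3e} m")
print(f"max dependent clock f* = {clock(L_star):.3e} Hz")
```

With real device physics the exponents would differ, but any pair of opposite-signed scalings in L produces the same qualitative peak at a finite length scale.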
It looks like there was a parallel to Moore's law - sort of - called Dennard scaling which said that you can scale transistors down without losing efficiency... which has since hit physical limits and broken down. en.wikipedia.org/wiki/Dennard...
November 28, 2025 at 6:09 PM
It's also interesting that the Landauer bound depends on temperature, so as you run hotter the same number of irreversible operations also costs you more. It feels like this design space should have been systematically mapped already - 0d, 1d, 2d, 3d, 4d scaling of computations.
November 28, 2025 at 6:03 PM
I wish I could find this again, but I recall a result that power consumption ~ clock speed^3, so more compute in fixed time is less efficient than the same calculation taking longer. I wonder if there's something like that for using fixed space, versus letting something sprawl?
November 28, 2025 at 6:00 PM
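If the recalled power ~ clock³ result holds (classic CMOS dynamic power is ~C·V²·f with supply voltage V scaled roughly in proportion to f, which gives the cube), then the energy to finish a fixed computation scales as clock², so slower really is cheaper:

```python
# Under the assumption power ~ f^3, energy for a fixed amount of work
# is power * runtime, and runtime ~ 1/f, so energy ~ f^2.
def relative_energy(f_ratio):
    """Energy for a fixed computation at clock f_ratio, relative to f_ratio = 1."""
    power = f_ratio ** 3
    runtime = 1.0 / f_ratio
    return power * runtime  # = f_ratio ** 2

print(relative_energy(0.5))  # half the clock -> a quarter of the energy
print(relative_energy(2.0))  # double the clock -> four times the energy
```

The fixed-space analogue the post asks about would need an equivalent empirical scaling law in volume rather than frequency; nothing here settles that.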
So for example, Irresponsible Captain Tylor... If they hadn't by the end suggested that, yes actually he's doing this silly-seeming but highly successful and perfectly timed stuff intentionally, then I would have been a lot more disappointed.
November 24, 2025 at 5:12 PM
See, if you do it that way where it's just accidental but turns out fine, it makes me lose interest. If it's intentional, with the anticipation that 'it will be a good decision' and via some insight that the reader has to then work to obtain for themselves, then that I like.
November 24, 2025 at 5:10 PM
I mean, here by construction on my part to hit 'yeah that's still the myth of Pandora, but with all the emotionally-motivated-but-bad decisions converted to actually-good-but-hard-to-see-and-uncomfortable decisions'. If you want more difference, I'm asking for kishotenketsu over dramatic convention.
November 24, 2025 at 4:54 PM
'In order to obtain hope, she had to navigate all of the evils in its way' has much more to it for me than 'look how foolish people are, we're all walking disasters, c'mon you know you'd do it too!'
November 24, 2025 at 4:43 PM
Or you tell a version of the story where Pandora was born into a world oppressed by gods and it turns out that stealing and releasing all of the evils is actually just a hard but fully intended trade-off in exchange for getting rid of the gods, and was actually the best option in a bad situation.
November 24, 2025 at 4:41 PM
Basically what I'm saying is that I'm pretty much done with the internal hazards part of that. They're not compelling to me anymore, just tedious.
November 24, 2025 at 4:38 PM
So in that case, those decisions may be true to the characters, but they can be emotionally not true to the reader - either at first, or even ever. So that's where the tension sits in those patterns - the reader has to come to terms with 'Wait, that's what they chose? How... and it worked?!'
November 24, 2025 at 3:56 PM