Corey
@kh0rish.bsky.social
Biologist, just fed up at this point because *gestures everywhere*
Pinned
So first, we gotta start digging into biochemistry, BUT we have to treat biochemistry like the finite-state machinery that it is. And not the abstract mathematical concept, but the simple machines of physics (lever, pulley, screw, etc.) combining to form active mechanisms like this lock:
Reposted by Corey
The term "thinking" does not refer to a conversation; a conversation is the *product* of human thought. Generating something that looks conversational does not imply that the generator "thinks" any more than printing an essay implies that a laser printer can write.
February 17, 2026 at 4:17 AM
These kids posting vids of themselves riding the school bus as it hits speed bumps at full steam and they all go flying is some of the funniest shit I've seen in a while
February 17, 2026 at 4:40 AM
Reposted by Corey
"socialized collective knowledge sources" is the solution in all these cases

AI is a perverted way of taking a Library of Alexandria-style resource and grinding it up to allow individual people to still Behave and Interact like they are the ones driving all the activity, and "not relying on anyone"
February 16, 2026 at 3:37 PM
Reposted by Corey
AI guys insisting on the inevitability of the tech are so irritating. every comparison they make is to something whose utility was immediately apparent. "it does things faster and we can fix its mistakes later and that'll be the job now" does not compare to "metronome that doesn't need winding"
February 16, 2026 at 3:20 PM
I'm gonna stay proud of this silly ass video I made for an undergrad neuroscience assignment about vision. Thank you Miss Frizzle
February 16, 2026 at 6:22 PM
Vibe coder joins company peddling vibes? Sounds familiar
steipete.me/posts/2026/o...
February 16, 2026 at 3:25 PM
Hot take:
"Nature vs Nurture" is dumb because they arent all that separable. The outside environment, which includes the nurturing, flows in and out of you all the time and it affects gene expression regularly. You also alter the environment which will then change you again afterwards.
February 16, 2026 at 6:45 AM
Reposted by Corey
Just say one thing that a program you are writing *does*. One thing.
February 15, 2026 at 7:18 PM
Reposted by Corey
Remember this every time an "AI" enthusiast claims that these products no longer produce output that doesn't match reality—that the problem is "basically solved", or that "you just need to write better prompts". This is not a bug, it's literally a feature
OpenAI “acknowledged in its own research that LLMs will always produce hallucinations due to fundamental mathematical constraints that cannot be solved through better engineering, marking a significant admission from one of the AI industry’s leading companies.”

You can’t trust chatbots.
OpenAI admits AI hallucinations are mathematically inevitable, not just engineering flaws
In a landmark study, OpenAI researchers reveal that large language models will always produce plausible but false outputs, even with perfect data, due to fundamental statistical and computational limi...
www.computerworld.com
February 15, 2026 at 9:52 PM
This is a pretty good book for seeing a lot of the variety and power of cellular activity. Just be wary of explanations that use “information” metaphors, because they muddy the picture, and it's rampant in books like this when they try to describe higher-level phenomena
jonlieffmd.com/book/the-sec...
February 15, 2026 at 6:08 PM
Reposted by Corey
Thank you. I was wondering also. Of course he was the primary focus. 🙄 The DEEP desire on the right, left and center to prioritize white guys and give them a redemption arc is pathological
February 15, 2026 at 1:32 PM
Reposted by Corey
Honestly, I think things would get A LOT clearer if everyone involved described the systems in terms of what they actually do, rather than via these wishful mnemonics.
February 15, 2026 at 4:43 AM
It “snowed” one goddamn day weeks ago, why the fuck is so much of it still here
February 15, 2026 at 3:02 PM
Techbros obsessed with mathematical realism will never grasp that the fundamental problem cells have to solve, the solutions themselves leading to intelligence/consciousness, is moving matter from one place to another. Stuff in and out of the cell, stuff around inside the cell, and the cell itself.
This is what happens when people think neurons are just doing a binary transmission of electricity around to "process information."
Meanwhile even lowly bacteria are making and deploying stuff like this:
February 14, 2026 at 11:50 PM
This is what happens when people think neurons are just doing a binary transmission of electricity around to "process information."
Meanwhile even lowly bacteria are making and deploying stuff like this:
February 14, 2026 at 8:37 PM
How is he both the only one they know but also the one ALL of them know? And they just regurgitate his work while throwing in some math/compsci language
I blame Daniel Dennett's empty bloviating on this topic, since he's invariably the only philosopher techbros have ever read on the topic. None of them can articulate an actual argument. They always end up appealing to undefinable "emergent" hand-waving. Computation is not consciousness, period.
February 14, 2026 at 4:53 PM
Delightful Derek
February 14, 2026 at 3:00 PM
I just saw a chicken taking care of puppies, I don't want to hear a goddamn thing about artificial intelligence in chatbots
February 13, 2026 at 7:39 PM
Reposted by Corey
It's always cool to see "intelligence" equated with believable language use when organismic intelligence (the only kind that matters) in every case except for *1* does not involve any use of language at all.
February 12, 2026 at 5:17 PM
Blows my mind how much people are still trying to use a statistical model of word proximity to do things they would NEVER try to do with something explicitly stated to be a statistical model of proximity.
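For anyone who hasn't seen one spelled out, here is a minimal, purely illustrative sketch of what a bare-bones "statistical model of word proximity" amounts to: count which words tend to follow which, then sample from those counts. The corpus and names below are made up for illustration; real systems are enormously larger and more elaborate, but this toy makes the category of thing visible.

```python
# Purely illustrative: a bare-bones "statistical model of word proximity",
# i.e. a bigram next-word sampler. Corpus and names are made up.
import random
from collections import Counter, defaultdict

corpus = ("the cell moves matter the cell moves itself "
          "the cell builds machines that move matter").split()

# Count how often each word follows each other word.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def next_word(word):
    """Pick a next word in proportion to how often it followed `word`."""
    counts = following[word]
    if not counts:
        return random.choice(corpus)  # dead end: fall back to any word
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# Generate a short, plausible-looking continuation. That's all it does:
# proximity statistics, no model of what any of the words refer to.
out = ["the"]
for _ in range(10):
    out.append(next_word(out[-1]))
print(" ".join(out))
```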
February 10, 2026 at 4:22 PM
He can't be serious.
Embarrassing.
It was this post, right?
February 10, 2026 at 4:15 PM
Reposted by Corey
It took me longer than it should have to realize that tech companies hire people with advanced degrees in part to give a patina of scientistic rigor to their shady practices. Many of these same companies donated to Trump's inauguration fund, contributing to the admin's attack on education and research
“She compares her work to the efforts of a parent raising a child. She’s training Claude to detect the difference between right and wrong while imbuing it with unique personality traits.”
This Philosopher Is Teaching AI to Have Morals
The tech company has entrusted the philosopher to endow its chatbot with a sense of right and wrong.
www.wsj.com
February 9, 2026 at 6:42 PM
Reposted by Corey
AI "impedes [theory because we're] interested in human-understandable theory and theory-based models, not statistical models which provide only a representation of the data. Scientific theories and models are only useful if [we understand them and] they connect transparently to research questions."
Not directly relevant, but medicalisation is also a strategy, perhaps useful: see bsky.app/profile/oliv...
February 8, 2026 at 7:33 AM
Reposted by Corey
No, the brain is nothing like a computer, or to put it the other way, a computer is nothing like a brain.

A lot to #Unlearn from 20th-century thinking.

"Smuggles confidence into biology, where change is slower, messier and often incomplete"
aeon.co/essays/what-...
What the metaphor of ‘rewiring’ gets wrong about neuroplasticity | Aeon Essays
The metaphor of rewiring offers an ideal of engineered precision. But the brain is more like a forest than a circuit board
aeon.co
February 7, 2026 at 3:04 PM
It says everything really
Yes, but it is kind of saying something that a human on the other side of the world, with absolutely no additional input but the same sensors and data the AI has access to on a screen, solves a problem the AI does not.
February 6, 2026 at 4:08 PM