Alex Wellerstein
@wellerstein.bsky.social
Nuclear historian. Professor at Stevens Institute of Technology. Visiting researcher at Nuclear Knowledges program, Sciences Po (Paris). Author of THE MOST AWFUL RESPONSIBILITY (2025). Creator of NUKEMAP. Blogging at https://doomsdaymachines.net.
All of the above is much more important than their ability to tell you how to separate out plutonium-239 from irradiated fuel, which is only dangerous if you have the irradiated fuel in the first place, and a chatbot cannot produce that for you.

Mere facts are not the things to be afraid of, here!
November 28, 2025 at 10:36 AM
They are undermining the education of a generation. They are doing vast harm to mental health. They are used to produce disinformation and propaganda on vast scales. They are degrading services and throwing labor markets into chaos. They are going to crash the economy.
November 28, 2025 at 10:36 AM
My issue here is that focusing on fake threats like "an AI will help you make nuclear weapons" obscures the actual harm that these chatbots are doing, which is much more mundane and cannot be solved by simple "guardrails," even if these companies were interested in putting them into place.
November 28, 2025 at 10:36 AM
The specific example from the paper, about separating plutonium-239 from irradiated fuel, is literally the subject of a patent declassified and issued by the US government in 1965. It has not been a legal secret for 60 years. patentimages.storage.googleapis.com/60/78/1f/947...
November 28, 2025 at 10:36 AM
What is frustrating about this story is that it implies that the information that a chatbot can give you about making a nuclear weapon is separate from what you can just find online anyway. While I appreciate that the point is that "guardrails" are flawed, it is still a silly example.
November 28, 2025 at 10:36 AM
It's amusing/frustrating to me that with all of the clever innovations in computer vision, etc., OCR seems not all that much better than it was 15 years ago. The full text of the patent is linked as the PDF: patentimages.storage.googleapis.com/60/78/1f/947...
November 21, 2025 at 7:30 PM
I don't have room for it in my apartment. But if an anonymous donor bought it for me, I guarantee it would be entertaining watching me try to make room for it.
November 21, 2025 at 4:29 PM
Thank god little Bobby can't extract plutonium-239 from his spent reactor fuel
November 21, 2025 at 8:56 AM
I note that one of their "harmful" examples is about how to produce plutonium-239... the basics have been declassified since 1945; the specific processes have been declassified and even patented since the 1960s: patents.google.com/patent/US319...
US3190804A - Method for producing, separating, and purifying plutonium - Google Patents
November 20, 2025 at 11:18 PM
eh, it is what it is! I'm happy-enough with the title we ended up with.
November 20, 2025 at 4:59 PM
Title is very definitely not changeable at this point, and (as you may know), authors only have so much input into their titles, anyway...!
November 20, 2025 at 4:42 PM
The costs and risks of launching supervillains into the Sun always outweigh the benefits. It is entirely possible to dispose of them using permanent underground interment. The technical difficulties of guaranteeing they cannot escape need to be taken seriously, of course.
November 20, 2025 at 3:33 PM
I have in fact seen many bad pie charts with visualizations inside of them... presumably to spice them up...
November 20, 2025 at 3:31 PM
lukewarm regards...
scalding regards...
tepid regards...

we need a standardized scale... 73ºF regards...
November 19, 2025 at 6:20 PM