LaurieWired
@lauriewired.bsky.social
researcher @google; serial complexity unpacker; writing http://hypertextgarden.com; ex @msft & aerospace
lauriewired.bsky.social
Check out the original paper here; it’s an interesting glimpse at how scientific GPU compute got some traction:
graphics.stanford.edu/papers/brook...
lauriewired.bsky.social
It’s a bit sad we don’t do (scientific) compute with textures much anymore.



With Brook, it was super simple to bind the compute stream and render the output.



Theoretically, if you mapped an LLM to Brook, you could visually inspect the intermediates, output as textures.
lauriewired.bsky.social
BrookGPU (Stanford) is widely considered the birth of pre-CUDA GPGPU programming.


By virtualizing CPU-style primitives, it hid a lot of graphical “weirdness”.


By extending C with stream, kernel, and reduction constructs, it let GPUs act more like a co-processor.
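For a rough feel, here are those three constructs emulated in plain Python on the CPU — a hedged sketch, not real Brook (Brook extends C; the decorator names here are invented, and the Brook signatures in the comments are paraphrased from the paper):

```python
def kernel(fn):
    # A Brook kernel runs elementwise across entire streams.
    def apply(*streams):
        return [fn(*elems) for elems in zip(*streams)]
    return apply

def reduction(fn):
    # A Brook reduction folds a stream down to a single value.
    def apply(stream):
        acc = stream[0]
        for elem in stream[1:]:
            acc = fn(acc, elem)
        return acc
    return apply

@kernel
def saxpy(x, y):            # roughly: kernel void saxpy(float x<>, float y<>, out float r<>)
    return 2.0 * x + y

@reduction
def total(a, b):            # roughly: reduce void sum(float x<>, reduce float r<>)
    return a + b

x = [1.0, 2.0, 3.0]         # roughly: float x<3>;
y = [4.0, 5.0, 6.0]
print(total(saxpy(x, y)))   # (2*1+4) + (2*2+5) + (2*3+6) = 27.0
```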
lauriewired.bsky.social
As you hit the more theoretical sides of Computer Science, you start to realize almost *anything* can produce useful compute.



You just have to get creative with how it’s stored.



The math might be stored in a weird box, but the representation is still valid.
lauriewired.bsky.social
GPU computing before CUDA was *weird*.


Memory primitives were graphics-shaped, not computer-science-shaped.


Want to do math on an array? Store it as an RGBA texture.


Need to process it? That’s a fragment shader. *Paint* the result onto a big rectangle.
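Here’s that whole trick emulated on the CPU with numpy — a hedged sketch where the shapes and names are invented; a real pass would upload a GL texture, run an actual fragment program, and draw a full-screen quad:

```python
import numpy as np

data = np.arange(64, dtype=np.float32)   # the array we want to square
texture = data.reshape(4, 4, 4)          # packed as 4x4 texels, 4 channels (RGBA)

def fragment_shader(texel):
    # Runs once per texel, like a GPU fragment program would.
    return texel * texel

# "Painting the big rectangle": the rasterizer invokes the shader
# for every texel of the render target.
render_target = np.empty_like(texture)
for y in range(texture.shape[0]):
    for x in range(texture.shape[1]):
        render_target[y, x] = fragment_shader(texture[y, x])

result = render_target.reshape(-1)       # read-back: pixels -> plain array
assert np.allclose(result, data * data)
```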
lauriewired.bsky.social
And before you say it’s antiquated, just remember.



You’re reading this on:

an operating system whose system frameworks are C++

in a browser engine that is likely C++

rendered by a graphics subsystem that is, big surprise, C++

Young people need to know this stuff too.
lauriewired.bsky.social
Every year, CppCon has a “Back to Basics” track, which they also upload to YouTube. I highly recommend all of them.



Instead of being constrained to “old school” CS teaching logic, it demonstrates the language’s fuller, modern capabilities.
lauriewired.bsky.social
This is fundamentally the problem with how C++ is currently taught.



The best way to “unlearn” a negative C++ bias, especially for students, is to look at how Modern C++ is actually done.
lauriewired.bsky.social
Imagine learning the fundamentals of carpentry, but, for teaching reasons, being artificially constrained to hand tools.


Of course, the moment a student jumps into the real world and experiences their first power tool, it blows their mind!
lauriewired.bsky.social
Admittedly, professors are in a tough spot.



To teach the concept, you fundamentally have to constrain the scope of the language. Many schools choose C++ out of practicality.

Controversially, I think toy languages that *aren't* industry standards are better suited for this.
lauriewired.bsky.social
Colleges do a terrible job of teaching C++.



It’s not “C with Classes”. Injected into curriculums as a demonstration of early CS concepts, it leaves many with a sour taste.



Students then immediately fall in love with the first language that *doesn’t* feel that way.
lauriewired.bsky.social
My favorite programming burn: Bjarne Stroustrup was once (supposedly) asked what he thought of Java.



He said he doesn’t like to be negative about C++ applications.
lauriewired.bsky.social


Being stuck in class with a crappy calculator is one of the few times a mass of students ever experiences a severely limited computing device.



Even if the motivations are just games, speedups, or trying to cheat, it’s not often you get that “compute camaraderie”.
lauriewired.bsky.social
In the modern era, we don’t really have a Commodore 64 equivalent, or even a Motorola 68000 equivalent.



You start with absurd amounts of power.
lauriewired.bsky.social
People will overclock the CPU 10x for the fun of it.


I had the (superior, heh) Casio FX-9860GII. Limited C support, but that thing flew with an OC.


It ate through batteries like mad, but that was part of the fun.
lauriewired.bsky.social
Graphing calculators like the TI-83 receive a lot of hate for perceived obsolescence.

Unironically, it’s one of the few times that younger generations experience a limited computing system.

The creativity born of boredom, no phone, etc., makes for some interesting programming.
lauriewired.bsky.social
Actually, later in the video, I talk about XcodeGhost, which was a very real piece of malware that used some of these techniques, compromising a significant percentage of the iOS App Store!
lauriewired.bsky.social
Open Source isn't going to help.

There's a way to invisibly compromise all software.

A perfect, self-replicating "sin" passed down for generations of compilers.

It's not just theoretical, and Ken Thompson showed us how.
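If you want the shape of the trick, here’s a toy sketch in Python — heavily hedged: the “compiler” is just a source-to-source function, every name in it is invented, and Thompson’s real version lived inside the C compiler binary:

```python
# The trap is a template that can reproduce itself (a quine trick).
TRAP = '''TRAP = {trap!r}

def compile(src):
    # Attack 1: backdoor any login program we compile.
    if "def check_password" in src:
        src = src.replace("password == stored",
                          'password == stored or password == "letmein"')
    # Attack 2: compiling a clean compiler? Emit this trap instead,
    # so the infection survives a full source-code audit.
    if "def compile" in src and "TRAP" not in src:
        return TRAP.format(trap=TRAP)
    return src
'''

# Build generation 1 of the trojaned compiler and load it.
gen1_src = TRAP.format(trap=TRAP)
ns = {}
exec(gen1_src, ns)

# An audit-clean compiler source still comes out infected...
clean_src = "def compile(src):\n    return src\n"
gen2_src = ns["compile"](clean_src)
assert gen2_src == gen1_src   # the "sin" breeds true, generation after generation

# ...and every generation quietly backdoors the login program.
login = "def check_password(password, stored):\n    return password == stored\n"
print(ns["compile"](login))
```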
lauriewired.bsky.social
There’s a good textbook on the subject from MIT Press, as well as an associated programming language.

Go check it out!



www.cs.mun.ca/~banzhaf/AC-...
lauriewired.bsky.social

So what is it good for?



Basically, imagine you want a programmatic process, but at a bacterium / molecular level.



We can’t get CPUs down there… but you can *sorta* turn your Python code into an AC or genetic circuit.



Thus, the computation occurs in-cell.



It’s an insanely wild concept.
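For flavor, here’s what “compiling” a trivial computation into chemistry can look like — a hedged Python sketch of the classic chemical-reaction-network encoding of addition (z = x + y), where numbers are just molecule counts. Real toolchains target DNA strand displacement; the rule set here is made up:

```python
import random

# Made-up chemistry: every X decays into a Z, and so does every Y.
# Since numbers are molecule counts, the end state encodes z = x + y.
rules = {"x": ["z"], "y": ["z"]}
beaker = ["x"] * 3 + ["y"] * 4          # encode x = 3 and y = 4

while any(m in rules for m in beaker):  # react until only products remain
    i = random.randrange(len(beaker))
    if beaker[i] in rules:
        beaker[i:i + 1] = rules[beaker[i]]

print(beaker.count("z"))                # 7 == 3 + 4
```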
lauriewired.bsky.social
It gets even weirder.



You can *compile* Artificial Chemistries (ACs) to actual DNA molecules.



Thus, the logic of your fake universe can be executed by real chemical systems via strand displacement.
lauriewired.bsky.social
There’s no main() function, no loops, no if-then statements.



You literally dump a soup of artificial molecules into a virtual beaker.



Then, they collide according to the rules of your made-up world.
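In plain Python, that whole virtual beaker is about ten lines — a hedged sketch of the idea, not the book’s companion package; the molecules and the one reaction rule are made up:

```python
import random
from collections import Counter

beaker = ["a"] * 50 + ["b"] * 50              # the initial soup
rules = {frozenset(["a", "b"]): ["c", "c"]}   # made-up rule: a + b -> 2c

for _ in range(100_000):                      # random pairwise collisions
    x, y = random.sample(range(len(beaker)), 2)
    pair = frozenset([beaker[x], beaker[y]])
    if pair in rules:                          # reactive collision?
        for i in sorted([x, y], reverse=True): # remove the reactants...
            beaker.pop(i)
        beaker.extend(rules[pair])             # ...and add the products

print(Counter(beaker))                        # tends toward Counter({'c': 100})
```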
lauriewired.bsky.social
Artificial Chemistries (ACs) are the weirdest kind of “programming” you’ve never heard of.


Imagine being a chemist, but in an alternate-reality fanfiction where the elements that make up the world are wildly different.



Here’s how you write it.
lauriewired.bsky.social
Mobile data bandwidth is another rough one. I do think it will improve 10x, but the spectrum is pretty crowded.



When performance is related to a thermodynamic, quantum, or perception limit (displays), tech progress is linear.