Alex Albaugh
@alexalbaugh.com
Chemical engineer, sort of.
Assistant professor @ Wayne State
albaugh.eng.wayne.edu
We're sharing our latest work at AIChE 2025 and with pre-prints! Hear Ali on mechanically interlocked polymers on Mon. or read it (arxiv.org/abs/2506.13968), read our foray into catalysis (chemrxiv.org/engage/chemr...), or hear me on WSU's ML for ChE class on Tues. and molecular motors on Wed.
November 2, 2025 at 1:06 AM
Happy Battle of Lake Erie Day to those who celebrate.
September 10, 2025 at 2:03 PM
The car has a drag coefficient of 0.3, a cross-sectional area of 2.5 m^2, and weighs 4500 lbs. With this and some reasonable estimates for the density of air and rolling friction coefficient, at a steady 30 mph air resistance is 80 N (18 lbf) and rolling resistance is 200 N (45 lbf).
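The arithmetic checks out; here's a quick sketch of it in Python. The air density and rolling-friction coefficient are the "reasonable estimates" assumed here, not values from the post.

```python
# Drag and rolling-resistance estimate for the car described above.
rho = 1.2                 # air density, kg/m^3 (assumed)
c_d = 0.3                 # drag coefficient (from the post)
area = 2.5                # cross-sectional area, m^2
mass = 4500 * 0.4536      # 4500 lb converted to kg
c_rr = 0.01               # rolling-friction coefficient (assumed)
v = 30 * 0.44704          # 30 mph converted to m/s
g = 9.81                  # gravitational acceleration, m/s^2

f_drag = 0.5 * rho * c_d * area * v**2   # aerodynamic drag, N
f_roll = c_rr * mass * g                 # rolling resistance, N
print(f"drag: {f_drag:.0f} N, rolling: {f_roll:.0f} N")
```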
June 18, 2025 at 3:38 PM
Me at the beginning of the semester vs. me at the end of this semester.
April 30, 2025 at 12:58 AM
Is there anything in the middle?
April 21, 2025 at 4:42 PM
This evidence is from NYT's lawsuit against OpenAI for copyright infringement. Is ChatGPT just regurgitating copyrighted material with some randomness sprinkled on top? Kinda. Like any machine learning model, ChatGPT learns by example and its outputs are going to look like what it has seen before.
April 18, 2025 at 3:55 PM
What happens if you turn the temperature down on ChatGPT? The New York Times found out that it can reproduce their articles word for word. The most likely predictions for certain prompts are just copies of copyrighted material.
April 18, 2025 at 3:55 PM
They're the same*! By adjusting an LLM's temperature you adjust how it selects words. A high-temperature LLM will select more broadly from the distribution, making stranger choices more likely. A low-temperature LLM will usually just select the most likely word.
April 18, 2025 at 3:55 PM
Converting from activations in the LLM neural network to probabilities for the next word is done with a softmax function. The softmax has a temperature parameter that adjusts the resulting probabilities. Compare the softmax function on the left with the Boltzmann distribution on the right.
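A minimal sketch of that softmax-with-temperature step, with toy activations assumed for illustration. Dividing by T before exponentiating is what makes it look like the Boltzmann distribution, with the activations playing the role of negative energies:

```python
import math

def softmax(logits, temperature=1.0):
    """Turn network activations into next-word probabilities.
    Dividing by temperature mirrors the -E/kT in the Boltzmann distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                          # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [3.0, 1.0, 0.5]   # toy activations for three candidate words

cold = softmax(logits, temperature=0.1)   # near-argmax: top word dominates
warm = softmax(logits, temperature=1.0)   # the usual distribution
hot = softmax(logits, temperature=10.0)   # nearly uniform: strange picks likely
```

At low temperature nearly all the probability collapses onto the single most likely word, which is what makes verbatim reproduction possible.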
April 18, 2025 at 3:55 PM
LLMs generate a probability distribution for what the next word should be. Then they make a randomly weighted selection from the options. This is why you won't get the same answer twice from ChatGPT if you ask it the same question. There's built-in randomness to make it feel like natural language.
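That weighted selection might look like this; the candidate words and probabilities here are made up for illustration:

```python
import random

# Hypothetical next-word distribution after a prompt like "The cat sat on the ..."
next_word_probs = {"mat": 0.5, "floor": 0.3, "moon": 0.15, "spoon": 0.05}

words = list(next_word_probs)
weights = list(next_word_probs.values())

rng = random.Random(0)   # seeded here only so the sketch is repeatable
# Each call can pick a different word -- the built-in randomness.
samples = [rng.choices(words, weights=weights)[0] for _ in range(1000)]
```

Likely words come up often, unlikely words rarely, and asking twice can give different answers.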
April 18, 2025 at 3:55 PM
A large language model (greatly simplified) tries to predict the next word. And it does this over and over. So if you give it a prompt, it will try to predict the next word. And then the next one. And then the next one.
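Greatly simplified indeed; here's a toy sketch of that loop using a made-up bigram "model." The word table is invented for illustration, and a real LLM predicts over tens of thousands of tokens rather than a handful of words:

```python
import random

# Hypothetical bigram table: current word -> distribution over next words.
model = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "dog": {"sat": 0.5, "ran": 0.5},
    "sat": {"down": 1.0},
    "ran": {"away": 1.0},
}

def generate(prompt, steps, rng):
    """Predict the next word, then the next one, then the next one."""
    words = [prompt]
    for _ in range(steps):
        dist = model.get(words[-1])
        if dist is None:               # no known continuation: stop
            break
        choice = rng.choices(list(dist), weights=list(dist.values()))[0]
        words.append(choice)
    return " ".join(words)

text = generate("the", 3, random.Random(0))
```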
April 18, 2025 at 3:55 PM
The mathematical form of this probability is called the Boltzmann distribution.
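For reference, that form: the probability of finding the system in state i with energy E_i at temperature T is

```latex
p_i = \frac{e^{-E_i / k_B T}}{\sum_j e^{-E_j / k_B T}}
```

where k_B is Boltzmann's constant; the sum in the denominator (the partition function) just normalizes the probabilities to one.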
April 18, 2025 at 3:55 PM
Say you've got a system that can be in states with different energies. Which state is the system most likely to be in? The state with the lowest energy. But the rest depends on temperature! Lower temperatures make other states less likely and higher temperatures make other states more likely.
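A tiny numeric sketch of that temperature dependence, with a made-up three-state system (energies in units where k_B = 1):

```python
import math

def boltzmann(energies, temperature):
    """Boltzmann probabilities p_i proportional to exp(-E_i / T), k_B folded into T."""
    weights = [math.exp(-e / temperature) for e in energies]
    z = sum(weights)                  # partition function (normalization)
    return [w / z for w in weights]

energies = [0.0, 1.0, 2.0]            # toy three-state system (assumed)

cold = boltzmann(energies, 0.2)       # low T: ground state dominates
hot = boltzmann(energies, 5.0)        # high T: other states become likely
```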
April 18, 2025 at 3:55 PM