Philipp Wacker
@phkwacker.bsky.social
Senior Lecturer in Statistics, University of Canterbury (NZ).
he/him
Bayesian stats, Applied Math, filtering & sampling
pls send me awesome vegan waffle recipes!
August 2, 2025 at 6:44 AM
The last few weeks I have been using our fantastic UC makerspace (with its 3d printers, soldering iron, and tools) to print the OpenFlexure Project Manual Microscope. Kudos to @openflexure.bsky.social for an amazing open source design. Looking forward to looking at pond water and earth critters.
July 29, 2025 at 7:46 AM
I found more!
July 17, 2025 at 9:42 PM
Next time I run out of symbols/letters for a paper, I will take my cue from the ridiculous stuff economists seem to get away with. This is all from a single publication.
July 17, 2025 at 2:34 AM
What is the correct metre for the "woof, woof, woof, woof" part of "Who let the dogs out"? This has been on my mind too much recently, so here's my attempt. You're welcome, internet. (or am I wrong?)
June 3, 2025 at 6:14 AM
the sound that you're hearing is global maths research grinding to a halt
May 14, 2025 at 8:01 AM
New dating platform for tradies just dropped
May 13, 2025 at 8:23 PM
One of the most fun things about reading comic books in languages you are trying to learn is that you pick up "comic speak", i.e. other languages' equivalents of "kaboom" and the like.
April 27, 2025 at 5:06 AM
Finally time for baking over the holidays. Rye/wheat
April 20, 2025 at 9:50 AM
In fact, let's focus on this part: we should read this as something of the form f(C_1) - f(C_0) - Df(C_0)[C_1 - C_0], where f(C) = ln(det(C)) (and C instead of Sigma). I believe this reflects the general connection between the KL divergence and Bregman divergences (en.wikipedia.org/wiki/Bregman...).
March 24, 2025 at 9:52 AM
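A quick numerical sketch of the observation in this thread (my own illustration; the matrices and helper names are invented): for zero-mean Gaussians, the closed-form KL divergence is exactly half the Bregman divergence of the convex function f(C) = -ln(det(C)) (the post's f with the sign flipped, so that f is convex) between the two covariances.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 3

# two random symmetric positive definite covariance matrices (made up for the demo)
A = rng.standard_normal((k, k))
B = rng.standard_normal((k, k))
S0 = A @ A.T + k * np.eye(k)
S1 = B @ B.T + k * np.eye(k)

def kl_gauss(S0, S1):
    """Closed-form KL(N(0, S0) || N(0, S1))."""
    _, ld0 = np.linalg.slogdet(S0)
    _, ld1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(np.linalg.inv(S1) @ S0) - k + ld1 - ld0)

def bregman(C1, C0):
    """Bregman divergence of f(C) = -ln(det(C)), with Df(C0)[H] = -tr(C0^{-1} H)."""
    _, ld0 = np.linalg.slogdet(C0)
    _, ld1 = np.linalg.slogdet(C1)
    return (-ld1) - (-ld0) + np.trace(np.linalg.inv(C0) @ (C1 - C0))

# the matrix terms of the KL formula are exactly half this Bregman divergence
print(np.isclose(kl_gauss(S0, S1), 0.5 * bregman(S0, S1)))  # → True
```

Writing out bregman(S0, S1) term by term gives ln(det S1) - ln(det S0) + tr(S1^{-1} S0) - k, which is precisely twice the zero-mean KL expression.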
This makes sense if we rewrite the expression a bit:
March 24, 2025 at 9:52 AM
Something cool I understood this week: The Kullback-Leibler divergence between two multivariate Gaussian distributions has a closed-form expression. The term with the means somehow makes sense: penalise a covariance-weighted squared distance between the respective means. But all these matrix terms?
March 24, 2025 at 9:52 AM
Ah, very interesting! Look at this: this is the output of $Y_i Y_i^\top$ versus $Y_i {Y_i}^\top$ (note the braces).
March 23, 2025 at 11:11 PM
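The difference in this thread comes down to how TeX attaches scripts (a minimal example of my own):

```latex
% _i and ^\top both attach to the same atom Y, so they stack vertically:
$Y_i Y_i^\top$

% ^\top attaches to the braced group {Y_i}, so it is placed after the whole
% subscripted expression, shifting it and changing the alignment:
$Y_i {Y_i}^\top$
```

In the first version the subscript and superscript share one atom; in the second, the braces make TeX treat Y_i as a finished unit before applying the superscript.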
What's happening with the indices here? Why are they not aligned?
March 23, 2025 at 10:06 PM
Ominous. (Seen in Sydney)
March 13, 2025 at 9:15 AM
Same energy
February 24, 2025 at 7:11 AM
I (think I) finally figured out a Venn diagram for all these pesky set systems: sigma algebra, lambda-system, algebra, monotone class, pi-system!
February 6, 2025 at 9:53 AM
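As a toy companion to the diagram post above (my own sketch; the finite family and helper names are invented): on a finite universe the defining closure properties of these set systems can be checked directly. The family below is a pi-system but not a lambda-system; the countable-increasing-union clause of a lambda-system is vacuous on a finite universe, so only complements and proper differences are tested.

```python
from itertools import combinations

universe = frozenset({1, 2})
# hypothetical example family: closed under intersection, not under complements
family = {frozenset(), frozenset({1}), universe}

def is_pi_system(fam):
    """Closed under pairwise intersection."""
    return all(a & b in fam for a, b in combinations(fam, 2))

def is_lambda_system(fam):
    """Contains the universe, closed under complements and proper
    differences A \\ B for B ⊆ A (finite check: unions clause omitted)."""
    if universe not in fam:
        return False
    if any(universe - a not in fam for a in fam):
        return False
    return all(a - b in fam for a in fam for b in fam if b <= a)

print(is_pi_system(family), is_lambda_system(family))  # → True False
```

Here the complement of {1} is {2}, which is missing from the family, so the lambda-system check fails while all pairwise intersections stay inside it.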
Don't look back, you'll trip over Michael Caine!
December 27, 2024 at 7:55 PM
More farm animal math!
December 18, 2024 at 10:58 PM
Bonus image
December 18, 2024 at 6:39 AM
I can't get over how these cats are drawn. Pure nightmare fuel
December 18, 2024 at 6:39 AM
Apparently, math conferences go like this: it's raining, everyone sits or stands in a puddle while some dude tries to convince the room that pi = 3.14, to general confusion, while the notes people take dissolve in the puddle, letters and numbers floating away.
December 13, 2024 at 10:09 PM
Summarising: the three models have different bias-variance decompositions. The constant model has considerable bias (its red point cloud is not centred on the ground truth) but low variance (small spread). Order 5 has minimal bias but high variance. The quadratic model looks like a good compromise.
November 18, 2024 at 12:03 AM
And this is for 1000 independent polynomial models of order 5.
November 18, 2024 at 12:03 AM
Here's the same thing for 1000 independent "quadratic" models of the form f(x) = b0 + b1*x + b2*x^2.
November 18, 2024 at 12:03 AM
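The experiment in this thread can be sketched roughly like this (my own reconstruction; the ground-truth function, noise level, and design grid are invented stand-ins, not the ones behind the actual plots): refit each polynomial model on many independent noisy datasets and look at the spread and centring of the predictions at one test point.

```python
import numpy as np

rng = np.random.default_rng(1)

def f_true(x):
    return np.sin(2 * x)               # hypothetical ground truth

x_grid = np.linspace(0.0, 2.0, 20)     # fixed design points
x_test, n_models, sigma = 1.0, 1000, 0.3

results = {}
for degree in (0, 2, 5):               # constant, quadratic, order 5
    preds = np.empty(n_models)
    for i in range(n_models):
        # fresh noisy dataset and an independent fit for each model
        y = f_true(x_grid) + sigma * rng.standard_normal(x_grid.size)
        coeffs = np.polyfit(x_grid, y, degree)
        preds[i] = np.polyval(coeffs, x_test)
    # bias-variance decomposition of the prediction at x_test
    bias = preds.mean() - f_true(x_test)
    variance = preds.var()
    results[degree] = (bias, variance)
    print(f"degree {degree}: bias {bias:+.3f}, variance {variance:.4f}")
```

The constant model's predictions cluster tightly but far from f_true(x_test) (high bias, low variance), while the order-5 model is nearly unbiased but scatters much more, matching the point clouds in the plots.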