Edward Kmett
@kmett.ai
Founder/Chief Scientist @ positron.ai

I like to write about Haskell, category theory, AI and safety, and a whole lot of low-level SIMD stuff for some reason

http://calendly.com/ekmett
http://github.com/ekmett
http://x.com/kmett
http://comonad.com/reader
To celebrate that writeup, the folks in the lab put together a schlocky little promo spot, which I couldn't resist sharing here. [No production cards were harmed during the filming of this video.]

Apologies for the Twitter link, but the video was over the local three-minute limit.

x.com/kmett/status...
Edward Kmett on X: "GPUs made training massive models possible, but inference needs better memory capacity, memory bandwidth utilization, more power efficiency, and an architecture built bottom up with transformers in mind. To that end, I'm excited to share that Positron just raised a $51.6M Series https://t.co/5NjFMjZ9FP" / X
July 28, 2025 at 5:39 PM
I'm proud of what the Positron AI team has accomplished and I'm grateful to our investors Valor Equity Partners, Atreides Management, and DFJ Growth for backing us!
July 28, 2025 at 5:26 PM
We offer solutions that are already deployed and are shipping today, and we've designed new silicon to reshape AI inference—reducing cost, drastically cutting power consumption, and unlocking entirely new capabilities in production AI environments.
July 28, 2025 at 5:26 PM
Not sure any of that was useful, but that's pretty much everything that came to mind offhand.
February 1, 2025 at 8:46 PM
Another, more traditional tool for this sort of thing, where information flows both ways, is an attribute grammar: synthesized attributes pull information up through a syntax tree, and inherited attributes push information down.
February 1, 2025 at 8:46 PM
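The classic small instance of this two-way flow is Bird's repmin, where the tree's minimum is synthesized upward and then inherited back down in a single lazy pass. A minimal sketch (not tied to any particular attribute-grammar system):

```haskell
-- repmin: replace every leaf with the global minimum in one traversal.
-- The minimum is a synthesized attribute (computed bottom-up) that is
-- fed back in as an inherited attribute (pushed top-down) by tying the knot.
data Tree = Leaf Int | Fork Tree Tree
  deriving (Show, Eq)

repmin :: Tree -> Tree
repmin t = t'
  where
    -- Lazily feed the synthesized minimum m back in as the inherited value.
    (m, t') = go t m
    go (Leaf n)   i = (n, Leaf i)
    go (Fork l r) i = (min ml mr, Fork l' r')
      where (ml, l') = go l i
            (mr, r') = go r i

main :: IO ()
main = print (repmin (Fork (Leaf 3) (Fork (Leaf 1) (Leaf 4))))
-- Fork (Leaf 1) (Fork (Leaf 1) (Leaf 1))
```

Laziness is what lets a single traversal both compute the minimum and distribute it; a strict language would need two passes or explicit circularity.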
When constructing ZK proofs for a lazy language, I also wind up in something like this situation, where I might squirrel away info about what thunks are going to be used in a pre-run, so I can rearrange calculations to the let binding site of the thunks I actually need, rather than the case site.
February 1, 2025 at 8:46 PM
Another way to think of this is in terms of modalities. The Tardis gives you access to both forward- and backward-facing information about contraction and weakening, but its use here ties this to situations where you've committed hard to your inputs, unlike a traditional type-checking-time pass.
February 1, 2025 at 8:46 PM
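To make the two-way flow concrete, here is a hand-rolled mini-Tardis. It is only a sketch of the idea behind Control.Monad.Tardis (whose real API is richer; `getFuture` and `modifyBackwards` mirror their namesakes there): the forward state travels down the computation as usual, while the backward state travels from the end back toward the beginning via lazy knot-tying.

```haskell
newtype Tardis bw fw a = Tardis { runTardis :: (bw, fw) -> (a, (bw, fw)) }

instance Functor (Tardis bw fw) where
  fmap f (Tardis g) = Tardis $ \s -> let (a, s') = g s in (f a, s')

instance Applicative (Tardis bw fw) where
  pure a = Tardis $ \s -> (a, s)
  mf <*> mx = mf >>= \f -> fmap f mx

instance Monad (Tardis bw fw) where
  -- The forward state flows m -> f a; the backward state flows f a -> m.
  -- Lazy patterns keep the mutual recursion productive.
  m >>= f = Tardis $ \ ~(bwIn, fwIn) ->
    let (a, (bwOut, fwMid)) = runTardis m     (bwMid, fwIn)
        (b, (bwMid, fwOut)) = runTardis (f a) (bwIn,  fwMid)
    in (b, (bwOut, fwOut))

-- Read the backward state: information sent from the future.
getFuture :: Tardis bw fw bw
getFuture = Tardis $ \ ~(bw, fw) -> (bw, (bw, fw))

-- Transform the information flowing toward the past.
modifyBackwards :: (bw -> bw) -> Tardis bw fw ()
modifyBackwards f = Tardis $ \ ~(bw, fw) -> ((), (f bw, fw))

-- Annotate each element with the sum of itself and everything after it,
-- in one left-to-right pass.
suffixSums :: [Int] -> [Int]
suffixSums xs = fst (runTardis (traverse step xs) (0, ()))
  where
    step x = do
      modifyBackwards (x +)  -- tell the past that x is included from here on
      rest <- getFuture      -- learn the sum of everything after x
      pure (x + rest)

main :: IO ()
main = print (suffixSums [1, 2, 3, 4])  -- [10,9,7,4]
```

As with repmin, the trick only works because nothing forces the future-facing value before the pass that produces it has been set up.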
mail.haskell.org/pipermail/ha... was me doing something kind of similar using Codensity STM rather than Tardis to look forward into the future to determine if a reference was used.
[Haskell-cafe] Explicit garbage collection
mail.haskell.org
February 1, 2025 at 8:46 PM
One-bit reference counts can be refreshed by a custom GC, ensuring a tight analysis, and I've used them to patch up a shoddy type-checking-time uniqueness check. OTOH you seem to be doing this after being given user values, so you kind of muddy the waters relative to how I usually see it done.
February 1, 2025 at 8:46 PM
I don't really like the precision I usually see with uniqueness analysis done at compile time, for instance, and that looks _kind_ of like what you are doing. (caveats later)
February 1, 2025 at 8:46 PM
I have a scattershot of thoughts, and it's hard to fit them into a single message here, so I apologize if I wander. Still trying to learn how to use the multi-comment editing features on bluesky, so I might make a right hash of THAT as well. Here goes:
February 1, 2025 at 8:46 PM
You are looking at a particular bit of information (its generating function) which you can extract from an arbitrary Joyal-style combinatorial species. Most models of ADTs limit themselves to a nicely closed subset of species that generate merely ordinary generating functions, though.
January 14, 2025 at 7:53 AM
You're basically describing (finite) data structures in terms of their associated ordinary generating functions.

comonad.com/reader/2008/...

You can define sums, products, etc. this way.

If you go to exponential generating functions, you can also include bags and cycles.
The Comonad.Reader » Generatingfunctorology
comonad.com
January 14, 2025 at 7:51 AM
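The OGF view of ADTs above can be made executable with the standard lazy power-series trick: a series is its list of coefficients, and coefficient n counts the structures of size n. A sketch of the ordinary-generating-function fragment only (bags and cycles would need the exponential side):

```haskell
-- Ordinary generating functions as lazy coefficient lists.
type OGF = [Integer]

one, x :: OGF
one = [1]
x   = [0, 1]

plus :: OGF -> OGF -> OGF
plus (a:as) (b:bs) = (a + b) : plus as bs
plus as     []     = as
plus []     bs     = bs

times :: OGF -> OGF -> OGF
times []     _  = []
times (a:as) bs = plus (map (a *) bs) (0 : times as bs)

-- Multiplication by x, kept as a plain cons so that recursive
-- definitions stay productive.
shift :: OGF -> OGF
shift f = 0 : f

-- Lists: L = 1 + x*L, so exactly one list of each length.
lists :: OGF
lists = one `plus` shift lists

-- Binary trees: T = 1 + x*T^2; the coefficients are the Catalan numbers.
trees :: OGF
trees = one `plus` shift (trees `times` trees)

main :: IO ()
main = print (take 8 trees)  -- [1,1,2,5,14,42,132,429]
```

Sums and products of types line up with `plus` and `times` of their OGFs, which is exactly the correspondence the post describes.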
They clearly named this animal house for John Belusk^Hhi.
November 24, 2024 at 7:02 PM
We retired the evil mangler, so at least that perl isn't touching every chunk of assembly produced by the compiler any more!
November 16, 2024 at 5:55 AM
*@kmett.ai now
November 16, 2024 at 3:22 AM
There are a lot of neat tricks buried in there.
November 15, 2024 at 10:32 PM
I generally agree on the mutable variable advice. That said, Adam Gundry's PhD dissertation really showed what is possible with the alternative:

adam.gundry.co.uk/pub/thesis/

In particular he makes good use of explicit contexts to carefully track issues that arise as types get more dependent.
Type Inference, Haskell and Dependent Types
adam.gundry.co.uk
November 15, 2024 at 10:19 PM
It acted as a forcing function for me to register the domain. @comonad.com and @kmettgroup.com seemed like odd choices of handles, and my other domains are just weird. Too bad I let kmett.com go by accident a decade or two ago.
The Comonad.Reader
comonad.com
November 15, 2024 at 10:12 PM
They wouldn't call it a standing desk if you weren't supposed to stand on it.
a man in a suit is standing in a dark room and saying captain ! my captain !
November 15, 2024 at 8:29 PM
In theory we _should_ be able to come up with some kind of low-cost, DWARF-compatible way to dump this info. It is accessed rarely. The demand-based stack is there, even though it isn't a "C stack". It's just the logical "syntactic" call stack that needs maintenance via something lighter.
November 15, 2024 at 8:20 PM
I don't know how to do this with low performance cost. Do you?

When you turn on enough profiling you get some support out of HasCallStack regardless, but tracing that changes runtime behavior non-trivially and incurs costs.

I complain about this too, but don't have a viable alternative to offer.
November 15, 2024 at 7:38 PM
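For reference, the HasCallStack support mentioned above looks roughly like this: every call through a HasCallStack-constrained function pushes a source location onto the stack, which is where the runtime cost comes from. A minimal sketch:

```haskell
import GHC.Stack (HasCallStack, callStack, prettyCallStack)

-- Render the call stack threaded in by the HasCallStack constraint.
whereAmI :: HasCallStack => String
whereAmI = prettyCallStack callStack

caller :: String
caller = whereAmI  -- this call site is recorded on the stack

main :: IO ()
main = putStrLn caller
```

The stack only records call sites of HasCallStack-constrained functions, so coverage (and cost) grows with how widely you sprinkle the constraint; it is a far cry from a full syntactic call stack.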