Matthias Kellner
@matthiaskellner.bsky.social
PhD student in @labcosmo.bsky.social
What a cool applet - running a universal MLIP directly from your web browser!
November 30, 2025 at 7:12 PM
Reposted by Matthias Kellner
📢 PET-MAD is here! 📢 It has been around for a while for those who read the #arXiv, but now you get it preciously 💸 typeset by @natcomms.nature.com. Take home: an unconstrained architecture + good training set choices give you a fast, accurate and stable universal MLIP that just works™️ www.nature.com/articles/s41...
PET-MAD as a lightweight universal interatomic potential for advanced materials modeling - Nature Communications
PET-MAD is a fast and lightweight universal machine-learning potential, trained on a small but diverse dataset, that delivers near-quantum accuracy in atomistic simulations for both organic and inorga...
www.nature.com
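For anyone who wants to try it, here is a minimal sketch of driving PET-MAD as an ASE calculator. The `PETMADCalculator` import path and its `version`/`device` arguments are assumptions based on the pet-mad repository and may differ from the released API, so check its README before copying this verbatim.

```python
# Minimal sketch: single-point energy and a quick relaxation with PET-MAD via ASE.
# NOTE: the import path and constructor arguments are assumptions; check the
# pet-mad documentation for the exact API.
from ase.build import bulk
from ase.optimize import BFGS
from pet_mad.calculator import PETMADCalculator  # assumed import path

atoms = bulk("Si", "diamond", a=5.43)
atoms.calc = PETMADCalculator(version="latest", device="cpu")  # assumed signature

print("Energy (eV):", atoms.get_potential_energy())
BFGS(atoms).run(fmax=0.05)  # standard ASE geometry optimization
```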
November 28, 2025 at 8:36 AM
Reposted by Matthias Kellner
Anticipating 🧑‍🚀 Wei Bin's talk at #psik2025 (noon, room A), 📢 a new #preprint using PET and the MAD dataset to train a universal #ml model for the density of states, giving band gaps for solids, clusters, surfaces and molecules with an MAE of ~200 meV. Go to the talk, or check out arxiv.org/html/2508.17...!
August 28, 2025 at 7:19 AM
We're introducing ShiftML3, a new ShiftML model for chemical shielding predictions in organic solids.

* ShiftML3 predicts full chemical shielding tensors
* DFT accuracy for 1H, 13C, and 15N
* ASE integration
* GPU support

Code: github.com/lab-cosmo/Sh...
Install from PyPI: pip install shiftml (minimal usage sketch below)
GitHub - lab-cosmo/shiftml: A python package for the prediction of chemical shieldings of organic solids and beyond.
A python package for the prediction of chemical shieldings of organic solids and beyond. - lab-cosmo/shiftml
github.com
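A minimal sketch of what the ASE integration looks like. The `ShiftML` class, the `"ShiftML3"` model tag, and the `get_cs_iso`/`get_cs_tensor` methods are assumptions extrapolated from earlier ShiftML releases, so check the repository README for the exact API.

```python
# Minimal sketch of the ASE workflow (class, model tag and method names are
# assumptions; see the shiftml README for the released API).
from ase.io import read
from shiftml.ase import ShiftML  # assumed import path

model = ShiftML("ShiftML3")            # assumed model identifier
frame = read("crystal_structure.cif")  # placeholder: any ASE-readable organic solid

cs_iso = model.get_cs_iso(frame)         # isotropic shieldings, one per atom
cs_tensor = model.get_cs_tensor(frame)   # full 3x3 shielding tensors (new in ShiftML3)
print(cs_iso, cs_tensor)
```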
August 25, 2025 at 8:53 AM
Reposted by Matthias Kellner
🚨 #machinelearning for #compchem goodies from our 🧑‍🚀 team incoming! After years of work it's time to share. Go check arxiv.org/abs/2508.15704 and/or metatensor.org to learn about #metatensor and #metatomic. What they are, what they do, why you should use them for all of your atomistic ML projects 🔍.
August 22, 2025 at 7:40 AM
Reposted by Matthias Kellner
🎉 DFT-accurate, with built-in uncertainty quantification, providing chemical shielding anisotropy - ShiftML3.0 has it all! Building on a successful @nccr-marvel.bsky.social-funded collaboration with LRM🧲⚛️, it just landed on the arXiv arxiv.org/html/2506.13... and on pypi pypi.org/project/shif...
June 17, 2025 at 1:18 PM
Reposted by Matthias Kellner
When you combine #machinelearning and #compchem, you need to start worrying about the QM details within your ML architecture. We use our indirect Hamiltonian framework and pySCFAD to explore the enormous design space arxiv.org/abs/2504.01187
April 3, 2025 at 9:30 PM
Reposted by Matthias Kellner
📢 PET-MAD has just landed! 📢 What if I told you that you can match & improve on the accuracy of other "universal" #machinelearning potentials while training on fewer than 100k atomic structures? And be *faster* with an unconstrained architecture that is conservative, with only tiny symmetry breaking? Sounds like 🧑‍🚀
March 19, 2025 at 7:23 AM
Reposted by Matthias Kellner
Happy to share a new #cookbook recipe that showcases several new software developments in the lab, using the good ol' QTIP4P/f water model as an example. atomistic-cookbook.org/examples/wat.... TL;DR - you can now build torch-based interatomic potentials, export them and use them wherever you like!
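The underlying idea, stripped of the metatensor/metatomic specifics: build a torch module, compile it, save it, reload it anywhere. This is plain TorchScript, not the library's actual export interface, so treat the class and file names as illustrative only; see the cookbook recipe for the real workflow.

```python
# Generic sketch of "build a torch-based potential, export it, use it anywhere":
# a toy pair potential scripted with TorchScript. This is NOT the metatensor/
# metatomic export path, just an illustration of the general mechanism.
import torch

class ToyPairPotential(torch.nn.Module):
    """Toy Lennard-Jones-like energy computed from a list of interatomic distances."""
    def forward(self, distances: torch.Tensor) -> torch.Tensor:
        inv6 = distances.pow(-6)
        return torch.sum(4.0 * (inv6 * inv6 - inv6))

model = torch.jit.script(ToyPairPotential())   # compile to TorchScript
model.save("toy_potential.pt")                 # exported; loadable from Python or C++

reloaded = torch.jit.load("toy_potential.pt")
print(reloaded(torch.tensor([1.1, 1.5, 2.0])))
```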
February 28, 2025 at 12:58 PM
Feeling a bit lonely here ...
November 15, 2024 at 11:39 AM