There's growing excitement around ML potentials trained on large datasets.
But do they deliver in simulations of biomolecular systems?
It’s not so clear. 🧵
1/
Let’s chat about ML potentials, uncertainty quantification (ensemble-free, gradient-based: Laplace approx., NTKs, batch selection, …), uncertainty-biased MD, message-passing architectures, particle-mesh long-range methods, etc.
Our paper introduces Gaussian moments as molecular descriptors and uses them to build ML potentials with an impressive balance between accuracy and computational efficiency.
The application portal for the English-conducted M.Sc. Chemical Sciences at @unistuttgart.bsky.social is officially open! 🔬✨
👉 Visit our program website for further details: www.uni-stuttgart.de/en/study/stu...
In our NeurIPS 2024 paper, we introduce RealMLP, an NN with improvements across the board and meta-learned default parameters.
Some insights about RealMLP and other models on large benchmarks (>200 datasets): 🧵
Higher-Rank Irreducible Cartesian Tensors for Equivariant Message Passing
🗓️ When: Wed, Dec 11, 11 a.m. – 2 p.m. PST
📍 Where: East Exhibit Hall A-C, Poster #4107
#MachineLearning #InteratomicPotentials #Equivariance #GraphNeuralNetworks
Our #NeurIPS2024 paper explores higher-rank irreducible Cartesian tensors to design equivariant #MLIPs.
Paper: arxiv.org/abs/2405.14253
Code: github.com/nec-research...
it's so true and hits so hard:
Folks from deep learning (I am guilty too): super excited about methods, modelling, make it look mathy, our number is bold in the table that everyone re-uses for 5 years...
Downstream users of tools in drug discovery: Oh god - is that how you evaluated?
*Surprised Pikachu meme*