Viktor Zaverkin
@viktorzaverkin.bsky.social
Research Scientist @ NEC Labs Europe, Ph.D. in Theoretical Chemistry @ SimTech & @unistuttgart.bsky.social, ML/DL for Chemistry & Materials Science
Moreover, simulation results are sensitive to training data composition.

For example, water density predictions depend on whether NaCl-water clusters were included in the training set: compare ICTP-LR(M) vs. ICTP-LR(M)*.

Legend: Solid green - ICTP-LR(M); dashed green - ICTP-LR(M)*.

14/
August 15, 2025 at 8:30 AM
In Trp-cage, simulations with explicit long-range electrostatics exhibit greater conformational variability.

However, the origin of these effects remains unclear without DFT-level simulations.

11/
August 15, 2025 at 8:30 AM
For Crambin, no significant differences are observed for the vibrational spectrum.

10/
August 15, 2025 at 8:30 AM
For Ala3, larger models better reproduce experimental J-couplings.

9/
August 15, 2025 at 8:30 AM
For water and NaCl-water mixtures:

- Larger models don't consistently outperform smaller ones
- Increasing model size doesn't yield systematic convergence
- Explicit electrostatics shifts density predictions from overestimation to underestimation, without consistent gains

8/
August 15, 2025 at 8:30 AM
As expected, benchmark metrics (e.g., energy & force RMSEs) systematically improve with increasing model size and the inclusion of explicit long-range interactions.

6/
August 15, 2025 at 8:30 AM
🚨 New preprint: How well do universal ML potentials perform in biomolecular simulations under realistic conditions?

There's growing excitement around ML potentials trained on large datasets.
But do they deliver in simulations of biomolecular systems?

It’s not so clear. 🧵

1/
August 15, 2025 at 8:30 AM
📈My first PhD paper just reached 100 citations, which is a small but very special milestone for me!

Our paper introduces Gaussian moments as molecular descriptors and uses them to build ML potentials with a strong balance of accuracy and computational efficiency.
June 25, 2025 at 2:42 PM
📣 Can we go beyond state-of-the-art message-passing models based on spherical tensors such as #MACE and #NequIP?

Our #NeurIPS2024 paper explores higher-rank irreducible Cartesian tensors to design equivariant #MLIPs.

Paper: arxiv.org/abs/2405.14253
Code: github.com/nec-research...
December 6, 2024 at 2:45 PM