Leo Zhang
@leoeleoleo1.bsky.social
PhD student at @oxfordstatistics.bsky.social working on generative models
In addition, as a consequence of the flexibility afforded by SymDiff, we report much lower computational costs than message-passing-based EGNNs
(8/9)
March 4, 2025 at 3:31 PM
We show substantial improvements over EDM on QM9 and GEOM-Drugs, and competitive performance with much more sophisticated recent baselines (all of which use intrinsically equivariant architectures)
March 4, 2025 at 3:31 PM
This is where recent work from Cornish (2024) (arxiv.org/abs/2406.11814) comes in. It generalises previous symmetrisation approaches and extends them to the stochastic case using category-theoretic arguments, under the name of "stochastic symmetrisation"
March 4, 2025 at 3:31 PM
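To make the idea concrete, here is a minimal Python sketch of symmetrisation with a random group element, for rotations acting on point clouds. The helper names (`haar_rotation`, `symmetrised`) are illustrative, not from the paper; the sketch shows only the core g · f(g⁻¹ · x) construction, which is rotation-equivariant in distribution even when f itself has no symmetry at all.

```python
import torch

def haar_rotation() -> torch.Tensor:
    """Haar-uniform sample from SO(3) via QR of a Gaussian matrix."""
    a = torch.randn(3, 3)
    q, r = torch.linalg.qr(a)
    q = q * torch.sign(torch.diagonal(r))  # fix the QR gauge so q is Haar on O(3)
    if torch.det(q) < 0:                   # flip one axis to land in SO(3)
        q[:, 0] = -q[:, 0]
    return q

def symmetrised(f, x: torch.Tensor) -> torch.Tensor:
    """Apply g . f(g^{-1} . x) with g ~ Haar on SO(3).

    x has shape (n, 3) with points as rows, so g acts as x @ g.T
    and g^{-1} acts as x @ g.  Because g is Haar-distributed, the
    law of the output co-rotates with the input.
    """
    g = haar_rotation()
    return f(x @ g) @ g.T
```

One way to sanity-check this: for any fixed rotation R, the Monte Carlo average of `symmetrised(f, x @ R.T)` over many draws should match the average of `symmetrised(f, x) @ R.T`.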
As an alternative, people have proposed canonicalisation, frame averaging, and probabilistic symmetrisation, which make arbitrary networks equivariant through a learned group averaging
March 4, 2025 at 3:31 PM
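For contrast with the stochastic version above, here is a minimal sketch of the canonicalisation variant, assuming a PCA-based canonical frame for point clouds. The names `canonical_frame` and `canonicalised` are illustrative, and the sign and degenerate-spectrum edge cases of the eigendecomposition, which a practical scheme must break with care, are ignored.

```python
import torch

def canonical_frame(x: torch.Tensor) -> torch.Tensor:
    """Rotation built from the principal axes of x (shape (n, 3)).

    Up to eigenvector signs and degeneracies it satisfies
    g(x @ R.T) = R @ g(x), which is what canonicalisation needs.
    """
    xc = x - x.mean(dim=0, keepdim=True)
    cov = xc.T @ xc / x.shape[0]
    _, vecs = torch.linalg.eigh(cov)   # columns are eigenvectors
    if torch.det(vecs) < 0:            # keep a proper rotation
        vecs[:, 0] = -vecs[:, 0]
    return vecs

def canonicalised(f, x: torch.Tensor) -> torch.Tensor:
    """g(x) . f(g(x)^{-1} . x): equivariant because the frame co-rotates."""
    g = canonical_frame(x)
    return f(x @ g) @ g.T
```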
In our new paper (accepted at ICLR), we propose SymDiff, the first framework for constructing equivariant diffusion models via symmetrisation

This lets us ensure E(3)-equivariance with standard, highly scalable architectures such as Diffusion Transformers, instead of EGNNs, for molecular generation
March 4, 2025 at 3:31 PM
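A hedged sketch of how the overall recipe might look for one denoising step: centre the coordinates to handle translations (the standard trick in E(3) molecular diffusion), then symmetrise an arbitrary backbone with a fresh random rotation, reusing `haar_rotation` from the earlier sketch. Here `backbone` stands in for any coordinate-to-coordinate network such as a Diffusion Transformer; this illustrates the general idea, not the paper's exact construction.

```python
import torch

def equivariant_denoise_step(backbone, x_t: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
    """One symmetrised denoising step for coordinates x_t of shape (n, 3)."""
    x_t = x_t - x_t.mean(dim=0, keepdim=True)   # zero centre of mass: translations
    g = haar_rotation()                         # fresh Haar rotation each step
    eps = backbone(x_t @ g, t) @ g.T            # g . backbone(g^{-1} . x_t)
    return eps - eps.mean(dim=0, keepdim=True)  # stay in the zero-CoM subspace
```

Because the backbone never needs built-in symmetry, any off-the-shelf architecture can be dropped in; the only equivariance machinery is the sampled rotation, which costs a pair of 3×3 matrix multiplications rather than equivariant message passing.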