Leo Zhang
leoeleoleo1.bsky.social
PhD student at @oxfordstatistics.bsky.social working on generative models
Pinned
In our new paper (accepted at ICLR), we propose the first framework for constructing equivariant diffusion models via symmetrisation

This lets us ensure E(3)-equivariance for molecular generation using standard, highly scalable architectures such as Diffusion Transformers instead of EGNNs
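A minimal sketch of the symmetrisation idea behind this: wrap an arbitrary (non-equivariant) network f in a group average, f_sym(x) = (1/|G|) Σ_g g⁻¹ f(g·x), which is exactly equivariant under G. This is a toy illustration only, not the paper's construction: the network is a random matrix map, and the finite rotation group C4 (rotations about the z-axis) stands in for E(3).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "network": an arbitrary, non-equivariant map on point clouds of shape (n, 3).
W = rng.normal(size=(3, 3))

def f(x):
    return np.tanh(x @ W)

# Finite rotation group C4: rotations about the z-axis by k * 90 degrees
# (a stand-in for E(3) in this sketch).
def rot_z(k):
    c, s = np.cos(k * np.pi / 2), np.sin(k * np.pi / 2)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

G = [rot_z(k) for k in range(4)]

# Symmetrisation: average g^{-1} f(g x) over the group.
# Row-vector convention: applying g to x is x @ g.T, and since rotations are
# orthogonal, applying g^{-1} to the output is (...) @ g.
def f_sym(x):
    return np.mean([f(x @ g.T) @ g for g in G], axis=0)

# Equivariance check: f_sym(h x) == h f_sym(x) for any h in G,
# even though f itself has no such symmetry.
x = rng.normal(size=(5, 3))
h = rot_z(1)
print(np.allclose(f_sym(x @ h.T), f_sym(x) @ h.T))  # True
```

For a continuous group like E(3) the exact average is replaced by sampling or other tricks, which is where the framework in the paper comes in.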
Reposted by Leo Zhang
A meta-point of this paper is that category theory has utility for reasoning about current problems of interest in mainstream machine learning. The theory is predictive, not just descriptive. 🧵(1/6)
March 6, 2025 at 4:38 AM
March 4, 2025 at 3:31 PM