We also want to give a shout-out to [Biotite](https://www.biotite-python.org/latest/), which is the bedrock of our approach. Biotite has made our framework vastly more performant and flexible. We're excited for what's next!
Frank DiMaio both directed and carried the team - he deserves the most recognition
It's been the pleasure of my life to work alongside @simonmathis.bsky.social, @rkrishna3.bsky.social, @kinasekid.bsky.social, and so many other unbelievably talented individuals on this project. And extra credit to @kdidi.bsky.social for jumping into the frenzy to bring this work across the line
There's no reason every researcher in the BioML space should reinvent the wheel every time they train a structure-based model. The complexity of loading and annotating biomolecular data for machine learning applications should be done once, and done right — that was our goal with AtomWorks
We're also thrilled to release AtomWorks, which we used as the foundation for not only RF3, but also RF2AA, LigandMPNN, ProteinMPNN, and a design model — all by just swapping out a handful of modular components we call Transforms (just like Torchdata, for those familiar)
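For those curious what swapping Transforms looks like in practice, here's a minimal sketch of a compose-able Transform pipeline in the TorchData spirit described above. All class, function, and field names here are illustrative assumptions, not the actual AtomWorks API:

```python
# Hypothetical sketch of compose-able Transforms; names are illustrative,
# not the real AtomWorks API.
from typing import Callable

# A Transform is just a function mapping one example dict to another.
Transform = Callable[[dict], dict]

def compose(*transforms: Transform) -> Transform:
    """Chain transforms left-to-right into a single pipeline."""
    def pipeline(example: dict) -> dict:
        for t in transforms:
            example = t(example)
        return example
    return pipeline

# Two toy transforms: one strips hydrogens, one annotates chain count.
def drop_hydrogens(example: dict) -> dict:
    keep = [i for i, el in enumerate(example["elements"]) if el != "H"]
    example["elements"] = [example["elements"][i] for i in keep]
    example["chain_ids"] = [example["chain_ids"][i] for i in keep]
    return example

def annotate_num_chains(example: dict) -> dict:
    example["num_chains"] = len(set(example["chain_ids"]))
    return example

# Retargeting the pipeline to a different model = swapping transforms here.
pipeline = compose(drop_hydrogens, annotate_num_chains)
out = pipeline({"elements": ["N", "C", "H", "O"],
                "chain_ids": ["A", "A", "A", "B"]})
# out["elements"] == ["N", "C", "O"]; out["num_chains"] == 2
```

The appeal of this design is that each model's data pipeline becomes a declarative list of small, testable pieces rather than a monolithic loader.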
We're thrilled to share RF3 fully open-source with the community — it handles chirality better than any other model, it supports arbitrary atomic templating (which we included at train time), and it narrows the open-source gap to AF3