Diego del Alamo
@delalamo.xyz
Computational protein engineering & synthetic biochemistry at Takeda
Opinions my own
https://linktr.ee/ddelalamo
Anyway, if anyone in academia has some spare time & compute, I'd love to see whether the MD-trained NNs are any better than the vanilla models at multi-state prediction, as it wasn't tested in the paper (the model weights have a non-commercial license, so they're off-limits to me)
November 17, 2025 at 3:22 PM
Yeah somewhere in the supplement
November 16, 2025 at 10:02 PM
Oh awesome, do you have any ablations showing the contribution of this to refoldability? Chroma showed nearly 2x improvement with the random edges vs standard nearest neighbors
November 16, 2025 at 10:02 PM
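For context on the comparison above: the idea is to augment a standard k-nearest-neighbor graph over residues with a handful of random long-range edges. A toy numpy sketch (the function name, the uniform sampling of far partners, and all parameters are mine for illustration; Chroma's actual scheme biases random edges by distance rather than sampling uniformly):

```python
import numpy as np

def build_edges(coords, k=8, n_random=4, seed=0):
    """Toy graph construction: k nearest neighbors per node, plus a few
    uniformly random long-range edges (illustrative stand-in for
    Chroma-style random edges)."""
    rng = np.random.default_rng(seed)
    n = len(coords)
    # pairwise Euclidean distances; inf on the diagonal so a node
    # never selects itself as a neighbor
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    edges = set()
    for i in range(n):
        order = np.argsort(d[i])
        # the k nearest neighbors
        for j in order[:k]:
            edges.add((i, int(j)))
        # random long-range partners, drawn uniformly from the
        # remaining nodes (excluding self, which sorts last due to inf)
        far = order[k:n - 1]
        for j in rng.choice(far, size=min(n_random, len(far)), replace=False):
            edges.add((i, int(j)))
    return edges
```

Each node ends up with `k` local edges plus `n_random` long-range ones, which is the structural contrast being ablated against a plain nearest-neighbor graph.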
Another one - PDB 8F2X chain H (3.5 Å resolution, shown in gold). Totally different authors. Seems like an aromatic immediately before the conserved J-gene tryptophan throws off assignment in this area; 7CZV has a phenylalanine before the tryptophan
October 9, 2025 at 12:51 PM
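The failure mode described above is easy to screen for. A toy sketch (the function name is mine, the `WG.G` regex is a rough stand-in for the heavy-chain J-region W-G-X-G motif, and this ignores numbering schemes entirely):

```python
import re

AROMATIC = set("FWYH")  # counting His as aromatic here

def aromatic_before_j_trp(vh_seq):
    """Locate the last W-G-x-G motif in a VH sequence (the conserved
    J-gene tryptophan region) and report whether the residue immediately
    before that W is aromatic -- the pattern the posts above suggest
    can confuse numbering tools. Returns None if no motif is found."""
    matches = list(re.finditer(r"WG.G", vh_seq))
    if not matches:
        return None
    w_idx = matches[-1].start()
    return w_idx > 0 and vh_seq[w_idx - 1] in AROMATIC
```

Running this over a structure database would flag candidate chains like 8F2X's for manual inspection.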
The worst offender I've found so far is PDB 5OMM chain C on SAbDab, which has many unassigned residues despite being 1.7 Å resolution. For example, its C-terminal VTVSS starts at IMGT pos 115 instead of pos 124. Probably AbNum/ANARCI struggling w/ the gaps

opig.stats.ox.ac.uk/webapps/sabd...
October 4, 2025 at 10:28 PM
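As a concrete illustration of the 115-vs-124 symptom: in the IMGT scheme the C-terminal VTVSS of a VH domain should sit at positions 124-128. A toy checker over (position, residue) pairs, e.g. as parsed from ANARCI output (the input format and function name are assumptions for illustration):

```python
def check_vtvss_positions(numbering):
    """Sanity-check an IMGT-numbered VH domain: the C-terminal VTVSS
    should occupy IMGT positions 124-128. `numbering` is a list of
    (imgt_position, residue) pairs; '-' marks unoccupied positions."""
    # keep only occupied positions, then look at the last five residues
    tail = [(pos, aa) for pos, aa in numbering if aa != "-"][-5:]
    residues = "".join(aa for _, aa in tail)
    positions = [pos for pos, _ in tail]
    ok = residues == "VTVSS" and positions == [124, 125, 126, 127, 128]
    return ok, positions
```

A chain like 5OMM chain C, with VTVSS starting at pos 115, would fail this check and surface the mis-assignment.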
Reposted by Diego del Alamo
It's a very cool result but IMO there are caveats. Inference is (mostly) slower. There is existing work on faster models (e.g. MiniFold or protenix mini), and also existing work on ensemble prediction. I doubt this works without training on AFDB, which bakes in inductive bias from triangle layers
September 26, 2025 at 6:00 PM