The manuscript itself is also restructured. Figs 2 and 4 are swapped, there's a 5th fig for the K562 analysis, and we reworked the Discussion.
Apologies if threading isn't the way to go on Bsky. 🧬🔄
8/8
These results show our model learns the context that distinguishes functionally non-equivalent motifs.
7/
Along with our other results, this shows active learning generates the data needed to learn regulatory grammars.
6/
5/
4/
This demonstrates that active learning is broadly effective and illustrates that enriching for active sequences is more informative.
3/
2/