- replicate.com/wan-video/wa...
- replicate.com/wan-video/wa...
- replicate.com/prunaai/vace...
- replicate.com/prunaai/wan-...
- replicate.com/qwen/qwen-im...
- replicate.com/prunaai/hidr...
- replicate.com/prunaai/hidr...
- replicate.com/prunaai/hidr...
- replicate.com/prunaai/hidr...
- replicate.com/prunaai/flux...
- replicate.com/prunaai/flux...
- replicate.com/prunaai/flux...
Paper: arxiv.org/pdf/2402.15978
Code: github.com/fortuinlab/s...
- We train sparsifiable models via the Marginal Likelihood to make pruning easy after training ✅
- We construct parameter- & unit-wise priors that fit unstructured and structured pruning ✅
- We propose the cheap Optimal Posterior Damage score, which outperforms other pruning scores ✅ (rough sketch below)
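For intuition, here is a minimal sketch of posterior-based unstructured pruning, assuming a diagonal Gaussian posterior per weight. The saliency is a generic OBD-style damage estimate, not the exact Optimal Posterior Damage score from the paper; the names `posterior_mean`, `posterior_var`, and `sparsity` are illustrative.

```python
# Minimal sketch, NOT the paper's exact Optimal Posterior Damage score.
# Assumption: each weight has a diagonal Gaussian posterior (mean, variance),
# and we estimate the loss increase from zeroing a weight OBD-style,
# using the posterior precision 1/var as a diagonal-Hessian proxy.
import torch

def saliency(posterior_mean: torch.Tensor, posterior_var: torch.Tensor) -> torch.Tensor:
    # Estimated damage of setting weight i to zero: 0.5 * mu_i^2 / sigma_i^2
    return 0.5 * posterior_mean.pow(2) / posterior_var.clamp_min(1e-12)

def unstructured_prune(posterior_mean, posterior_var, sparsity=0.5):
    # Zero out the `sparsity` fraction of weights with the lowest saliency.
    s = saliency(posterior_mean, posterior_var)
    k = max(1, int(sparsity * s.numel()))
    threshold = s.flatten().kthvalue(k).values
    mask = (s > threshold).to(posterior_mean.dtype)
    return posterior_mean * mask, mask

# Toy usage: prune half the weights of a random 4x8 layer.
mu = torch.randn(4, 8)
var = torch.rand(4, 8) * 0.1 + 1e-3
pruned_w, mask = unstructured_prune(mu, var, sparsity=0.5)
print(f"kept {int(mask.sum())} of {mask.numel()} weights")
```

A structured (unit-wise) variant would aggregate the same per-weight scores over a row or channel before thresholding, pruning whole units at once.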
@bertrand-sharp.bsky.social, and Stephan Günnemann
Paper: openreview.net/pdf/b84db14f...
Overview: www.cs.cit.tum.de/daml/expecte...
4/4
- "Differentiable DAG Sampling", ICML 2022
- "End-to-end Learning of Probabilistic Hierarchies on Graphs", ICLR 2021
- "Tree Sampling Divergence", IJCAI 2019
- "Hierarchical graph clustering using node pair sampling" MLG, KDD 2018
3/4
- "Differentiable DAG Sampling", ICML 2022
- "End-to-end Learning of Probabilistic Hierarchies on Graphs", ICLR 2021
- "Tree Sampling Divergence", IJCAI 2019
- "Hierarchical graph clustering using node pair sampling" MLG, KDD 2018
3/4
- We optimize expected objectives over a probabilistic model on hierarchies. ✅
- We differentiably sample any hierarchy (not only binary ones!). ✅ (illustrative sketch below)
- We show theoretical consistency and achieve SOTA results for hierarchical learning. ✅
2/4
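For intuition, here is a minimal sketch of differentiable, non-binary hierarchy sampling via a Gumbel-softmax over parent choices. It illustrates the general idea only and is not the paper's construction; `logits`, `temperature`, and the fixed node ordering are assumptions made for this example.

```python
# Minimal sketch of differentiable hierarchy sampling, assuming a simple
# Gumbel-softmax relaxation over "which earlier node is my parent" scores.
# Illustration of the general idea only, not the construction in the paper.
import torch
import torch.nn.functional as F

def sample_soft_parents(logits: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
    """logits: (n, n); entry (i, j) scores node j as the parent of node i.
    Restricting node i's parents to nodes j < i (a fixed order) keeps the
    sampled structure acyclic and rooted at node 0, with arbitrary branching
    factor, i.e. not only binary hierarchies."""
    n = logits.size(0)
    mask = torch.full((n, n), float("-inf"))
    lower = torch.tril_indices(n, n, offset=-1)
    mask[lower[0], lower[1]] = 0.0          # only earlier nodes may be parents
    masked = (logits + mask)[1:]            # node 0 is the root, needs no parent
    # Gumbel-softmax: a differentiable, approximately one-hot parent choice per node.
    return F.gumbel_softmax(masked, tau=temperature, dim=-1)

# Toy usage: 5 nodes with learnable parent scores; gradients flow to the scores.
logits = torch.randn(5, 5, requires_grad=True)
parents = sample_soft_parents(logits)       # shape (4, 5): rows are nodes 1..4
cost = torch.rand(4, 5)                     # any downstream objective on edges
(parents * cost).sum().backward()
print(logits.grad.shape)
```

With `hard=True`, Gumbel-softmax returns discrete one-hot parents in the forward pass while keeping a straight-through gradient, so the same sketch also yields proper discrete hierarchies during training.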
Our journey continues as we grow our team of 𝘁𝗼𝗽 𝘁𝗮𝗹𝗲𝗻𝘁𝘀 𝗮𝗰𝗿𝗼𝘀𝘀 𝗘𝘂𝗿𝗼𝗽𝗲 🇪🇺, with hubs in Germany and France 🇩🇪🤝🇫🇷. Together, we’re enabling cutting-edge AI compression and deployment without compromising real-world performance.
What started as an idea during my PhD became an incredible journey. Bringing together my brilliant co-founders, S. Günnemann, J. Rachwan, and R. Nait Mazi, was a pivotal step for Pruna AI. In one year, we smashed thousands of AI models, making them 𝗰𝗵𝗲𝗮𝗽𝗲𝗿💲, 𝗳𝗮𝘀𝘁𝗲𝗿⚡, 𝘀𝗺𝗮𝗹𝗹𝗲𝗿💾, and 𝗴𝗿𝗲𝗲𝗻𝗲𝗿🌳.