Alvaro Bartolome
alvarobartt.bsky.social
machine learning @hf.co
Right, the point is that in Rust you end up "refactoring" a lot (at least I do), but it seems easier to handle, whilst in Zig I don't feel it's as easy; not especially complex either, just more cumbersome
January 31, 2025 at 4:05 PM
Check out the DeepSeek-R1 collection on the Hugging Face Hub, with not just DeepSeek-R1 and DeepSeek-R1-Zero, but also smaller models fine-tuned by distilling their reasoning patterns!

huggingface.co/collections/...
DeepSeek-R1 - a deepseek-ai Collection
January 23, 2025 at 1:49 PM
because it's my native language; anyway, it was just an idea, not sure I'll actually do it 🤗
November 28, 2024 at 8:42 AM
awesome 🤗
November 20, 2024 at 9:40 AM
how do I get in there? 🤗
November 20, 2024 at 9:39 AM
Read more about the Serverless Inference API in the documentation!

https://huggingface.co/docs/api-inference
November 19, 2024 at 4:15 PM
🔥 Finally, if you want to get started quickly and experiment with LLMs, feel free to give the recently released Inference Playground a try!

https://huggingface.co/playground
November 19, 2024 at 4:15 PM
👨‍💻 Alternatively, you can also use the Serverless Inference API programmatically via cURL, the huggingface_hub Python SDK, the openai SDK for chat completions, and more!

Find all the alternatives at https://huggingface.co/docs/api-inference
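As a rough sketch of the cURL-style route, here is what an authenticated request to the Serverless Inference API can look like using only Python's standard library (the model name is a placeholder, and the endpoint shape is an assumption based on the public docs):

```python
# Sketch: calling the Serverless Inference API over plain HTTP (the cURL route),
# using only the Python standard library. Model name is a placeholder.
import json
import os
import urllib.request

API_URL = "https://api-inference.huggingface.co/models/{model}"


def build_request(model: str, prompt: str, token: str) -> urllib.request.Request:
    """Assemble an authenticated POST request for the Serverless Inference API."""
    payload = json.dumps({"inputs": prompt}).encode("utf-8")
    return urllib.request.Request(
        API_URL.format(model=model),
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",  # fine-grained token, see earlier post
            "Content-Type": "application/json",
        },
        method="POST",
    )


if __name__ == "__main__":
    token = os.environ.get("HF_TOKEN")
    if token:  # only issue the request when a token is actually configured
        req = build_request("meta-llama/Llama-3.1-8B-Instruct", "Hello!", token)
        with urllib.request.urlopen(req) as resp:
            print(json.loads(resp.read()))
```

The huggingface_hub and openai SDKs wrap this same request for you; the docs linked above cover those variants.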
November 19, 2024 at 4:15 PM
🔎 Now let's explore some of the different ways to run inference via the Serverless API!

The most straightforward one is via the Hugging Face Hub, directly on the model card of the models supported by the Serverless API!
November 19, 2024 at 4:15 PM
🔒 Before going on, you will first need to generate a Hugging Face fine-grained token with access to the Serverless API, as requests need to be authenticated. Keep the token safe and avoid exposing it!
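One simple way to keep the token out of your code is to read it from an environment variable; a minimal sketch (the `HF_TOKEN` variable name is just a convention, not a requirement):

```python
# Sketch: loading the fine-grained token from the environment
# instead of hardcoding it in source files.
import os


def get_hf_token() -> str:
    """Return the Hugging Face token, e.g. set via `export HF_TOKEN=hf_xxx`."""
    token = os.environ.get("HF_TOKEN")
    if token is None:
        raise RuntimeError("Set the HF_TOKEN environment variable first")
    return token
```

Anything that never lands in source control (environment variables, a local .env file, a secrets manager) works; hardcoded strings do not.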
November 19, 2024 at 4:15 PM