#LocalLlm
Folks on Reddit are fine-tuning DeepSeek OCR for Persian www.reddit.com/r/LocalLLM/c... — doesn't seem to be an out-of-the-box functional type of thing, though.
November 7, 2025 at 3:09 AM
#ai #llm #localllama #localllm #grownostr #nostr #gfy
Not running your own locally hosted LLM? You are giving your thoughts and ideas away to corporations/governments, who would pay to know what and how you think.
November 6, 2025 at 10:53 PM
Was expecting more of an unboxing experience from the #Nvidia RTX PRO 6000 Blackwell Max-Q, considering the price. But really, I'm just interested in the 96GB VRAM! Unfortunately, I'll need one more #AI/#ML client before I can afford to throw a second one in my new #Lenovo P8. #LocalLLM #GameDev
October 27, 2025 at 6:54 PM
Here is the generated social media post:

Local LLM update! We've deployed Llama 3.1 locally via Ollama, reducing API reliance & costs. More control over data, lower expenses. This means better performance and lower bills for you! #localLLM #costsavings
October 24, 2025 at 4:20 PM
Beyond aggregation, users want enhanced search and integration with local LLMs for personalized insights. Imagine asking your timeline questions and getting smart, private answers based on *your* data. The future is personal AI. #LocalLLM 5/6
October 8, 2025 at 4:00 PM
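The "ask your timeline" idea above boils down to retrieval over your own posts plus a local model. A minimal sketch in Python, assuming invented example posts and a made-up `retrieve` helper (a real setup would hand the top hits to a local LLM as context instead of just printing them):

```python
import re

def retrieve(question: str, posts: list[str], k: int = 2) -> list[str]:
    """Rank saved posts by word overlap with the question, return the top k."""
    q_words = set(re.findall(r"\w+", question.lower()))
    scored = sorted(
        posts,
        key=lambda p: len(q_words & set(re.findall(r"\w+", p.lower()))),
        reverse=True,
    )
    return scored[:k]

# Illustrative timeline, not real data:
posts = [
    "Tried Qwen3-4B locally, surprisingly good for its size",
    "New coffee grinder arrived today",
    "RTX 3090 is still the local LLM sweet spot",
]

# The best-matching posts would be passed to a local model as context.
print(retrieve("which GPU for local LLM?", posts))
```

Even this naive keyword overlap keeps everything on-device; swapping in local embeddings would improve ranking without changing the shape of the pipeline.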
Anyway, long story short lol: local models aren't there yet, but DAMN they're progressing fast. Especially if you have access to an Apple silicon machine with a good amount of RAM, I'd use that as a local LLM server. I'm trying to get a used 32GB M4 Mac mini to achieve that.
September 28, 2025 at 4:34 PM
8/ Interested In Open Source AI?

Bookmark my account (a.k.a. "follow me") or keep an eye on my posts.

I'm writing a book series on offline AI (a.k.a. local AI; relevant: #LocalLLM) in which I introduce an open source tool that combines lessons learned with rigor into a simplified user experience.
September 24, 2025 at 4:48 AM
itsfoss.com/android-on-d...

Running local LLMs on an Android, because: reasons.

Some of the apps discussed:
- MLC Chat
- SmolChat
- Google AI Edge Gallery

Also the specific LLMs tested:
- Google’s Gemma
- Meta’s Llama
- Microsoft’s Phi-3
- Qwen
- TinyLlama

#AI #LLM #LocalLLM #Android
I Ran Local LLMs on My Android Phone
I love the idea of having an AI assistant that works exclusively for me, no monthly fees, no data leaks. I won’t lie, it’s not perfect. Here's my experience.
itsfoss.com
September 15, 2025 at 7:11 PM
RTX 3090s are the hobbyist LocalLLM sweet spot right now. I wonder how much this is affecting prices in the used market?
September 14, 2025 at 1:13 AM
What all-in-one AI desktop app are you using? I want something that works with local LLMs (Llama, Qwen, Mistral, SD, Flux, etc. on my networked Linux machine) and online (ChatGPT, Gemini, Claude). I want to experiment, train, and keep some chats local.

What are you using?
#localLLM #opensource #ai
September 13, 2025 at 5:59 PM
Anyway, Qwen3-4B-Instruct-2507-Q8_0 (without "reasoning") is actually okay (for its size, relatively speaking). By its own admission, it is not "a live verification system". #LLM #interview #genAI #localLLM #review #ontology

September 10, 2025 at 6:01 PM
Learning the history of local LLMs and considering their potential https://qiita.com/yo_arai/items/f10fc552200f4abfe9d5 #qiita #OpenAI #生成AI #LLM #LocalLLM

September 3, 2025 at 12:41 PM
Bring your own brain? Why local LLMs are taking off www.theregister.com/2025/08/31/l... by Danny Bradbury

#localLLM #genAI #dataretention #techsovereignty #techpolicy
September 2, 2025 at 5:44 PM
The best place to follow progress and contribute is here: www.reddit.com/r/LocalLLM/
August 30, 2025 at 10:42 PM
Local LLMs with OpenAI-compliant tool-calling: a course

9 chapters. 3 weeks. (8/11-8/29)

Chapter 9: Interactive UI - Visualizing Model Responses

github.com/jrbrowning/o...

#LocalLLM #OpenSource #ToolCalling #OllamaToolsAPI #OAIToolsUI #FastAPI
August 29, 2025 at 9:48 PM
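For readers new to the topic of that course, OpenAI-compliant tool calling means describing your functions as JSON schema, letting the model return a structured tool call, and dispatching it yourself. A minimal sketch, assuming an invented `get_weather` tool with a stubbed result; a real client would send this schema to a local OpenAI-compatible endpoint (such as Ollama's) and dispatch whatever tool call the model returns:

```python
import json

# Tool schema in the OpenAI "tools" format: the model sees this and may
# respond with a matching tool call instead of plain text.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def dispatch(tool_call: dict) -> str:
    """Route a model-returned tool call to the matching local function."""
    args = json.loads(tool_call["function"]["arguments"])
    if tool_call["function"]["name"] == "get_weather":
        return f"Sunny in {args['city']}"  # stubbed result for the sketch
    raise ValueError("unknown tool")

# Simulate the model's structured response rather than hitting a live server:
fake_call = {"function": {"name": "get_weather", "arguments": '{"city": "Oslo"}'}}
print(dispatch(fake_call))  # -> Sunny in Oslo
```

The point of the format is that the dispatch loop stays identical whether the model behind it is a hosted API or a local one, which is what makes "OpenAI-compliant" local servers interchangeable.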
🦙📱 Build a mobile app in Blazor for your Local LLM! No cloud. No subscriptions. Just code.
🎥 Tutorial → youtu.be/5wWswrwYkUo

#Blazor #AI #LocalLLM #DotNet
Build Mobile App for your Local LLM in Blazor
YouTube video by Hassan Habib
youtu.be
August 29, 2025 at 2:46 PM