Hacker News with Generative AI: Local Computing

Run Deepseek R1 Locally on an iPhone (apple.com)
Pocket: The simplest way to use private LLMs. It works fully offline, ensuring your privacy even without an internet connection.
Ask HN: How much would you pay for local LLMs? (ycombinator.com)
I want to build a private AI setup for my company. I'm thinking of hosting our model locally instead of in the cloud, using a server at the office that my team can access. Has anyone else done this successfully?
Forget ChatGPT: why researchers now run small AIs on their laptops (nature.com)
Artificial-intelligence models are typically used online, but a host of openly available tools is changing that. Here’s how to get started with local AIs.
Ollama now supports tool calling with popular local LLMs (ollama.com)
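Ollama's tool calling uses an OpenAI-style function schema: you pass tool definitions alongside the chat request, and the model may respond with structured tool calls that you execute locally. A minimal sketch, assuming the official `ollama` Python client and a local server; `get_current_weather` and the model name are placeholder assumptions, not part of Ollama itself:

```python
# Sketch of Ollama-style tool calling. `get_current_weather` is a made-up
# example tool; a real one would query an actual weather API.

def get_current_weather(city: str) -> str:
    # Placeholder implementation for illustration only.
    return f"Sunny and 22 degrees C in {city}"

# OpenAI-style tool schema, as accepted by Ollama's chat API.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# Registry mapping tool names the model may emit to local functions.
TOOLS = {"get_current_weather": get_current_weather}

def dispatch(tool_call: dict) -> str:
    """Route a model-issued tool call to the matching local function."""
    fn = TOOLS[tool_call["function"]["name"]]
    return fn(**tool_call["function"]["arguments"])

# With a local Ollama server running, the round trip would look like:
#   import ollama
#   resp = ollama.chat(
#       model="llama3.1",
#       messages=[{"role": "user", "content": "Weather in Paris?"}],
#       tools=[WEATHER_TOOL],
#   )
#   for call in resp["message"].get("tool_calls", []):
#       print(dispatch(call))
```

Keeping tool execution in a plain name-to-function registry means the model only ever selects from functions you explicitly expose.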
Run Google Gemma 2 2B 100% Locally (ycombinator.com)
New FOSS All-Local AI with Ollama, Gemma 2, and Fabric (ycombinator.com)
Ask HN: Which LLMs can run locally on most consumer computers (ycombinator.com)
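Whether a model runs on a typical consumer machine mostly comes down to memory: roughly parameter count times bytes per parameter at the chosen quantization, plus overhead for the KV cache and runtime buffers. A back-of-the-envelope estimator; the 20% overhead factor is a rough assumption, not a measured value:

```python
def estimate_ram_gb(params_billions: float, bits_per_param: float,
                    overhead: float = 1.2) -> float:
    """Rough RAM/VRAM needed to load a model: weight size times a fudge
    factor for KV cache and buffers (the overhead factor is an assumption)."""
    weight_bytes = params_billions * 1e9 * bits_per_param / 8
    return weight_bytes * overhead / 1e9

# Ballpark figures for common local-LLM setups:
# Gemma 2 2B at 4-bit quantization -> ~1.2 GB, fits on nearly any machine.
# A 7B model at 4-bit -> ~4.2 GB, comfortable in 8 GB of RAM.
# A 70B model at 4-bit -> ~42 GB, beyond most consumer hardware.
```

This is why 4-bit quantized models in the 2B-8B range dominate the "runs on most consumer computers" answers: they fit in the 8-16 GB of RAM typical laptops ship with.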