Hacker News with Generative AI: Local Computing

AMD launches Gaia open source project for running LLMs locally on any PC (tomshardware.com)
Local Deep Research – ArXiv, wiki and other searches included (github.com/LearningCircuit)
An AI research assistant that performs deep, iterative analysis using multiple LLMs and web searches. The system can be run locally for privacy or configured to use cloud-based LLMs for enhanced capabilities.
Run Deepseek R1 Locally on an iPhone (apple.com)
Pocket: The easiest way to use private LLMs. It works fully offline, preserving your privacy even without an internet connection.
Ask HN: How much would you pay for local LLMs? (ycombinator.com)
I want to build a private AI setup for my company. I'm thinking of hosting our model locally instead of in the cloud, using a server at the office that my team can access. Has anyone else done this and had success with it?
Forget ChatGPT: why researchers now run small AIs on their laptops (nature.com)
Artificial-intelligence models are typically used online, but a host of openly available tools is changing that. Here’s how to get started with local AIs.
Ollama now supports tool calling with popular models in local LLM (ollama.com)
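Tool calling means the model is handed a JSON-schema description of functions it may invoke and replies with a structured call that the host program executes locally. A minimal sketch of that loop, assuming the OpenAI-style tool schema Ollama uses (the `get_weather` function and the simulated model reply are made-up examples, not part of Ollama's API):

```python
# Hypothetical local function the model is allowed to invoke.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

# Tool description in the JSON-schema shape passed to the chat call,
# e.g. ollama.chat(model=..., messages=..., tools=[WEATHER_TOOL]).
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

AVAILABLE = {"get_weather": get_weather}

def dispatch(tool_call: dict) -> str:
    # Execute one structured tool call of the shape the model returns
    # (function name plus keyword arguments) using only local functions.
    fn = AVAILABLE[tool_call["function"]["name"]]
    return fn(**tool_call["function"]["arguments"])

# Simulated model output; a real run would take this from the chat response.
call = {"function": {"name": "get_weather", "arguments": {"city": "Oslo"}}}
print(dispatch(call))  # → Sunny in Oslo
```

The key point is that the model never executes anything itself: it only emits the structured request, and the local program decides whether and how to run it.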
Run Google Gemma 2 2B 100% Locally (ycombinator.com)
New FOSS All-Local AI with Ollama, Gemma 2, and Fabric (ycombinator.com)
Ask HN: Which LLMs can run locally on most consumer computers (ycombinator.com)
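Which models fit on consumer hardware mostly comes down to memory: weights take roughly parameters × bits-per-weight ÷ 8 bytes, plus runtime overhead for the KV cache and buffers. A back-of-envelope estimate under an assumed ~20% overhead (a rule of thumb, not a benchmark; real usage varies with context length and backend):

```python
# Rough memory estimate for running a quantized LLM locally.
# The 20% overhead figure is an assumption covering KV cache and
# runtime buffers, not a measured value.
def approx_memory_gb(params_billions: float, bits_per_weight: int,
                     overhead: float = 0.20) -> float:
    bytes_for_weights = params_billions * 1e9 * bits_per_weight / 8
    return bytes_for_weights * (1 + overhead) / 1e9

# A 7B model quantized to 4 bits: ~4.2 GB, within reach of most laptops.
print(round(approx_memory_gb(7, 4), 1))   # → 4.2
# A 2B model (e.g. Gemma 2 2B) at 4 bits: ~1.2 GB.
print(round(approx_memory_gb(2, 4), 1))   # → 1.2
```

By this estimate, 4-bit models up to roughly 7B-8B parameters fit in 8 GB of RAM, which is why they dominate the "runs on most consumer computers" discussions above.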