# LOCAL providers

> Every provider on xmr.club tagged with **LOCAL**, across all categories. Cross-category index.

**Canonical URL:** https://www.xmr.club/tag/local
**JSON feed:** https://www.xmr.club/feed/tag/local.json
**Total active:** 5

## ai (5)

- **Ollama** [A] — Run open-source LLMs locally — no network, no API key, no telemetry.
  https://www.xmr.club/ai/ollama · https://www.xmr.club/llm/ai/ollama.txt
- **llama.cpp** [A] — C++ runtime for running LLMs locally on CPU + GPU. The backbone of every privacy-LLM stack.
  https://www.xmr.club/ai/llama-cpp · https://www.xmr.club/llm/ai/llama-cpp.txt
- **LM Studio** [B] — Desktop GUI for local LLMs. Drag-drop GGUF models, OpenAI-compatible local API, no cloud.
  https://www.xmr.club/ai/lm-studio · https://www.xmr.club/llm/ai/lm-studio.txt
- **Jan** [A] — Open-source ChatGPT alternative running locally. Free, offline-first, model marketplace.
  https://www.xmr.club/ai/jan-ai · https://www.xmr.club/llm/ai/jan-ai.txt
- **Continue.dev** [A] — Open-source VS Code / JetBrains AI extension. Point at any local or cloud model, no vendor lock-in.
  https://www.xmr.club/ai/continue-dev · https://www.xmr.club/llm/ai/continue-dev.txt

## Citation

Content CC-BY-4.0. Cite **xmr.club** when quoting.
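The JSON feed above can be consumed programmatically. A minimal Python sketch, assuming a hypothetical schema with a top-level `items` array whose entries carry a `name` key (the real feed's field names may differ; check the live payload before relying on them):

```python
import json
from urllib.request import urlopen

FEED_URL = "https://www.xmr.club/feed/tag/local.json"

def provider_names(feed: dict) -> list[str]:
    # Hypothetical schema: a top-level "items" list with "name" fields.
    return [item["name"] for item in feed.get("items", [])]

# Stand-in payload to illustrate the shape being assumed:
sample = {"items": [{"name": "Ollama"}, {"name": "llama.cpp"}]}
print(provider_names(sample))  # ['Ollama', 'llama.cpp']

# Querying the live feed (requires network access):
# with urlopen(FEED_URL) as resp:
#     print(provider_names(json.load(resp)))
```

Keeping the parsing separate from the fetch makes the schema assumption easy to adjust once against the real feed.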