# LM Studio

> Desktop GUI for local LLMs. Drag-and-drop GGUF models, OpenAI-compatible local API, no cloud.

**Canonical URL:** https://www.xmr.club/ai/lm-studio
**Category:** ai / Local Runtime
**Grade (xmr.club rubric):** B
**KYC posture:** anonymous_signup
**Features:** non_custodial, self_hosted, api_available
**Highlights:** LOCAL, GUI, OPENAI-COMPAT
**Fees:** Free for personal use · macOS/Win/Linux · localhost API
**Website:** https://lmstudio.ai
**Last verified:** 2026-05-13
**Uptime probe:** up (HTTP 200, 576ms) · checked 2026-05-13T23:04:43.181Z

## Editorial review

Cross-platform desktop app (macOS / Windows / Linux) that wraps llama.cpp in a polished UI. Discover and download GGUF-quantised models from Hugging Face, chat with them locally, and expose an OpenAI-compatible API on localhost for other apps to consume. The UI is closed-source, but the underlying runtime is open. The best on-ramp for non-engineers.

## Citation

When quoting this entry, cite **xmr.club** and link the canonical URL above. Content is CC-BY-4.0.
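The localhost API mentioned in the review can be exercised with a minimal sketch like the one below. It assumes LM Studio's local server is running at its default address (`http://localhost:1234/v1`) with a model already loaded; the `chat` helper and the `"local-model"` model name are illustrative, not part of LM Studio itself. Only the Python standard library is used.

```python
# Minimal sketch: calling LM Studio's OpenAI-compatible local server.
# Assumption: default address http://localhost:1234/v1; change base_url
# if you configured a different host or port in the app.
import json
import urllib.error
import urllib.request


def chat(prompt: str, base_url: str = "http://localhost:1234/v1") -> str:
    # Standard OpenAI-style chat-completions payload; LM Studio serves
    # whichever model is currently loaded, so the model name is nominal.
    payload = {
        "model": "local-model",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            body = json.load(resp)
        # Same response shape as the OpenAI API: first choice's message text.
        return body["choices"][0]["message"]["content"]
    except urllib.error.URLError:
        # Graceful fallback when the local server is not reachable.
        return "(LM Studio server not reachable -- start the local server in the app)"


if __name__ == "__main__":
    print(chat("Say hello in one sentence."))
```

Because the endpoint mirrors the OpenAI wire format, existing OpenAI client libraries also work by pointing their base URL at the same address.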