AI / LLMs — privacy-respecting options » Local Runtime
LM Studio
● up · checked 38m ago
Desktop GUI for local LLMs. Drag-and-drop GGUF models, OpenAI-compatible local API, no cloud.
At a glance
- no-KYC signup
- Non-custodial — you hold keys
- Self-hostable
Review
Cross-platform desktop app (macOS / Windows / Linux) that wraps llama.cpp in a polished UI. Discover and download quantised GGUF models from Hugging Face, chat with them locally, and expose an OpenAI-compatible API on localhost for other apps to consume. The UI is closed-source, but the underlying runtime (llama.cpp) is open. Best on-ramp for non-engineers.
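The localhost API mentioned above speaks the OpenAI chat-completions dialect, so any OpenAI-style client can talk to it. A minimal standard-library sketch, assuming the server's default port of 1234 (configurable in the app) and a placeholder model name:

```python
import json
import urllib.request

# LM Studio serves an OpenAI-compatible API on localhost.
# Port 1234 is the app's default; adjust if you changed it in the server settings.
BASE_URL = "http://localhost:1234/v1"

def build_chat_payload(model, prompt):
    """Assemble an OpenAI-style chat-completion request body."""
    return {
        "model": model,  # placeholder; use the identifier of a model you loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

def chat(model, prompt):
    """POST the payload to the local server and return the reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the endpoint follows the OpenAI wire format, the official `openai` client library also works by pointing its `base_url` at localhost; nothing leaves the machine either way.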
Fees
Free for personal use · macOS/Win/Linux · localhost API
Links
Audit trail — receipts for the editorial claim
- Upstream up · HTTP 200 · 576ms · checked 38m ago
- No .onion mirror listed
- Last manual verification 2026-05-13 (<7d)
- See curator log for LM Studio
Reviews — moderated · rules
No approved reviews yet. Be the first.
Add a review
Honest, brand-neutral feedback welcome. A curator approves before it appears here. No JS, no signup required.