xmr.club — independent · curated · graded

AI / LLMs — privacy-respecting options » Local Runtime

Ollama — grade A

● up · checked 41m ago

Run open-source LLMs locally — no network, no API key, no telemetry.

Visit Ollama →

At a glance

  • No-KYC signup
  • Open-source codebase
  • Self-hostable

Review

The reference local-LLM runtime. It pulls quantised model weights and exposes an OpenAI-compatible API on localhost. Works on CPU, GPU, and Apple Silicon. The strongest privacy posture available, because inference never leaves your machine.
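A minimal sketch of the workflow the review describes — pulling weights and hitting the localhost API. The model name `llama3` is an example, not a recommendation; 11434 is Ollama's default port, adjust if you changed it:

```shell
# Pull quantised model weights, then run an interactive one-shot prompt
ollama pull llama3
ollama run llama3 "Explain LoRA in one sentence."

# The same runtime serves an OpenAI-compatible endpoint on localhost,
# so existing OpenAI clients work by swapping the base URL — no API key needed
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

Nothing in either command touches a remote host after the initial `pull`, which is the basis for the privacy claim above.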

Fees

Free · MIT · runtime + model weights local


Audit trail — receipts for the editorial claim

  • Upstream up · HTTP 200 · 318ms · checked 41m ago
  • No .onion mirror listed
  • Last manual verification 2026-05-12 (<7d)
  • See curator log for Ollama

Embed this listing — for the provider

If you run Ollama, drop this on your site. The badge auto-reflects the latest grade and pick status and links back to this page.

<a href="https://www.xmr.club/ai/ollama">
  <img src="https://www.xmr.club/badge/ollama.svg" alt="Listed on xmr.club — Ollama" height="32" />
</a>

Reviews — moderated · rules

No approved reviews yet. Be the first.

Add a review

Honest, brand-neutral feedback welcome. A curator approves before it appears here. No JS, no signup required.

Required: review body. Honest, descriptive reviews get approved within a day. Marketing copy, slurs, or invective get rejected. Per-day cap of 5 submissions per IP.

⚠ report an issue / suggest correction · view markdown twin