# Privatemode

> Confidential-compute LLM proxy — encrypted inference on Nvidia confidential GPUs.

- **Canonical URL:** https://www.xmr.club/ai/privatemode
- **Category:** ai / Confidential Compute
- **Grade (xmr.club rubric):** A
- **KYC posture:** anonymous_signup
- **Features:** open_source, api_available
- **Highlights:** CONFIDENTIAL, TEE
- **Fees:** Per-token · BTC / fiat · confidential-compute backend
- **Website:** https://www.privatemode.ai
- **Last verified:** 2026-05-12
- **Uptime probe:** up (HTTP 200, 894ms) · checked 2026-05-13T23:04:43.181Z

## Editorial review

Runs inference inside an attested TEE, so even the operator cannot see prompts. Open-source clients; pay-per-token billing, with crypto accepted. This is the closest the LLM space currently gets to "actually private".

## Citation

When quoting this entry, cite **xmr.club** and link the canonical URL above. Content CC-BY-4.0.
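Since the entry flags `api_available` and a pay-per-token proxy model, here is a minimal client sketch. It assumes an OpenAI-style chat-completions endpoint served by a locally running proxy that handles TEE attestation and encryption transparently; the URL, port, model name, and response schema below are all assumptions for illustration, not confirmed details of the service.

```python
import json
import urllib.request

# Assumed endpoint of a locally running attestation proxy (hypothetical).
PROXY_URL = "http://localhost:8080/v1/chat/completions"


def build_request(prompt: str, model: str = "latest") -> bytes:
    """Build an OpenAI-style chat-completion payload as UTF-8 JSON.

    The payload itself is plain JSON: in this sketch, the proxy (not the
    client) is assumed to attest the remote TEE and encrypt the request
    before it leaves the machine.
    """
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")


def send(prompt: str) -> str:
    """POST a prompt to the proxy and return the completion text.

    The response shape mirrors the OpenAI chat-completions format; this
    is an assumption about the API, not a documented guarantee.
    """
    req = urllib.request.Request(
        PROXY_URL,
        data=build_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(send("Summarize what an attested TEE guarantees."))
```

The point of the local-proxy design is that the client never needs attestation logic: if the proxy refuses to forward requests to an unattested backend, any plain HTTP client inherits the confidentiality guarantee.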