AI / LLMs — privacy-respecting options » Confidential Compute
Privatemode
● up · checked 38m ago
Confidential-compute LLM proxy — encrypted inference on Nvidia confidential GPUs.
At a glance
- no-KYC signup
- Open-source codebase
Review
Runs inference inside an attested TEE (trusted execution environment), so even the operator can't see prompts. Open-source clients; pay-per-token; crypto accepted. Closest the LLM space gets to "actually private" right now.
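The "attested TEE" claim hinges on the client checking an attestation report before sending anything. A minimal sketch of that check, in Python — the names (`AttestationReport`, `EXPECTED_MEASUREMENT`) are illustrative assumptions, not Privatemode's actual API, and a real flow would also verify the hardware vendor's signature chain:

```python
import hashlib
import hmac
from dataclasses import dataclass


@dataclass
class AttestationReport:
    measurement: bytes  # hash of the enclave's code/firmware (hypothetical field)
    nonce: bytes        # client-chosen value echoed back, proving freshness


# Hash of a known-good enclave build the client trusts (illustrative value).
EXPECTED_MEASUREMENT = hashlib.sha256(b"known-good enclave build").digest()


def verify_attestation(report: AttestationReport, nonce: bytes) -> bool:
    """Accept the enclave only if it runs the expected code and the report
    echoes our fresh nonce. Vendor signature verification is omitted here."""
    return (hmac.compare_digest(report.measurement, EXPECTED_MEASUREMENT)
            and hmac.compare_digest(report.nonce, nonce))


# Only after this returns True would a client send its prompt to the enclave.
nonce = b"client-nonce-123"
report = AttestationReport(EXPECTED_MEASUREMENT, nonce)
print(verify_attestation(report, nonce))  # True for a matching, fresh report
```

The point of the nonce is replay resistance: a stale report copied from an earlier session fails the freshness check even if the measurement matches.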
Fees
Per-token · BTC / fiat · confidential-compute backend
Links
Audit trail — receipts for the editorial claim
- Upstream up · HTTP 200 · 894ms · checked 38m ago
- No .onion mirror listed
- Last manual verification: 2026-05-12 (<7d)
- See curator log for Privatemode
Reviews — moderated · rules
No approved reviews yet. Be the first.
Add a review
Honest, brand-neutral feedback welcome. A curator approves before it appears here. No JS, no signup required.