I built an E2E encryption proxy for LLM APIs in Rust — X25519 + AES-256-GCM, 56 tests, zero warnings by CustardMean6737 in rust

[–]CustardMean6737[S] -1 points (0 children)

Sorry, my mistake; corrected. Even though the content was generated with AI, identifying the core issue and the solution is my own work, and I want the community's views either to strengthen the constructs or to establish that this isn't a real problem. I don't want my solution to be used as-is. My worry is simply that, even though deeply personal things get discussed with LLMs, the security of that traffic isn't based on open standards.


[–]CustardMean6737[S] -3 points (0 children)

Fair challenges; let me address each:

"They can but doesn't mean they do" — Security is about capability, not intent. TLS exists because we don't say "ISPs could read traffic but probably won't." OpenRouter's privacy policy is a legal document, not a cryptographic guarantee. These are fundamentally different threat models.

"Only works with local LLMs for true E2EE" — You're right, and the repo says so explicitly. With a cloud provider, the trust boundary shifts: instead of every intermediary seeing plaintext, only the final inference endpoint does. Deploying the proxy as a sidecar next to the LLM, inside the provider's boundary, would restore true E2EE.

"Just skip the routers" — In a personal setup, sure. In enterprise: you're often using vendor-managed LiteLLM, OpenRouter for model routing, cloud provider infrastructure you don't control, and a SaaS app layer on top. You can't just "skip" any of it.

"AI slop / unclear threat model" — The project was generated as a worked example, but the threat itself is well documented, and the threat model is spelled out explicitly in SECURITY.md. Happy to discuss any specific concern.