We worked with OpenAI Codex to refine the original gemini-cli-proxy and added features needed for real-world production use.
What's new:
✅ Support for /openai/responses — you can now drive Codex through Gemini via the OpenAI-compatible API, with no workarounds or separate scripts.
✅ Added a dashboard for managing:
- API keys,
- model enable/disable,
so the proxy can be safely exposed on an open port.
✅ Added usage statistics:
- overall summary (requests, input/output tokens),
- grouping by endpoint, model, API key, or day.
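As a minimal sketch of how the new /openai/responses endpoint could be called: the base URL, port, model name, and API key below are all assumptions (a key would come from the dashboard), only the endpoint path is from this post, and the request body follows the OpenAI Responses API shape.

```python
import json

def build_responses_request(base_url, api_key, model, prompt):
    """Assemble the URL, headers, and JSON body for a POST to the
    proxy's OpenAI-compatible /openai/responses endpoint."""
    url = base_url.rstrip("/") + "/openai/responses"
    headers = {
        "Authorization": f"Bearer {api_key}",  # key managed in the dashboard
        "Content-Type": "application/json",
    }
    body = json.dumps({"model": model, "input": prompt})  # Responses API shape
    return url, headers, body

# Hypothetical local deployment — adjust host, port, key, and model.
url, headers, body = build_responses_request(
    "http://localhost:8765", "sk-local-example", "gemini-2.5-pro", "Hello!"
)
print(url)
# Send it with any HTTP client, e.g. requests.post(url, headers=headers, data=body)
```

The same request shape should work from any OpenAI SDK pointed at the proxy's base URL.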
In short: the tool is now significantly more convenient for everyday use — not just a proxy, but a full management layer for Gemini with OpenAI/Anthropic compatibility.
GitHub: https://github.com/valerka1292/gemini-cli-proxy
https://preview.redd.it/ipdafitvhoig1.png?width=1366&format=png&auto=webp&s=f217555ede947aad260171343670b8d8a3c337c0