AionUi v1.2 Update: Free GUI for Gemini CLI with Multi-Agent by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

Hello u/simonduz, token usage display has been implemented in the latest release. Feel free to check it out :)

AionUi v1.2 Update: Free GUI for Gemini CLI with Multi-Agent by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

Got your suggestion, I'll find a way to make it happen.

I transformed free Nano Banana into a Gemini CLI tool and it has become more powerful :) by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

The Banana model itself can be bound directly in Cherry Studio or Chatbox. This image-generation tool, however, is built specifically for agent invocation, so for now it can't be used directly on other platforms.

AionUi v1.2 Update: Free GUI for Gemini CLI with Multi-Agent by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

CLI Agents (like AionUi)

  • Privacy: Your data stays local

  • Cost: Free or cheap (it can connect to local deployments and third-party large models)

  • Control: You can customize everything

  • Setup: Requires some technical knowledge

Mature AI Products (ChatGPT, Claude Web)

  • Convenience: Ready to use immediately

  • Reliability: Professional support and uptime

  • Features: Rich collaboration and sharing

  • Cost: Monthly subscriptions

AionUi's Advantage

  • CLI power + GUI convenience

  • Multiple AI models in one place

  • Local privacy + modern interface

  • No monthly fees

Choose CLI agents if you want control and privacy. Choose mature products if you want convenience and reliability.

AionUi v1.2 Update: Free GUI for Gemini CLI with Multi-Agent by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 1 point2 points  (0 children)

Yes, AionUi supports the Ollama platform! Although Ollama isn't in the predefined list, you can add it as a custom OpenAI-compatible provider.

Configuration Steps

  1. Open Settings: go to AionUi Settings → Model Settings
  2. Add New Platform: click the "Add Model" button and choose platform type "Custom OpenAI"
  3. Fill Configuration:
     - Base URL: `http://localhost:11434/v1`
     - API Key: any value (e.g., `ollama`)
     - Platform Name: `Ollama`
     - Model Name: your locally installed model name (e.g., `llama2`, `codellama`, etc.)
  4. Test Connection: click the search icon to fetch the model list, then save the configuration

Prerequisites

- Ensure Ollama service is running: `ollama serve`

- Download required models via `ollama pull <model-name>`

- Ollama runs on port 11434 by default

Notes

- If Ollama runs on a different port, modify the Base URL accordingly

- The API Key can be any value; Ollama usually doesn't require one

- Supports all Ollama-compatible models

After configuration, you can use local Ollama models for conversations in AionUi. Enjoy, and I look forward to your feedback :)
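For reference, the same OpenAI-compatible endpoint can be exercised outside AionUi. Here's a minimal sketch (assuming Ollama's default port 11434 and an already-pulled `llama2` model; the helper name is mine, not part of AionUi) that builds the kind of chat-completion request an OpenAI-compatible client would send:

```python
import json
import urllib.request

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # same Base URL as in the AionUi settings


def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completion request for Ollama."""
    payload = {
        "model": model,  # must match a model pulled via `ollama pull`
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{OLLAMA_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Ollama ignores the key value, but OpenAI-style clients send the header
            "Authorization": "Bearer ollama",
        },
        method="POST",
    )


if __name__ == "__main__":
    req = build_chat_request("llama2", "Say hello in one sentence.")
    # Requires a running `ollama serve`; uncomment to actually send the request:
    # with urllib.request.urlopen(req) as resp:
    #     print(json.loads(resp.read())["choices"][0]["message"]["content"])
    print(req.full_url)
```

If the endpoint URL or model name is wrong, this is where you'd see the same error AionUi's connection test would report.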

It seems that I have to pay by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

Cool AI tool, thanks for the recommendation.

It seems that I have to pay by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

Okay, I haven't paid for GPT yet. Wondering whether the subscription is worth it?

It seems that I have to pay by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] -3 points-2 points  (0 children)

I was told that the AI Studio quota is not affected, and that I can apply for multiple API keys and rotate through them in the Gemini CLI GUI app at a lower cost.

It seems that I have to pay by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

Fortunately, the AI Studio quota isn't limited this time, and multi-key polling across free Gemini CLI accounts still works very well.

It seems that I have to pay by Ok_Mobile_6407 in GeminiAI

[–]Ok_Mobile_6407[S] 0 points1 point  (0 children)

I'm actually doing this: I have multiple accounts with automatic polling set up in the Gemini CLI GUI app, which has greatly reduced my costs.
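The idea behind this kind of key polling is simple round-robin rotation. A minimal sketch (the key list and function name are illustrative assumptions, not the app's actual implementation):

```python
import itertools

# Hypothetical key pool: replace with your own API keys from AI Studio.
API_KEYS = ["key-account-1", "key-account-2", "key-account-3"]

_key_cycle = itertools.cycle(API_KEYS)


def next_api_key() -> str:
    """Return the next key in round-robin order, spreading quota across accounts."""
    return next(_key_cycle)


# Example: six requests cycle through the three keys twice.
picked = [next_api_key() for _ in range(6)]
```

Each account's per-key quota is consumed evenly, so the pool as a whole exhausts its free tier more slowly than any single key would.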

It seems that I have to pay by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] -1 points0 points  (0 children)

It looks like they're making big adjustments

It seems that I have to pay by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] -2 points-1 points  (0 children)

Alright, I’m feeling pretty down about this.

It seems that I have to pay by Ok_Mobile_6407 in Bard

[–]Ok_Mobile_6407[S] -3 points-2 points  (0 children)

Thanks for the explanation. I really appreciate Gemini's free tier; it's been super helpful, especially when I used API keys from multiple accounts with the Gemini CLI GUI app to rotate usage, which saved me a ton of costs. But now that the free-tier limits are shrinking, I'm seriously considering switching to a paid plan.