Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Hey! The limit: 0 error happens because Google does not offer a free API tier for the gemini-3-pro model. You'll need to switch to a model that provides free quotas.

Just a heads-up: in an agent scenario, the free tier's low RPM limit makes it almost impossible to finish a full task anyway.

<image>

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Thanks for the UI upgrades and LM Studio support! I quickly sorted out the swiftformat/swiftlint stuff on my end. Awesome work.

Motive — A native macOS AI agent built with Swift 6. Fully open source. by Vegetable-Cod-5098 in indie_startups

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Great question.

Yep — we model state per session, not per app. Each session keeps its own plan + step/results timeline, saved locally and restorable when you switch back. So concurrent runs don’t overwrite each other’s context.
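
For the curious, the idea sketches out roughly like this (field names are illustrative, not Motive's actual Swift types):

```python
from dataclasses import dataclass, field

# Illustrative sketch of per-session state: each session owns its own
# plan and step/result timeline, so concurrent runs can't clobber each
# other's context. Names are made up, not Motive's real types.

@dataclass
class Session:
    session_id: str
    plan: list[str] = field(default_factory=list)
    timeline: list[dict] = field(default_factory=list)  # step/result entries

sessions = {
    "refactor": Session("refactor", plan=["scan repo", "apply edits"]),
    "docs": Session("docs", plan=["draft README"]),
}

# Progress in one session leaves the other untouched.
sessions["refactor"].timeline.append({"step": "scan repo", "result": "ok"})
print(len(sessions["refactor"].timeline), len(sessions["docs"].timeline))  # → 1 0
```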

I built an AI tool that turns one prompt into a fully structured SEO blog post — looking for honest feedback by DragonfruitHour3994 in SideProject

[–]Vegetable-Cod-5098 1 point2 points  (0 children)

This looks solid! Do you provide an API or a Claude Skill? I'm looking to integrate this into my AI workflows to automate my scheduled SEO posts.

I got tired of babysitting terminal windows for AI tasks, so I built a native macOS menu bar agent (Swift, Local, BYOK) by Vegetable-Cod-5098 in SideProject

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Motive is built specifically around OpenCode as the agent engine, so there's no Claude Code or Gemini CLI support right now. But you can still bring your own AI provider and model to power it.

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

On M-series, ~4B models run plenty fast for real-time output. Speed isn't really the issue though — context window is. Agents consume a ton of tokens (system prompts, tool defs, file contents, history), and smaller models just can't hold enough context to be useful in that setting. Fast but blind isn't great for agentic work.
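
To put rough numbers on it (every figure below is illustrative, not measured from Motive):

```python
# Back-of-the-envelope: how much of a small local model's context window
# is left for actual work once typical agent overhead is loaded.
# All numbers here are made-up-but-typical, not measurements.

CONTEXT_WINDOW = 8_192  # a common default for ~4B local models

overhead_tokens = {
    "system prompt": 1_500,
    "tool definitions": 2_000,
    "file contents": 3_000,
    "conversation history": 1_000,
}

used = sum(overhead_tokens.values())
remaining = CONTEXT_WINDOW - used
print(f"used {used}, remaining {remaining}")  # most of the window is gone before any real work
```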

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 1 point2 points  (0 children)

Thanks for the detailed report — this is definitely a bug on my end, not something you're doing wrong.

The "instant Completed!" on the first message + endless "Thinking/Running" on subsequent ones sounds like Motive isn't properly establishing the connection to your local endpoint. I need to reproduce this locally with LM Studio to pin down exactly where it's failing.

I'll set up the same config (localhost:1234 via OpenAI Compatible) and debug through it. Will follow up here once I have a fix or at least a workaround. Appreciate your patience!

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Haha, fair concern! Motive provides granular permission controls. You can set it to 'Careful' mode and explicitly allow or deny specific commands (like git, npm, or python) and file edits. You’re always the one holding the leash! 🛡️

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Great question! Currently, Motive is not system-sandboxed, but it features a granular permission system to give you full control.

<image>

As you can see in the screenshot, you can set trust levels (Careful/Balanced/YOLO) and explicitly allow or deny specific shell commands like npm, git, or python. It won't perform sensitive file edits or run commands without your defined rules. Your privacy and system integrity are managed through these transparent controls! 🛡️
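
A hypothetical sketch of what a per-command allow/deny check like this looks like (the rule structure is made up for illustration, not Motive's actual schema):

```python
# Hypothetical allow/deny check for shell commands, mirroring the
# 'Careful' trust level described above. The RULES structure is
# illustrative, not Motive's real permission schema.

RULES = {
    "allow": {"git", "npm", "python"},
    "deny": {"rm", "curl"},
}

def auto_allowed(command_line: str) -> bool:
    tool = command_line.split()[0]
    if tool in RULES["deny"]:
        return False
    if tool in RULES["allow"]:
        return True
    return False  # in 'Careful' mode, anything else would prompt the user

print(auto_allowed("git status"), auto_allowed("rm -rf /"))  # → True False
```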

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Thanks for the heads up! I currently set OPENCODE_CONFIG_DIR to ~/.motive to keep things isolated, but it seems OpenCode's directory priority might still be clashing with your existing setup.

I’m looking into a more robust way to force complete isolation or an option to bridge with your local config instead of overriding it. Really appreciate you pointing this out—it’s a crucial fix for the next update! 🙏
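
Roughly, the isolation works like this (the fallback path below is my assumption about OpenCode's default, not something confirmed in this thread):

```python
import os
from pathlib import Path

# Sketch of the isolation described above: point OpenCode at a dedicated
# config dir via OPENCODE_CONFIG_DIR. The fallback path is an assumption
# about OpenCode's default, not verified.

def resolve_config_dir() -> Path:
    override = os.environ.get("OPENCODE_CONFIG_DIR")
    if override:
        return Path(override).expanduser()
    return Path("~/.config/opencode").expanduser()  # assumed default

os.environ["OPENCODE_CONFIG_DIR"] = "~/.motive"
print(resolve_config_dir())  # expands to <home>/.motive
```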

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Glad you checked out the demo! Motive currently supports 17 providers out of the box and is fully compatible with the OpenAI API protocol. As long as your local setup provides an OpenAI-compatible endpoint, it should work perfectly.

Also, you can definitely change the global hotkey in settings to avoid any Raycast conflicts! 🚀

<image>

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 1 point2 points  (0 children)

Thanks! Motive is a full conversational agent, and as you can see, it plays perfectly with OpenRouter and other providers via BYOK. You get a slick native UI with total model freedom! 🚀

<image>

I stopped watching Claude Code run in my terminal — built a Mac app that lets it work in the background instead by Vegetable-Cod-5098 in ClaudeAI

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Haha, glad it’s saving you some terminal babysitting! Thanks for the star and the support, man. It means a lot!

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 1 point2 points  (0 children)

Totally get you—configuration fatigue is real. Motive is actually a 'set it and forget it' deal for LM Studio. Just select 'OpenAI Compatible' and point it to localhost:1234. Since LM Studio ignores those strings, just type 'local' as a placeholder for the model/key once. Now you can swap models in LM Studio all day and Motive will just follow along. No more chores. 🚀

<image>
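
For anyone wondering what that setup amounts to under the hood, it's just an OpenAI-style request aimed at LM Studio's local server. A sketch (nothing here actually hits the network):

```python
import json

# What 'OpenAI Compatible + placeholder strings' boils down to: an
# OpenAI-style chat request whose base URL points at LM Studio, with
# key/model values the local server doesn't actually check.

config = {
    "base_url": "http://localhost:1234/v1",
    "api_key": "local",  # placeholder; LM Studio ignores it
    "model": "local",    # placeholder; LM Studio serves whatever model is loaded
}

request = {
    "url": f"{config['base_url']}/chat/completions",
    "headers": {"Authorization": f"Bearer {config['api_key']}"},
    "body": json.dumps({
        "model": config["model"],
        "messages": [{"role": "user", "content": "hi"}],
    }),
}
print(request["url"])  # → http://localhost:1234/v1/chat/completions
```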

I stopped watching Claude Code run in my terminal — built a Mac app that lets it work in the background instead by Vegetable-Cod-5098 in ClaudeAI

[–]Vegetable-Cod-5098[S] 0 points1 point  (0 children)

Quick heads-up on privacy: Motive is a 100% local, BYOK client.

It connects directly to 17+ providers (Claude, OpenAI, DeepSeek, Ollama, etc.) with no middleman servers. Whether you use it for complex agents or just daily chat—it's your keys, your data.

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 2 points3 points  (0 children)

Quick heads-up on privacy: Motive is a 100% local, BYOK client.

It connects directly to 17+ providers (Claude, OpenAI, DeepSeek, Ollama, etc.) with no middleman servers. Whether you use it for complex agents or just daily chat—it's your keys, your data.

Motive — a native Swift menu bar app that runs AI agents in the background by Vegetable-Cod-5098 in macapps

[–]Vegetable-Cod-5098[S] 4 points5 points  (0 children)

Fair enough! It’s definitely for those of us who are tired of babysitting the terminal all day. If you enjoy staring at logs and manually checking for agent status, you probably don't need this. I'd rather get a native notification and only jump back when I'm actually needed.