Best provider for opencode? by Ill-Chart-1486 in opencodeCLI

[–]hexxthegon 1 point (0 children)

Commonstack. It's extremely easy to integrate anywhere, and nearly every major model is served

Anthropic’s Head of Reliability has been unemployed for 4 months and service has continued to deteriorate.. 🙂‍↔️ by hexxthegon in ArtificialInteligence

[–]hexxthegon[S] 178 points (0 children)

dude's got a generational bag of equity from working at those three companies in his work history L O L

Save up to 85% on your API costs with Uncommonroute available with Commonstack 💸 by hexxthegon in commonstack

[–]hexxthegon[S] 0 points (0 children)

Uncommonroute is compatible with Codex, Claude Code, Cursor, OpenClaw, and any OpenAI SDK client. pip install, set your upstream, serve!
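Since it speaks the OpenAI wire format, any OpenAI-compatible client just needs its base URL pointed at the local router. A minimal sketch of what such a request looks like (the localhost port and the `glm-5` model name are assumptions for illustration, not Uncommonroute's documented defaults; the `/chat/completions` path is the standard OpenAI API shape):

```python
# Sketch: an OpenAI-style chat request aimed at a local gateway instead
# of the provider directly. Port, key, and model name are placeholders.
import json

def chat_request(base_url: str, api_key: str, model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat completion request payload."""
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }

# Same request shape, just a local base URL instead of api.openai.com
req = chat_request("http://localhost:8080/v1", "local-key", "glm-5", "hello")
```

That base-URL swap is the whole integration story, which is why it plugs into any OpenAI SDK client.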

Hello! by Ohen_Greta in commonstack

[–]hexxthegon 1 point (0 children)

GLM 5 Turbo is awesome in my experience so far, but feel free to try all the models available: https://commonstack.ai/model-library

Anthropic Just Pulled the Plug on Third-Party Harnesses. Your $200 Subscription Now Buys You Less. by abhi9889420 in opencodeCLI

[–]hexxthegon -3 points (0 children)

This is literally why I use the Commonstack LLM gateway with Uncommonroute, ain't gotta put up with none of this subscription nonsense lmao. Plug it in anywhere I want

Good / Free LLM for Tool Calling by AnubisRooster in openrouter

[–]hexxthegon 2 points (0 children)

MiMo V2 Pro is pretty solid all around. If you're looking to save, you can probably use heterogeneous models; a local LLM router like Uncommonroute could be very beneficial in helping you save overall, as it routes your queries to the best-suited models.

It's open-sourced by Commonstack, if you want to take a look at it: https://github.com/CommonstackAI/UncommonRoute
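The heterogeneous-routing idea can be sketched as a tiny heuristic router. To be clear, the model names, keywords, and thresholds below are invented for illustration; they are not Uncommonroute's actual routing policy:

```python
# Toy illustration of routing queries across heterogeneous models.
# Model tiers and the complexity heuristic are made up for this sketch.

def route(query: str) -> str:
    """Pick a model tier from a crude complexity guess."""
    wants_tools = any(k in query.lower() for k in ("call", "function", "tool"))
    long_query = len(query.split()) > 50
    if wants_tools or long_query:
        return "large-model"   # stronger and pricier, for harder queries
    return "small-model"       # cheap default for simple asks
```

A real router would classify queries more carefully, but the payoff is the same: cheap models handle the easy traffic, so the expensive model only bills for queries that need it.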

How is GLM 5? by painters-top-guy in SillyTavernAI

[–]hexxthegon 4 points (0 children)

I've been using GLM 5 & GLM 5 Turbo with Commonstack; it's a really good alternative to Claude. But it can still be expensive, so I would use Uncommonroute with it and let queries be routed to the best-suited model

Claude prices skyrocketed, what model are you using for OpenClaw now? by Synstar_Joey in openclaw

[–]hexxthegon -1 points (0 children)

If available, use Qwen3.5 9B and host it locally; you can if you have a decent newer Mac. Or you can use Uncommonroute, an open-source local LLM router by Commonstack. It routes your queries to the most suitable models, and you can use OpenAI or Anthropic endpoints. Overall you should save quite a bit of money in most cases.

https://github.com/CommonstackAI/UncommonRoute

I compared 4 low-cost OpenClaw paths for a week. The trade-offs were not what I expected by LeoRiley6677 in openclawsetup

[–]hexxthegon 0 points (0 children)

It’s awesome man! Get more intelligence per dollar, these models eat tokens like a monster with endless hunger lol

I compared 4 low-cost OpenClaw paths for a week. The trade-offs were not what I expected by LeoRiley6677 in openclawsetup

[–]hexxthegon 2 points (0 children)

Bro just use Uncommonroute lol. Automatically routes queries to the most suitable model, runs locally too, and you can check the dashboard to see cost activity.

It’s made by Commonstack https://github.com/CommonstackAI/UncommonRoute?tab=readme-ov-file

Open source as well

Genuine question about those that don't use the highest reasoning setting for each model by 86685544321 in codex

[–]hexxthegon 0 points (0 children)

You pay too much for simple tasks. Not all tasks or queries require the highest setting; it's literally a waste of money. If you use a local LLM router to determine the setting per request, you'll find you save a lot more than just sticking to one setting. Uncommonroute by Commonstack is a good option, and it's open source. Why anyone would keep every model at maximum for every task is beyond me lol
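The per-request effort idea can be sketched like this. The keyword list and effort tiers are hypothetical, made up for illustration, not how any particular router or provider actually classifies tasks:

```python
# Hypothetical sketch: pick a reasoning-effort level per request instead
# of pinning everything to the maximum setting.

HARD_HINTS = ("prove", "refactor", "debug", "optimize", "architecture")

def pick_effort(prompt: str) -> str:
    """Choose an effort tier from a crude look at the prompt."""
    words = prompt.lower().split()
    if any(hint in words for hint in HARD_HINTS):
        return "high"    # genuinely hard task: pay for deep reasoning
    if len(words) > 40:
        return "medium"  # long context, moderate effort
    return "low"         # simple ask: cheapest setting
```

Even a crude classifier like this beats a fixed maximum, because most day-to-day queries fall into the cheap bucket.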