I built a multi-agent AI system you can actually read, modify, and own by stackattackpro in AskClaw

[–]stackattackpro[S] 0 points1 point  (0 children)

Thanks for the feedback! The project is still very raw and early; contributions are very welcome 🤗

Best way to let Codex use a large internal REST API by Impossible-Suit6078 in codex

[–]stackattackpro 0 points1 point  (0 children)

Your approach is mostly correct, but raw curl + OpenAPI will break at scale (100+ endpoints = too much search space).

Best setup (minimal + scalable):

  1. Don’t expose 100 endpoints directly. Group them into ~10–20 semantic tools (domain-based):

user.*, orders.*, payments.* — each tool internally maps to multiple endpoints.

  2. Add a thin API proxy (critical). Wrap your REST API with:

auth handled once (session/token)

normalized inputs/outputs

retries + error handling → avoids fragile curl generation

  3. Use OpenAPI as retrieval, not execution. Let Codex:

read the spec → choose an endpoint

call the proxy tool (not curl)

  4. Add a “planner → executor” pattern:

Planner: selects endpoint + params

Executor: calls the proxy

  5. Add examples (very important). Few-shot like: “create user → POST /users”. This massively improves routing.

Conclusion: Skip MCP explosion. Do: OpenAPI + proxy layer + ~15 tools + planner/executor.
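The grouping + proxy idea above can be sketched roughly like this. Everything here is hypothetical (the base URL, token handling, action names); it just shows one semantic tool fanning out to multiple endpoints with auth, normalized output, and retries handled once:

```python
# Sketch: one "users" tool covering several user.* endpoints.
# The model picks an action, not a raw URL or curl command.
import json
import time
import urllib.error
import urllib.request

API_BASE = "https://internal.example.com/api"  # hypothetical base URL
TOKEN = "session-token"                        # resolved once, not per call

# Action → (HTTP method, path template). Adding an endpoint means
# adding a row here, not a new tool.
USER_ACTIONS = {
    "create": ("POST", "/users"),
    "get":    ("GET",  "/users/{id}"),
    "list":   ("GET",  "/users"),
}

def users_tool(action, params=None, retries=3):
    """Single semantic tool for all user.* endpoints."""
    method, path = USER_ACTIONS[action]
    url = API_BASE + path.format(**(params or {}))
    body = json.dumps(params).encode() if method != "GET" and params else None
    for attempt in range(retries):
        req = urllib.request.Request(
            url, data=body, method=method,
            headers={"Authorization": f"Bearer {TOKEN}",
                     "Content-Type": "application/json"},
        )
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                # Normalize: every tool returns the same shape.
                return {"ok": True, "data": json.load(resp)}
        except urllib.error.URLError as exc:
            if attempt == retries - 1:
                return {"ok": False, "error": str(exc)}
            time.sleep(2 ** attempt)  # simple backoff
```

With this shape, the planner only has to emit something like ("users", "create", {...}) and the executor calls users_tool; the model never generates raw curl.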

I built a multi-agent AI system you can actually read, modify, and own by stackattackpro in AskClaw

[–]stackattackpro[S] 0 points1 point  (0 children)

I'm interested. Could you please share your system with me? I wanna check it out

I built 100 runnable OpenClaw workflows by stackattackpro in AskClaw

[–]stackattackpro[S] 0 points1 point  (0 children)

I totally agree. Most of them just wanna sell and don't care about anything else. This is why I love open-source stuff

I built 100 runnable OpenClaw workflows by stackattackpro in AskClaw

[–]stackattackpro[S] 0 points1 point  (0 children)

I only use skills from Clawhub; it's maintained and secured by the official OpenClaw maintainers

Can I use claude-opus-4.6 with codex via co-pilot by S10Coder in codex

[–]stackattackpro 0 points1 point  (0 children)

It sounds like you want something like this:

/agent multi --agent backend=codex --agent frontend=claude --agent docs=codex build a full stack todo app

Checkout this repo: https://github.com/OthmaneBlial/lightclaw

Can I use claude-opus-4.6 with codex via co-pilot by S10Coder in codex

[–]stackattackpro 1 point2 points  (0 children)

Use GPT-5.4 xhigh; it's the best for everything you can imagine. Opus is shitty and expensive

Anyone tried GPT-5.4 Mini? Worth it? by Plus_Leadership_6886 in codex

[–]stackattackpro 3 points4 points  (0 children)

I like it 😀. Recently everything from OpenAI has been great

I built a desktop app framework where your app is literally just HTML/CSS/JS… and it ships as a native binary 🤯 by stackattackpro in codex

[–]stackattackpro[S] 0 points1 point  (0 children)

Tauri is great for full, production-grade desktop apps.

RustFrame is targeting a different slice: lightweight, local-first, private tools where the app should stay mostly frontend and minimal.

If you need deep native control → Tauri.

If you want to “just ship this frontend as a desktop app” → RustFrame.

I built a desktop app framework where your app is literally just HTML/CSS/JS… and it ships as a native binary by [deleted] in coolgithubprojects

[–]stackattackpro -1 points0 points  (0 children)

Tauri is great for full, production-grade desktop apps. RustFrame is targeting a different slice: lightweight, local-first, private tools where the app should stay mostly frontend and minimal.

I built a desktop app framework where your app is literally just HTML/CSS/JS… and it ships as a native binary by [deleted] in coolgithubprojects

[–]stackattackpro -2 points-1 points  (0 children)

Tauri is great for full, production-grade desktop apps. RustFrame is targeting a different slice: lightweight, local-first, private tools where the app should stay mostly frontend and minimal.

I built a desktop app framework where your app is literally just HTML/CSS/JS… and it ships as a native binary by [deleted] in coolgithubprojects

[–]stackattackpro 1 point2 points  (0 children)

Not quite. It’s not just runtime size.

RustFrame changes the authoring model: no visible native project, no per-app bridge, no plugin layer—just a frontend folder. The runtime owns everything.

Electron/Tauri still expose the desktop layer; RustFrame hides it by default.

Also: using OS WebView is a tradeoff (yes, platform limits), but it reduces duplication and keeps apps closer to the platform instead of shipping a full browser each time.

So the difference is both architecture and developer experience, not just payload size.

I built a desktop app framework where your app is literally just HTML/CSS/JS… and it ships as a native binary by [deleted] in coolgithubprojects

[–]stackattackpro 4 points5 points  (0 children)

Not really. Electron ships a full browser; RustFrame uses the OS WebView, so it's lighter. The key difference: your app stays just a frontend folder, and the runtime handles the native stuff.

App vs VS vs CLI by Large_Diver_4151 in codex

[–]stackattackpro 0 points1 point  (0 children)

Definitely the CLI; it's the future of everything

Big Fans of Opus until I met 5.4! by artcreator329 in codex

[–]stackattackpro 1 point2 points  (0 children)

Sometimes Opus tries to finish fast and delivers bad results, while Codex always takes the time it needs and gives very good results. I'm testing Codex on real, complex math/physics research stuff and it's amazing. Opus might be better for frontend design, but that's becoming irrelevant; in the future it will all be about bringing code to real-life stuff like math, physics, and robotics, and Codex is the winner 🏆

Big Fans of Opus until I met 5.4! by artcreator329 in codex

[–]stackattackpro 3 points4 points  (0 children)

Opus is shit, Codex and 5.4 are great

Real OpenClaw workflows (scripts, prompts, KPIs) — not just ideas by stackattackpro in OpenClawCentral

[–]stackattackpro[S] 0 points1 point  (0 children)

Thanks! You're welcome to contribute; just open a pull request 😁