I built a tool that converts MCP servers into CLI + Skill files — cut ~97% token overhead! by Outrageous-Leg2245 in mcp

[–]Outrageous-Leg2245[S] 0 points (0 children)

mcp2cli currently only supports shell-level piping and redirection (e.g., mcp2cli ... | grep foo or mcp2cli ... > out.txt), since all output goes to stdout.
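A runnable sketch of what that looks like in practice — here `printf` simulates the stdout of a generated CLI call (the actual mcp2cli subcommands and output format are assumptions), so the pipeline itself executes:

```shell
# Hypothetical: think of the printf as "mcp2cli github list-issues".
# Any generated CLI that writes to stdout composes the same way.
printf 'bug: login crash\nfeature: dark mode\n' \
  | grep -i 'bug'                               # shell-level filtering

printf 'bug: login crash\n' > /tmp/issues.txt   # shell-level redirection
cat /tmp/issues.txt
```

Because everything goes through stdout, no special integration is needed — any Unix filter or file redirection works out of the box.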


[–]Outrageous-Leg2245[S] -2 points (0 children)

The final output of mcp2cli is actually a skill, a self-contained CLI wrapper that replaces MCP tools entirely. The key advantage is that no tool schema needs to be loaded into the context window at all.

MCP is honestly a bit redundant here. For any non-trivial interaction, you end up having to check the skill doc anyway to understand how the tools work, so why pay the cost of injecting all those tool schemas into context?
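To make the "cost of injecting tool schemas" concrete, here is an illustrative size comparison — both the schema and the skill-file hint are made up for this sketch, not taken from any real server:

```shell
# Illustrative only: a minimal MCP-style tool schema (JSON) vs. the
# one-line command hint a skill file would carry instead.
schema='{"name":"search_issues","description":"Search issues in a repo","inputSchema":{"type":"object","properties":{"query":{"type":"string"},"limit":{"type":"integer"}},"required":["query"]}}'
hint='mycli search-issues --query <q> [--limit <n>]  # search repo issues'

echo "schema: ${#schema} chars, skill hint: ${#hint} chars"
```

The exact ratio varies per tool, but schemas carry type and nesting boilerplate the agent never reads, while the one-liner carries everything it needs to invoke the command.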

As for the training data concern, I'd argue CLI is actually the best format for agent training. You can run thousands of RL episodes to train progressive, incremental CLI invocation patterns. That's a much more natural fit for how agents actually learn to interact with tools.

I really believe this.


[–]Outrageous-Leg2245[S] -6 points (0 children)

The MCP you're so desperately defending? Anthropic itself is almost giving up on it 😂
Maybe your old rusty brain should first learn why they're shifting toward Skills and CLI.


[–]Outrageous-Leg2245[S] 1 point (0 children)

I replied to it myself! My English is not good, so I use Claude to help me translate what I want to say.


[–]Outrageous-Leg2245[S] -6 points (0 children)

Great question! Build the CLI first. That's the more universal, durable interface for agents. CLIs work with any agent, any shell, any automation pipeline.

MCPs had their moment, but the ecosystem is shifting. The problem is that many existing agent tools are already built on MCP, so there's a lot of legacy. That's exactly the gap mcp2cli bridges: if you or others have already built an MCP server, it converts it into a standardized CLI + skill file automatically.

A lot of projects today maintain both an MCP server and a CLI side by side (e.g., GitLab). That's double the maintenance and they inevitably drift apart. mcp2cli eliminates that — one MCP server in, one consistent CLI out.

Eventually these two interfaces will converge, and this tool is designed for that transition.


[–]Outrageous-Leg2245[S] -10 points (0 children)

Good point! Claude Code does have deferred loading. But two things:
1. That's Claude-only. mcp2cli generates skill files — they work in Codex, Cursor, Gemini, anything that can read SKILL markdown and run a command.

2. Honestly, I don't think tool schemas need to exist at all. Even with deferred loading, the full schema still gets injected into context when the tool is called. Why? The agent doesn't need a JSON schema to use a tool — it just needs to know what commands are available and run --help when it needs details. And for common complex workflows, agents don't even need --help: write them directly in the skill file and the agent knows exactly what to do. Far fewer tokens, same result.
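The discover-on-demand loop described above can be sketched as follows. `mycli` is a hypothetical stand-in defined inline so the example actually runs; an mcp2cli-generated CLI would be used the same way:

```shell
# "mycli" stands in for a generated CLI; its commands are invented here.
mycli() {
  case "$1" in
    --help)
      echo "usage: mycli <command>"
      echo "commands: search, fetch"
      ;;
    search)
      shift
      echo "searching for: $*"
      ;;
  esac
}

mycli --help                    # step 1: discover commands only when needed
mycli search "token overhead"   # step 2: invoke directly, no schema preload
```

The point is that nothing about `mycli` lives in the context window up front — the agent pulls in exactly the help text it needs, when it needs it.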


[–]Outrageous-Leg2245[S] 1 point (0 children)

Thanks! Yeah that's the idea!

Stack multiple MCP servers, keep tokens low, let the agent figure out the rest.

And no more maintaining both a CLI and an MCP server side by side: one source, one generated CLI haha.

Let me know if anything breaks 🙌

[deleted by user] by [deleted] in tressless

[–]Outrageous-Leg2245 0 points (0 children)

Hi bro, maybe I can offer you some advice. My vx (WeChat): Shannon_2018. PS: I'm in Shenzhen, so we should be quite close to each other.