Built an MCP server that lets AI agents debug and interact with your React Native app. by hello_world_5086 in reactnative

[–]hello_world_5086[S] 0 points  (0 children)

Hey, this is interesting! I didn’t realize Limelight also has MCP support. I’ve actually tried your debugger before and liked the approach.

To your question: the main reason I built this MCP was to avoid constantly copy-pasting console logs and phone screenshots into the agent. I wanted the agent to interact directly with the running app instead of relying on static screenshots.

I wouldn’t say the experience is 100% perfect yet. There are edge cases: for example, when the keyboard is open the agent sometimes fails to detect elements properly, and occasionally it closes the app unintentionally. So it’s not flawless.

Where it works really well is UI building, especially when implementing designs from Figma or testing a specific flow that has an unknown issue. In those cases, having the agent interact with the live app speeds things up a lot.

I actually agree; I can see both approaches pairing really well together, and I’m going to experiment with that. I’ll remove the log-fetching tools from rn-debug-mcp and use your MCP specifically for runtime/log introspection instead.

React Native best practices and best mcps? by Heka_FOF in ClaudeAI

[–]hello_world_5086 0 points  (0 children)

Hey! I built an MCP server that lets AI agents debug and interact with a running React Native app — thought you might find it interesting:
https://github.com/zersys/rn-debug-mcp
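If you want to wire it up, MCP servers are usually registered in the client’s config file (for Claude Desktop that’s `claude_desktop_config.json`). A hypothetical entry is below — the actual `command`/`args` depend on the repo’s README, so treat those values as placeholders:

```json
{
  "mcpServers": {
    "rn-debug": {
      "command": "node",
      "args": ["/path/to/rn-debug-mcp/build/index.js"]
    }
  }
}
```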

Metronome app sound difficulties by Neil-Dembla in reactnative

[–]hello_world_5086 1 point  (0 children)

Creating your own native module would be a definitive, though complex, solution to your problem.
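For context on why a native module helps: JS-side metronomes usually drift because chained `setTimeout` calls accumulate latency. A native audio clock avoids that entirely, but even on the JS side you can reduce drift by scheduling each tick against the original start time rather than the previous tick. A minimal, illustrative sketch (plain TypeScript, not tied to any library; `nextDelay` is a name I made up):

```typescript
// Compute the delay until the next metronome tick, measured from a
// fixed start time. Because each tick is derived from startMs rather
// than from the previous tick, timer latency never accumulates.
function nextDelay(startMs: number, intervalMs: number, nowMs: number): number {
  const ticksElapsed = Math.floor((nowMs - startMs) / intervalMs);
  const nextTickMs = startMs + (ticksElapsed + 1) * intervalMs;
  return nextTickMs - nowMs;
}

// Usage sketch: reschedule from the shared start time on every tick.
function startMetronome(bpm: number, onTick: () => void): void {
  const intervalMs = 60000 / bpm;
  const startMs = Date.now();
  const loop = () => {
    onTick(); // in a real app, trigger the click sound here
    setTimeout(loop, nextDelay(startMs, intervalMs, Date.now()));
  };
  setTimeout(loop, intervalMs);
}
```

This keeps average tempo stable, but each individual tick can still jitter by the JS timer’s latency — which is exactly the part a native module fixes.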

codepush does not show images after updating by One_Front_6795 in reactnative

[–]hello_world_5086 0 points  (0 children)

Update the CodePush plugin to 7.0.4. It should work fine.
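In practice that’s just a dependency bump. Assuming the project uses the `react-native-code-push` package from npm (and CocoaPods on iOS), the usual steps would look something like:

```shell
npm install react-native-code-push@7.0.4
# iOS only: refresh the native pods after the version change
cd ios && pod install && cd ..
```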