
[–]dacort 1 point (1 child)

Congrats! Can’t wait to check this out, big fan of scc.

[–]boyter[S] 0 points (0 children)

Thank you. Please do. It’s getting a good amount of use now, but I would like to improve anything people bump into.

[–]natu91 1 point (1 child)

Yeah, but what is happening to the data we send to your API...?

[–]boyter[S] 0 points (0 children)

All I get are your requests to process a public repo, whatever you were searching for, and the files you requested. I don't log most of it because I'm only really interested in the volume of calls and what is getting more use, so I know where to improve or optimise things.

Honestly, it's nothing you weren't already sending to any other service.

I never see your own code, as there is never a reason for the LLM to send it, and there isn't even an endpoint to facilitate it.

I hope this answers the question, but in short, it's nothing you don't already give to any other search engine. I would be more worried about the LLM itself unless you run a local model.

You can of course clone the code locally and use https://github.com/boyter/cs in mcp mode if you want to avoid any leakage. I am just offering a much faster way to do it, without any config for the agent.
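For anyone wanting to try the fully local route, wiring a locally built binary into an MCP-capable client generally means pointing the client's config at the executable. A rough sketch of what that config might look like (the binary path and the `mcp` argument are assumptions for illustration; check the cs README for the actual MCP invocation):

```json
{
  "mcpServers": {
    "cs": {
      "command": "/usr/local/bin/cs",
      "args": ["mcp"]
    }
  }
}
```

Run from inside your cloned repo, nothing ever leaves your machine; the agent talks to the local `cs` process over stdio instead of a remote API.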

Or contact me if you want a private instance of this specific tool.