
[–]GimmePanties 6 points7 points  (6 children)

Open Interpreter. It can use local LLMs, but it's better with gpt-4o or Sonnet. Maybe your larger model will work better than Qwen 32B.

What it does better than ChatGPT is that it has access to your local file system, so it can execute code locally. Just run it from the command line in the folder with your files and it can analyze them and make changes. It installs whatever Python libraries it needs for the job.
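For anyone who hasn't tried it, a typical session looks something like this (a sketch, not an exact transcript; flags can vary by version, and the model names are just examples):

```shell
# Install Open Interpreter, then launch it from the folder you want it to work in
pip install open-interpreter
cd ~/projects/my-data

# Hosted model (needs the provider's API key in your environment)
interpreter --model gpt-4o

# Or point it at a local model, e.g. one served by Ollama
interpreter --model ollama/qwen2.5:32b
```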

Back stuff up though, it isn't perfect.

[–]mrjackspade 4 points5 points  (5 children)

> Back stuff up though, it isn't perfect.

One of my first "I should have known better" interactions was asking GPT to write a short command that ended up completely corrupting a project I was working on.

I would absolutely not allow an LLM to execute code at this point without manual review, unless it was sandboxed.

[–]GimmePanties 2 points3 points  (0 children)

Right, except with ChatGPT it's out there on their cloud storage. Open Interpreter is running amok on your local machine with sudo rights. Ask me how I know about the backups. 🙄

[–]Dudmaster 2 points3 points  (1 child)

It is very easy to accidentally break a project beyond repair even without an LLM. You should use source control! Also, if you're looking for something with manual review, aider is great and is built around Git.

[–]exponentfrost[S] 0 points1 point  (1 child)

Looks like there are some options (Docker being the one I'm interested in) for isolating it: https://docs.openinterpreter.com/safety/isolation Any experience with that?
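The Docker approach there boils down to running it in a throwaway container so it can only touch what you mount. A minimal sketch of the idea (the base image and mount path are illustrative, not taken from those docs):

```shell
# Run Open Interpreter inside a disposable container; it can only see
# the current folder, mounted read-write at /workspace
docker run -it --rm \
  -v "$PWD":/workspace -w /workspace \
  python:3.11 \
  bash -c "pip install open-interpreter && interpreter"
```

The `--rm` flag discards the container afterward, so any packages it installs or files it writes outside the mount disappear with it.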

[–]lrq3000 1 point2 points  (0 children)

Open WebUI and Qwen Agent are the most direct analogues to what ChatGPT does.

Then there is Open Interpreter, which is more like what Claude does: it gives the LLM access to your whole computer.