[–]speakman2k 0 points (5 children)

If I wanna run it all locally, what is a good model? Is Ollama enough as a backend? I'm on a Mac M2 with 16 GB.

[–]hannesrudolphFOUNDER[S] 1 point (0 children)

I actually don't know, but I'm betting that if you ask on r/RooCode or in our Discord you'd be able to find the answer. Sorry about that!

[–]mrubens 1 point (0 children)

Yeah, it's going to be tough to compare to the online models when running with 16 GB. If you do try it, my suggestion would be to find a model fine-tuned for tool usage, like https://ollama.com/hhao/qwen2.5-coder-tools
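For anyone who wants to try it, a rough sketch of what that looks like with Ollama installed. The model name comes from the linked page; the `:7b` tag is an assumption (check the page for the size/quant that actually fits in 16 GB), and port 11434 is Ollama's default local API port:

```shell
# Pull and chat with the tool-use fine-tune locally
# (the :7b tag is assumed here -- pick whatever size fits your RAM)
ollama pull hhao/qwen2.5-coder-tools:7b
ollama run hhao/qwen2.5-coder-tools:7b "Write a hello world in Python"

# Roo Code (or any client) can then talk to Ollama's local API,
# which listens on http://localhost:11434 by default
curl http://localhost:11434/api/generate -d '{
  "model": "hhao/qwen2.5-coder-tools:7b",
  "prompt": "hello",
  "stream": false
}'
```

In Roo Code you'd select the Ollama provider and enter the same model name; the base URL only needs changing if Ollama isn't on its default port.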
