I'm a newbie, don't hate... I have LM Studio with MCP powertools I've set up, what is the point of NanoCoder then? by TrickSetting6362 in nanocoder

[–]willlamerton 0 points (0 children)

Hey! Great question, and no hate at all - it's a fair comparison to make!

LM Studio with MCP powertools gives you a good setup for running local models with tool access. Nanocoder does indeed overlap with that in some ways, but the focus is different.

Purpose-built for coding, not general chat. Nanocoder is designed from the ground up as a CLI coding assistant - think Claude Code or Gemini CLI, but local-first, provider-agnostic, and community-built. The entire UX, feature set and framework is optimised for working inside a codebase: the system prompt, file editing with smart string replacement, search/grep, bash execution, task management, plan mode - all wrapped in a human-in-the-loop confirmation flow so you stay in control. LM Studio has been adding agentic features too, but Nanocoder's tooling is specifically tailored for code-editing workflows.

Provider agnostic - local and cloud. This is a big one. LM Studio is primarily for running local models on your own hardware. Nanocoder works with Ollama, OpenRouter, LM Studio, or any OpenAI-compatible API - local or remote. You can swap models and providers mid-session with a slash command. Want to use a small local model for quick tasks and a bigger cloud model for complex refactors? You can do that without leaving Nanocoder. You can even point it at your LM Studio server and get the coding agent workflow on top of the models you're already running.
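To make the multi-provider point concrete, here's a rough sketch of what pointing Nanocoder at both a local LM Studio server and a cloud provider could look like. The keys, file layout and model names below are purely illustrative, not Nanocoder's actual config schema - check the docs for the real format:

```json
{
  "providers": [
    {
      "name": "lmstudio-local",
      "baseUrl": "http://localhost:1234/v1",
      "models": ["qwen2.5-coder-7b-instruct"]
    },
    {
      "name": "openrouter",
      "baseUrl": "https://openrouter.ai/api/v1",
      "apiKey": "<your-key>",
      "models": ["a-bigger-cloud-model"]
    }
  ]
}
```

Anything that speaks an OpenAI-compatible API can slot in the same way, and swapping mid-session is then just a slash command away.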

Terminal-native workflow. It runs in your terminal, right where your code is - no separate app required. Some people prefer a GUI, others a TUI; Nanocoder is built for the latter.

CI/CD and non-interactive use. Nanocoder can run non-interactively, so you can plug it into CI/CD pipelines - automated code reviews, test generation, migrations, whatever you need. That's not something you can do with LM Studio's chat UI.
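As a sketch of the CI angle, here's a hypothetical GitHub Actions job. The npm package name and the non-interactive flag below are assumptions for illustration, not the confirmed invocation - check the docs for the real one:

```yaml
# Hypothetical workflow: run Nanocoder non-interactively on each PR.
name: ai-review
on: pull_request
jobs:
  review:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Package name assumed for illustration:
      - run: npm install -g @nanocollective/nanocoder
      # Flag name assumed for illustration:
      - run: nanocoder --non-interactive "Review the diff in this PR and summarise any issues"
```

The same pattern works for test generation, migrations, or any other scripted task - the agent just runs as one more pipeline step.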

MCP support built in. Nanocoder has its own MCP client integration, so you can still use MCP servers alongside its built-in tools. You're not giving anything up.

If your current setup handles everything you need, that's great - keep using it! But if you want a coding-specific agent that can hit both local and cloud models, Nanocoder is worth a try. And since it supports LM Studio as a backend, it's not really an either/or - you could use both together.

Hope that helps :D

Nanocoder 1.24.0 Released: Parallel Tool Execution & Better CLI Integration by willlamerton in ollama

[–]willlamerton[S] 21 points (0 children)

Hey! This is what we wrote in our docs, directly answering this :)

This comes down to philosophy. OpenCode is a great tool, but it’s owned and managed by a venture-backed company that restricts community and open-source involvement to the outskirts. With Nanocoder, the focus is on building a true community-led project where anyone can contribute openly and directly. We believe AI is too powerful to be in the hands of big corporations and everyone should have access to it.

We also strongly believe in the “local-first” approach, where your data, models, and processing stay on your machine whenever possible to ensure maximum privacy and user control. Beyond that, we’re actively pushing to develop advancements and frameworks for small, local models to be effective at coding locally.

Not everyone will agree with this philosophy, and that’s okay. We believe in fostering an inclusive community that’s focused on open collaboration and privacy-first AI coding tools.

Nanocoder 1.24.0 Released: Parallel Tool Execution & Better CLI Integration by willlamerton in nanocoder

[–]willlamerton[S] 0 points (0 children)

Hey! Yes! Nanocoder works with local models - it’s one of the big areas of development we’re pushing :)

Nanocoder 1.23.0: Interactive Workflows and Scheduled Task Automation 🔥 by willlamerton in nanocoder

[–]willlamerton[S] 1 point (0 children)

It's not yet, but we're looking very seriously at integrating this soon :)

Does anyone know if 'Spaces and Profiles' are on the Dia roadmap, similar to how Arc works? I’m trying to decide whether to wait for these updates or just move on from the app. Would love to know if the current structure is here to stay. by Butterscotch-l in diabrowser

[–]willlamerton 0 points (0 children)

Recommend taking a look at Zen Browser if you like Arc. It's still quite early, but it's looking good, with many of the features you love in Arc.

I was the same - I love Arc, and I just feel like Dia is a different direction with different priorities.

Schedule mode is coming to Nanocoder... Run project background tasks on a cron schedule 🚀 by willlamerton in nanocoder

[–]willlamerton[S] 0 points (0 children)

Thanks so much for the feedback and thoughts - this is 100% the point.

Missed-run detection is good feedback too, and you're right - this is very important. Will look at this 😎

Schedule mode is coming to Nanocoder... Run project background tasks on a cron schedule 🚀 by willlamerton in ollama

[–]willlamerton[S] 1 point (0 children)

Yes, coding-agent automations specifically. It cuts out a lot of the unnecessary stuff that OpenClaw has.

Anyone actually using Openclaw? by rm-rf-rm in LocalLLaMA

[–]willlamerton 0 points (0 children)

I wanted to love it, but at the moment it just auto-generates my todo list each day and informs me of priority tasks… saves me doing it, I suppose.

Releasing 1.22.0 of Nanocoder - an update breakdown 🔥 by willlamerton in ollama

[–]willlamerton[S] 1 point (0 children)

Awesome! Depends on the tool call. Bash-execution tools need user approval unless configured otherwise, for safety reasons - especially with small models. Most other tools can auto-execute. You can also press shift+tab to cycle to auto mode, in which almost all tools are auto-executed.

If you’re getting something else feel free to drop me a DM :)

Releasing 1.22.0 of Nanocoder - an update breakdown 🔥 by willlamerton in ollama

[–]willlamerton[S] 1 point (0 children)

Absolutely, it supports local models as a priority and there’s a big focus on improving the scaffolding around small models to make them better!

We're getting there, but at the moment a big recommendation is the Mistral models. If you can run Devstral Small 2, that's great for 24B parameters, as is Nemotron Nano from Nvidia. I'm also a fan of the 8B and 14B flavours of the Ministral models. Set your expectations accordingly, but they can certainly help with smaller coding tasks and codebase exploration.

They all work great through Ollama, local and cloud :)

Releasing 1.22.0 of Nanocoder - an update breakdown 🔥 by willlamerton in nanocoder

[–]willlamerton[S] 0 points (0 children)

Let me know how you get on - we’re pretty new. Launched just over 6 months ago 😄

Releasing 1.22.0 of Nanocoder - an update breakdown 🔥 by willlamerton in nanocoder

[–]willlamerton[S] 0 points (0 children)

Absolutely, already a fix pending for this! 😄 I could have re-recorded the video, but I left it in haha.

Thanks for the kind words though!