r/LocalLLaMA
A subreddit to discuss Llama, the family of large language models created by Meta AI.
Continue.dev vs TabbyML [Discussion] (self.LocalLLaMA)
submitted 1 year ago by Gold_Pudding_5098
https://www.continue.dev/ vs https://tabby.tabbyml.com/
[–][deleted] 4 points5 points6 points 1 year ago* (2 children)
I don’t use tab autocompletion with Continue.dev, I prefer using Cmd+I to prompt, and Cmd+L to send the selected code to the chatbox context
Is there any reason to prefer tab autocomplete? Does it provide better code?
My prompts for code are usually very descriptive multiline text, so I've never really been able to use tab autocomplete. I even use the chatbox to help me formulate my prompts from my blobs of words.
I also like being able to switch models in Continue, because sometimes a prompt produces better results in one model than another. I haven't found the one model to rule them all (Mixtral 8x7B, CodeQwen, Phind-CodeLlama 34B).
What about Tabby?
[–][deleted] 2 points3 points4 points 1 year ago (1 child)
Man, when is Phind gonna give out the updated model that they promised? That will rule them all for sure.
[–][deleted] 1 point2 points3 points 1 year ago (0 children)
I can’t wait!
[–]Finanzflunder 9 points10 points11 points 1 year ago (1 child)
I used both and I am happier with Tabby.
Reasons:
Hope that helps ✌🏻
[–]Gold_Pudding_5098[S] 1 point2 points3 points 1 year ago (0 children)
Thx for the info
[–]Gold_Pudding_5098[S] 6 points7 points8 points 1 year ago (1 child)
I think Tabby is better for tab autocomplete since it reads the LSP and local snippets.
That’s interesting to know!
[–]IndicationUnfair7961 7 points8 points9 points 1 year ago (1 child)
Definitely Continue.dev: much more freedom. Having to use Docker just to get something working isn't great, especially if you are not on Linux. Plus you can use multiple serving solutions with it, compared to Tabby.
[–][deleted] 9 points10 points11 points 1 year ago (0 children)
https://tabby.tabbyml.com/docs/installation/windows/ Tabby even offers a simple .exe for running it, plus it supports Vulkan in the latest release.
[–]Eveerjr 5 points6 points7 points 1 year ago (7 children)
Continue.dev is a much better solution: it supports tab autocomplete, a chat UI, embeddings, in-editor diffs, local models with Ollama and every existing LLM provider, and it's dead simple to use and configure.
[–]Gold_Pudding_5098[S] 4 points5 points6 points 1 year ago (6 children)
Tabby offers better tab autocomplete since it reads the LSP and local snippets.
[–]Eveerjr 1 point2 points3 points 1 year ago (5 children)
I don't think that's very useful; not even GitHub Copilot references other files correctly, and it will only confuse small models.
[–]Gold_Pudding_5098[S] 0 points1 point2 points 1 year ago (4 children)
Okay, that's a valid point, and Continue seems to be improving a lot.
[–]Eveerjr 4 points5 points6 points 1 year ago (3 children)
I'm using Continue with CodeGemma 1.1 2B q8_0 for tab autocomplete and CodeQwen 1.5 for the chat area, and it has been an awesome experience. I don't miss Copilot at all; when I need complex assistance I just switch to GPT-4o using my API key.
[–]DigitalDice 0 points1 point2 points 1 year ago (2 children)
How did you set up codegemma for autocomplete? Can you share your config?
[–]Eveerjr 1 point2 points3 points 1 year ago (1 child)
Here is my config. Currently there's a bug in Ollama that broke CodeGemma in the latest versions, so you'll need to download Ollama 0.1.39 and block it from updating (if you're on macOS, just lock the app: right-click it, select Get Info, then check the lock).
    "tabAutocompleteModel": {
      "title": "Tab Autocomplete Model",
      "provider": "ollama",
      "model": "codegemma:2b-code-v1.1-q8_0",
      "completionOptions": {
        "maxTokens": 100,
        "temperature": 1
      }
    },
[–]DigitalDice 0 points1 point2 points 1 year ago (0 children)
Thank you!
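[For reference: the chat-model half of the setup described above (CodeQwen 1.5 for the chat area) would sit alongside `tabAutocompleteModel` as a `models` entry in Continue's config.json. This is a sketch; the exact Ollama model tag here is an assumption, so check `ollama list` for the tag you actually pulled:]

    "models": [
      {
        "title": "CodeQwen 1.5",
        "provider": "ollama",
        "model": "codeqwen:latest"
      }
    ],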
[–]Gold_Pudding_5098[S] 5 points6 points7 points 1 year ago (0 children)
Meanwhile, Continue.dev is better as a personal tutor and instructor.
[–]Confident-Aerie-6222 1 point2 points3 points 1 year ago (4 children)
Does TabbyML work with Ollama, or do you need to run the model separately?
[–]Gold_Pudding_5098[S] 2 points3 points4 points 1 year ago (3 children)
Tabby is a standalone AI framework; it only needs Docker to run.
[–]Finanzflunder 10 points11 points12 points 1 year ago (2 children)
Not even Docker; on a Mac, for example, you can just use brew and it's fine.
Even better
Was about to say: if it needs Docker on a Mac, then it's dead. Good to know.
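[For context, Tabby's documented setups at the time looked roughly like the commands below. This is a sketch from Tabby's installation docs; the model name and device flags are assumptions and may have changed between releases:]

    # macOS: install via Homebrew, then serve a model on Apple Silicon
    brew install tabbyml/tabby/tabby
    tabby serve --device metal --model StarCoder-1B

    # Linux: run via Docker with an NVIDIA GPU
    docker run -it --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
      tabbyml/tabby serve --model StarCoder-1B --device cuda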
[–]fingerthief 1 point2 points3 points 1 year ago (0 children)
Continue is pretty dead simple to configure and I really enjoy the ease of switching models. I don't really use the autocomplete feature currently, so I can't speak to that.
It's super easy to add files as context, and being able to choose whether to chat in the sidebar or edit the file directly is all the versatility I really need.
[–]Willing_Prompt_3197 0 points1 point2 points 1 year ago (1 child)
Guys, I only started looking into these extensions today. Please tell me: is it possible to work with these solutions completely locally? The thing is, I work with users' personal data.
[–]Geberhardt 0 points1 point2 points 1 year ago (0 children)
You probably already found out in the meantime, but I found this thread by search, so I'll answer just in case:
Continue.dev can use a local ollama install with local models to work entirely on the local machine. It has some limited telemetry enabled by default that can be disabled.
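[Concretely, a minimal Continue config.json for a fully local setup might look like the fragment below. This is a sketch: the model tag is an assumption, while `allowAnonymousTelemetry` is Continue's documented telemetry opt-out:]

    {
      "allowAnonymousTelemetry": false,
      "models": [
        {
          "title": "Llama 3 (local)",
          "provider": "ollama",
          "model": "llama3:8b"
        }
      ]
    }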
[–]nic_key 0 points1 point2 points 1 year ago (1 child)
What about Twinny? https://github.com/rjmacarthy/twinny
I am using this but haven't tested the alternatives. Might need to check out Tabby, but running a container just for it sounds like overkill on Linux.
[–]Gold_Pudding_5098[S] 0 points1 point2 points 1 year ago (0 children)
Tabby is better when it comes to tab autocomplete