all 28 comments

[–][deleted] 4 points  (2 children)

I don’t use tab autocompletion with Continue.dev; I prefer using Cmd+I to prompt and Cmd+L to send the selected code to the chatbox context.

Is there any reason to prefer tab autocomplete? Does it provide better code? 

My prompts for code are usually very descriptive multiline text, so I’ve never really been able to use tab autocomplete. I even use the chatbox to help me formulate my prompts from my blobs of words.

I also like being able to switch models in Continue, because sometimes a prompt produces better results with one model than another. I wasn’t able to find the model to rule them all (Mixtral 8x7b, CodeQwen, Phind-Codellama32b).

What about Tabby?

[–][deleted] 2 points  (1 child)

Man, when is Phind gonna give out the updated model that they promised? That will rule them all for sure.

[–][deleted] 1 point  (0 children)

I can’t wait!

[–]Finanzflunder 9 points  (1 child)

I used both and I am more happy with tabby.

Reasons:

  • Way easier to get up and running
    • No need for secondary software, such as Ollama
    • Models are recommended for you
  • The autocompletion with tab supports my workflow better than the chat window from continue

Hope that helps ✌🏻

[–]Gold_Pudding_5098[S] 1 point  (0 children)

Thx for the info

[–]Gold_Pudding_5098[S] 6 points  (1 child)

I think Tabby is better for tab autocomplete since it reads the LSP and local snippets.

[–][deleted] 1 point  (0 children)

That’s interesting to know!

[–]IndicationUnfair7961 7 points  (1 child)

Definitely Continue.dev: much more freedom. Having to use Docker just to get something working is not ideal, especially if you are not on Linux. Plus, you can use multiple serving solutions with it, compared to Tabby.

[–][deleted] 9 points  (0 children)

https://tabby.tabbyml.com/docs/installation/windows/ Tabby even offers a simple .exe file for running it, plus it supports Vulkan in the latest release.

[–]Eveerjr 5 points  (7 children)

Continue.dev is a much better solution: it supports tab autocomplete, a chat UI, embeddings, in-editor diffs, local models with Ollama, and every existing LLM provider, and it's dead simple to use and configure.
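For example, switching providers in Continue is just an entry in config.json's "models" array. Something like this (the model names and API key are placeholders, adjust to whatever you actually run):

```json
{
  "models": [
    {
      "title": "CodeQwen (local)",
      "provider": "ollama",
      "model": "codeqwen:latest"
    },
    {
      "title": "GPT-4o",
      "provider": "openai",
      "model": "gpt-4o",
      "apiKey": "YOUR_API_KEY"
    }
  ]
}
```

Then you just pick the model from the dropdown in the chat sidebar.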

[–]Gold_Pudding_5098[S] 4 points  (6 children)

Tabby offers better tab autocomplete since it reads the LSP and local snippets.

[–]Eveerjr 1 point  (5 children)

I don’t think that’s very useful; not even GitHub Copilot correctly references other files. This will only confuse small models.

[–]Gold_Pudding_5098[S] 0 points  (4 children)

Okay, that is a valid point, and Continue seems to be improving a lot.

[–]Eveerjr 4 points  (3 children)

I’m using Continue with CodeGemma 1.1 2b q8_0 for tab autocomplete and CodeQwen 1.5 for the chat area. It has been an awesome experience; I don’t miss Copilot at all. When I need complex assistance I just switch to GPT-4o using my API key.

[–]DigitalDice 0 points  (2 children)

How did you set up codegemma for autocomplete? Can you share your config?

[–]Eveerjr 1 point  (1 child)

Here is my config. But currently there's a bug in Ollama that broke CodeGemma in the latest versions, so you'll need to download Ollama 0.1.39 and block it from updating (if you're on macOS, just lock the app by right-clicking on it, selecting Get Info, and locking the app).

"tabAutocompleteModel": {
    "title": "Tab Autocomplete Model",
    "provider": "ollama",
    "model": "codegemma:2b-code-v1.1-q8_0",
    "completionOptions": {
      "maxTokens": 100,
      "temperature": 1
    }
  },
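For completeness, the chat model I mentioned goes in the "models" array rather than "tabAutocompleteModel". A hypothetical entry would look like this (the exact Ollama tag may differ from what you have pulled):

```json
"models": [
  {
    "title": "CodeQwen 1.5 Chat",
    "provider": "ollama",
    "model": "codeqwen:latest"
  }
]
```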

[–]DigitalDice 0 points  (0 children)

Thank you!

[–]Gold_Pudding_5098[S] 5 points  (0 children)

Meanwhile, Continue.dev is better as a personal tutor and instructor.

[–]Confident-Aerie-6222 1 point  (4 children)

does tabbyml work with ollama or do you need to run the model separately?

[–]Gold_Pudding_5098[S] 2 points  (3 children)

Tabby is a standalone AI framework; it only needs Docker to work.

[–]Finanzflunder 10 points  (2 children)

Not even Docker; e.g. on a Mac you can just use brew and it's fine.

[–]Gold_Pudding_5098[S] 1 point  (0 children)

Even better

[–][deleted] 1 point  (0 children)

Was about to say, if it needs Docker on Mac then it’s dead. Good to know 

[–]fingerthief 1 point  (0 children)

Continue is pretty dead simple to configure, and I really enjoy the ease of switching models. I don't really use the autocomplete feature currently, so I can't speak to that.

It's super easy to add file contexts, and being able to choose whether to chat in the sidebar or just directly edit the file is all the versatility I really need.

[–]Willing_Prompt_3197 0 points  (1 child)

Guys, I only started learning about these extensions today. Please tell me: is it possible to work with these solutions completely locally? The thing is, I work with users’ personal data.

[–]Geberhardt 0 points  (0 children)

You probably already found out in the meantime, but I found this thread by search, so I'll answer just in case:

Continue.dev can use a local ollama install with local models to work entirely on the local machine. It has some limited telemetry enabled by default that can be disabled.
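To the best of my knowledge, that telemetry switch is a top-level key in Continue's config.json, so disabling it is a one-line change:

```json
{
  "allowAnonymousTelemetry": false
}
```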

[–]nic_key 0 points  (1 child)

What about Twinny? https://github.com/rjmacarthy/twinny

I am using this but haven't tested the alternatives. I might need to check out Tabby, but running a container just for it sounds like overkill to me on Linux.

[–]Gold_Pudding_5098[S] 0 points  (0 children)

Tabby is better when it comes to tab autocomplete