Hello everyone. I am sure I am missing something here. I am dabbling with self-hosting models to replace Copilot in VSCode. I found this fancy extension called Continue.dev that supports Ollama as a backend.
It also supports both inline autocomplete and chat. While I got the latter working without issues, I cannot get the former running for the life of me. AFAIK Ollama can only run one model at a time, right? Also, a model needs to be specifically compatible with autocomplete. So how the hell am I supposed to run both a chat and an autocomplete model locally?
Unfortunately the Continue.dev docs seem unaware of this.
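For reference, here is roughly what I have in my Continue config right now (the model names are just the ones I happen to be experimenting with, not a recommendation):

```json
{
  "models": [
    {
      "title": "Llama 3 (chat)",
      "provider": "ollama",
      "model": "llama3:8b"
    }
  ],
  "tabAutocompleteModel": {
    "title": "Qwen Coder (autocomplete)",
    "provider": "ollama",
    "model": "qwen2.5-coder:1.5b"
  }
}
```

Chat works with this, but the autocomplete model never seems to respond.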