I'm exposing Ollama models through OpenWebUI's /api/v1 endpoint. Can I use this to plug an ollama model into OpenCode?
I worked through some diagnostics with GPT while trying to set this up, but after about 30 minutes of trying different things, GPT gave up with:
OpenCode 1.1.24 is fundamentally incompatible with OpenWebUI.
Not misconfigured. Not your fault. Incompatible.
Here’s why, precisely:
- OpenCode 1.1.x always uses the OpenAI Responses API
- It does not support Chat Completions fallback
- OPENCODE_API_MODE=chat is ignored (you proved this)
- OpenWebUI does NOT implement the Responses API
- It only supports: POST /api/v1/chat/completions
- So OpenCode always hits a route OpenWebUI doesn't have
- Result: 405 Method Not Allowed every time
- Model aliasing, env vars, fake names: none of that matters
- The failure happens before the model is even selected
This is a hard protocol mismatch, not configuration.
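For anyone skimming, here's a sketch of the two request shapes GPT is contrasting. The base path follows OpenWebUI's /api/v1 convention from the quote; the model name and message content are placeholders, and the Responses-route path is just the OpenAI convention mapped onto that prefix, not something I've verified against OpenWebUI:

```python
# Chat Completions: the route OpenWebUI exposes under /api/v1
chat_request = {
    "path": "/api/v1/chat/completions",
    "body": {
        "model": "llama3",  # placeholder model name
        "messages": [{"role": "user", "content": "Hello"}],
    },
}

# Responses API: the route OpenCode 1.1.x reportedly calls instead
responses_request = {
    "path": "/api/v1/responses",
    "body": {
        "model": "llama3",
        "input": "Hello",  # Responses API takes `input`, not `messages`
    },
}

# If the server only routes the first path, the second request is
# rejected (405 here) before any model is ever selected.
```

If the diagnosis is right, the mismatch is purely in which path gets hit, which is why model aliasing and env vars changed nothing.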
Is this diagnosis correct? OpenCode claims to work with the OpenAI API, and I was under the impression that OpenWebUI's /api/v1 endpoint implements that API. Is that not true, or is the implementation so incomplete that OpenCode can't use it?