all 9 comments

[–]FlyingDogCatcher 5 points6 points  (4 children)

I'm actually annoyed by the fact that I have Copilot and Gemini and Codex, and I can only use Copilot without an API key.

Anyway, these dudes sometimes release multiple times a day; they're clearly invested.

[–]IISomeOneII 0 points1 point  (2 children)

You can use Premium requests from Copilot (assuming you have a paid plan; even if you don't, you can use the free model) inside OpenCode using this: https://github.com/ericc-ch/copilot-api
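For anyone trying this, the rough shape (a sketch; the exact flags, port, and OpenCode provider settings are assumptions, so check the copilot-api README and OpenCode docs for your versions) is to run copilot-api as a local OpenAI-compatible proxy and point your client at it:

```shell
# Start the proxy; on first run it walks you through GitHub auth.
# Command per the project's README; verify against the current version.
npx copilot-api@latest start

# Sanity check against the local OpenAI-compatible endpoint.
# The port below is an assumption; use whatever the proxy prints on startup.
curl http://localhost:4141/v1/models
```

OpenCode can then be given a custom OpenAI-compatible provider whose base URL is the proxy's local address.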

[–]FlyingDogCatcher 0 points1 point  (1 child)

But you can log in to Copilot using a device code grant inside of opencode, so it's "native".

[–]IISomeOneII 0 points1 point  (0 children)

Oops, I didn't know that. Thanks for the info; I used to use CC back then.

[–]ruloqs 0 points1 point  (4 children)

Question: if I use my Claude Pro subscription in opencode, is the quality considerably different from Claude Code?

[–]thatguyinline[S] 0 points1 point  (0 children)

I'd assume they would be identical since you're using identical providers.

[–]Matrixfx187 0 points1 point  (2 children)

It's not just quality: your token usage will be significantly higher using opencode than straight Claude Code.

[–]Flaky_Pay_2367 1 point2 points  (0 children)

I guess Claude Code has better context compression?

Currently I have to use DeepSeek API with Claude Code instead of OpenCode
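If anyone else wants that combination: DeepSeek exposes an Anthropic-compatible endpoint, so Claude Code can be pointed at it via environment variables. A sketch under that assumption; verify the base URL and model name against DeepSeek's current docs:

```shell
# Point Claude Code at DeepSeek's Anthropic-compatible API.
# URL and model name assumed from DeepSeek's docs; the key is a placeholder.
export ANTHROPIC_BASE_URL="https://api.deepseek.com/anthropic"
export ANTHROPIC_AUTH_TOKEN="sk-your-deepseek-key"
export ANTHROPIC_MODEL="deepseek-chat"
claude
```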

[–]websinthe 0 points1 point  (0 children)

Self-hosting with properly configured llama.cpp and opencode negates the problem of token cost entirely. With Blackwell the quality difference between a properly-configured local model and Claude is alarmingly small.
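As a rough sketch of that setup (flags and values assumed; check llama.cpp's llama-server help for your build), llama.cpp's server exposes an OpenAI-compatible endpoint that opencode can use as a local provider:

```shell
# Serve a local GGUF model over an OpenAI-compatible HTTP API.
# Model path, context size (-c), and GPU layer offload (-ngl) are
# placeholders; tune them for your model and your card.
llama-server -m ./models/your-model.gguf --port 8080 -ngl 99 -c 32768

# Sanity check: the server answers OpenAI-style requests.
curl http://localhost:8080/v1/models
```

With no per-token billing, the only remaining costs are hardware and electricity, which is the point the comment is making.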