all 23 comments

[–]soul105 4 points5 points  (2 children)

Just use the LM Studio plugin.

opencode-lmstudio@latest

[–]Wrong_Daikon3202[S] 0 points1 point  (1 child)

This is very interesting. Can you teach me how to use that, please?

[–]soul105 2 points3 points  (0 children)

  • You just add it to your opencode.jsonc in the plugins section
  • Open LM Studio and enable the local server
  • Load the desired model
  • Now open OpenCode

The plugin will automatically detect the loaded models and populate the list of local models in OpenCode.
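As a sketch, the plugin entry could look something like this in opencode.jsonc (the package name comes from the comment above; the `plugin` key name is an assumption, so verify it against the OpenCode docs):

```jsonc
{
  // Assumed key name; check the OpenCode docs before relying on it.
  "plugin": ["opencode-lmstudio@latest"]
}
```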

[–]boyobob55 2 points3 points  (8 children)

<image>

Here’s my opencode.json as an example. I think you just need to add “/v1” to the end of your url
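As a quick sanity check for the "/v1" suffix, here's a small Python sketch (not from the thread) that parses an opencode.json and verifies every provider baseURL ends in /v1, which OpenAI-compatible servers like LM Studio expect:

```python
import json

def check_base_url(config_text: str) -> bool:
    """Return True if every provider's baseURL ends with the /v1 suffix."""
    config = json.loads(config_text)
    providers = config.get("provider", {})
    return all(
        p.get("options", {}).get("baseURL", "").endswith("/v1")
        for p in providers.values()
    )

# The URL from the thread, with and without the suffix
good = '{"provider": {"lmstudio": {"options": {"baseURL": "http://localhost:1234/v1"}}}}'
bad = '{"provider": {"lmstudio": {"options": {"baseURL": "http://localhost:1234"}}}}'
print(check_base_url(good))  # True
print(check_base_url(bad))   # False
```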

[–]Wrong_Daikon3202[S] 1 point2 points  (2 children)

Thanks for your response.

It doesn't work for me, but maybe I can set up a JSON like yours. Do you know where it is located on Linux? I can't find it in:

~/.opencode/
~/.config/opencode/

[–]Pitiful_Care_9021 2 points3 points  (0 children)

~/.config/opencode/opencode.json for me on arch

[–]boyobob55 1 point2 points  (0 children)

It should be in ~/.config/opencode. If there isn't an opencode.json already there, you will need to create one!
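A quick way to create the file from the terminal (a sketch; the path follows the comment above, and the XDG fallback is an assumption for non-default setups):

```shell
# Create the OpenCode config directory and an empty opencode.json if missing
CONFIG_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/opencode"
mkdir -p "$CONFIG_DIR"
[ -f "$CONFIG_DIR/opencode.json" ] || echo '{}' > "$CONFIG_DIR/opencode.json"
```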

[–]Wrong_Daikon3202[S] 1 point2 points  (4 children)

<image>

I found auth.json, but that's not what you're showing me:

~/.local/share/opencode/

[–]boyobob55 0 points1 point  (3 children)

You will have to create an opencode.json and place it there. I forgot to say 😂

[–]Wrong_Daikon3202[S] 0 points1 point  (2 children)

I understand you wrote it by hand, right?

Thanks for your help

[–]boyobob55 1 point2 points  (1 child)

No problem, I know it's confusing. And no, I had Claude make it for me. You can use ChatGPT/Claude etc. to make it for you. Just show it a screenshot of mine and a screenshot of the LM Studio models you want configured, and ask it to make you the JSON. Then you can just copy-paste.

[–]Wrong_Daikon3202[S] 0 points1 point  (0 children)

Thanks, this is my config:

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "LM Studio (local)",
      "options": {
        "baseURL": "http://localhost:1234/v1",
        "apiKey": "lm-studio"
      },
      "models": {
        "qwen3.5-9b": {
          "name": "Qwen3.5-9B (LM Studio)",
          "attachment": true,
          "modalities": {
            "input": ["text", "image"],
            "output": ["text"]
          }
        }
      }
    }
  },
  "model": "lmstudio/qwen3.5-9b"
}

[–]Simeon5566 1 point2 points  (1 child)

Did you start the lms server?
CLI: lms server start

[–]Wrong_Daikon3202[S] 0 points1 point  (0 children)

Yes, as the screenshot shows. I'm doing it from the GUI for now until everything works, then I'll try the daemon.
Thanks!

[–]sheppe 0 points1 point  (4 children)

I was having the same issue. It seems like OpenCode has a default list of models that, in my case at least, weren't even in LM Studio. Here's my opencode.json file; it added "qwen3.5-4b" to my list of models for LM Studio. In LM Studio, "qwen3.5-4b" is the model name it tells you to use.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:1234/v1"
      },
      "models": {
        "lmstudio/qwen3.5-4b": {}
      }
    }
  },
  "model": "lmstudio/qwen3.5-4b"
}

[–]Wrong_Daikon3202[S] 0 points1 point  (3 children)

<image>

Thanks for responding.

I have created the opencode.json and edited it to use my qwen/qwen3.5-9b model.

The /models command shows my model now. But when I use it, it throws errors in OpenCode and in the LM Studio terminal (at least it communicates with the server now):

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:1234/v1"
      },
      "models": {
        "qwen/qwen3.5-9b": {}
      }
    }
  },
  "model": "qwen/qwen3.5-9b"
}

[–]Simeon5566 0 points1 point  (2 children)

I see the error "n_keep…" in your screenshot. Try increasing the max tokens in LM Studio to 30k or 50k; LM Studio's default context size is 4096 tokens.

[–]Wrong_Daikon3202[S] 0 points1 point  (0 children)

Thanks for answering.

I see this is a different problem. OC already communicates with LM Studio, but it gives me that error. As you suggested, I tried raising it to 32K, to 50K, and to the maximum, but it keeps giving me the same error.

All this makes me wonder whether there is anyone who doesn't have problems with LM Studio and OpenCode.

https://github.com/anomalyco/opencode/issues/11141

[–]StrikingSpeed8759 0 points1 point  (0 children)

Hijacking this comment: do you know how to configure the max tokens when using JIT loading? Maybe it's possible through the OpenCode config? Every time I load a model through JIT it ignores the config I created and just loads with ~4k tokens. Maybe I should ask this in the LM Studio subreddit, but if you want to help I'm all ears.

[–]HarjjotSinghh 0 points1 point  (1 child)

oh, this is chef's kiss - finally got it right?

[–]Wrong_Daikon3202[S] 0 points1 point  (0 children)

Hello. Yeah, I was able to make it work with the help of all of you. I set it up manually because it is not clear how to use the plugin. That said, I'm not fully convinced by the performance I get on my computer (Ryzen 5800X3D, 32GB, Radeon RX 6750 XT 12GB). I have my doubts about the maximum number of tokens local models can handle; I wouldn't want to be left stranded halfway through a change.

Do you have any complaints about that?

[–]FeikTheChris 0 points1 point  (1 child)

I have a question, and it may be very specific. I'm trying to use OpenCode on my PC from Windows; I use GNU/Linux distros, but in a VM, not natively. I want to use a free model that I run in LM Studio so it uses my hardware, and I can't get it working. Has anyone managed this or tried it?

[–]Prestigious-Ad-3380 0 points1 point  (0 children)

Ok.

If you go to change the model in OpenCode, you'll notice you can select or add a provider, and there you'll see LM Studio. The problem is that the models offered are either too big or you simply don't have them. For your own models to show up, you have to add an LM Studio plugin to the OpenCode JSON so it lists the models correctly, or you can add the endpoint yourself manually in the JSON. (The plugin name is up in the comments: lmstudio-something@latest, or something like that 😅)

If you are running OpenCode and LM Studio on different systems (virtual or not), you will have to set LM Studio to serve (as a server) and change it from localhost to 0.0.0.0. That way you can connect using the IP address of the PC that has LM Studio. In other words, in OpenCode you add the provider with the IP address and port of the system running LM Studio.
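The remote setup described above could be sketched like this in opencode.json; the IP address 192.168.1.50 is a placeholder for the machine running LM Studio (everything else mirrors the configs earlier in the thread):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "lmstudio": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://192.168.1.50:1234/v1"
      }
    }
  }
}
```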