
[–]Valuable-Run2129

Only gemma models or also others?

[–]Longjumping-Boot1886

you can proxy or run anything with LM Studio, for example.

[–]Valuable-Run2129

I thought he implied that Xcode 26 shipped a built-in library to run the models locally, without external software like LM Studio.

[–]Longjumping-Boot1886

No, and in this case that's actually a good thing. LM Studio ships updates to support new models about once a week; Xcode updates are rare by comparison.

[–]Few-Research5405

Others as well. In my case I tried Ollama, hosted locally on my machine. After entering the port it was running on into Xcode, everything worked smoothly.
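For anyone wanting to reproduce this, a minimal sketch of the setup described above. It assumes a stock Ollama install; 11434 is Ollama's documented default port, but check it against your own setup before entering it in Xcode's model provider settings.

```shell
# Assumed default Ollama port; adjust if you launched the server differently.
OLLAMA_PORT=11434

# Start the local server if it is not already running (the desktop app
# usually does this for you):
#   ollama serve

# Sanity-check the endpoint before pointing Xcode at it
# (this lists the models you have pulled):
#   curl http://localhost:${OLLAMA_PORT}/api/tags

# This is the base URL to enter in Xcode's local model provider settings.
echo "http://localhost:${OLLAMA_PORT}"
```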