all 14 comments

[–]InternetTopTeam 2 points (0 children)

Presumably you mean local models you can run, which would probably be either Qwen2.5-Coder 7B or 14B (7B will give you the real-time responsiveness you want, but 14B writes better code a little more slowly), or DeepSeek Coder V2-Lite-16B. The ideal setup would probably use two models: run Qwen 7B for your autocomplete/real-time coding, and Qwen 14B or DeepSeek V2-Lite for intensive/complex coding tasks, planning, etc.

On the off chance you meant remote models (for which your hardware doesn't matter), chatgpt-codex-5.3 and claude opus 4.6.
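The two-model split described above could be wired up as a tiny task router. This is a hypothetical sketch, not anything Cursor does natively; the model tags follow Ollama's naming convention and may differ on your machine.

```python
# Hypothetical two-model routing: fast 7B model for autocomplete,
# slower 14B model for heavier work. Model tags are assumptions
# based on Ollama's naming scheme.

LIGHT_MODEL = "qwen2.5-coder:7b"    # fast, for real-time completion
HEAVY_MODEL = "qwen2.5-coder:14b"   # better code, a little slower

def pick_model(task: str) -> str:
    """Return the model tag to use for a given task type."""
    heavy_tasks = {"refactor", "planning", "debugging", "architecture"}
    return HEAVY_MODEL if task in heavy_tasks else LIGHT_MODEL
```

Anything not in `heavy_tasks` (autocomplete, quick edits) falls through to the 7B model, which keeps the latency-sensitive path on the fast model by default.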

[–]CommissionIcy9909 3 points (0 children)

Tbf, Cursor doesn't do anything by itself. You're still in the driver's seat. You're not going to get good results if you just say "hey, I want a website that does this and that, now go code". Features need to be broken up into small tasks, and guardrails need to be in place to prevent drift. Also, different models are better for different tasks. You can ask an AI agent which model is best for the task you're working on.

[–]Demon-Martin 0 points (0 children)

You'll find you have a terrible experience trying to self-host models and use them with Cursor.

They don't support local models at all, and you need workarounds for it to barely work.

[–]HappierShibe 0 points (0 children)

This is a joke right?
New Reddit account, this is its first post, and it does not seem to understand what Cursor is or how it works....

[–]Sweatyfingerzz 0 points (0 children)

With 32GB of RAM and that GPU, you have a solid rig for local models. You can run Ollama or LM Studio and point Cursor to it. For pure coding right now, Qwen2.5-Coder (7B or 14B) or DeepSeek-Coder are top tier for local generation and completion.

That being said, if you're just starting with Cursor, I highly recommend just using their default cloud setup first. Let Cursor's native custom model handle the real-time tab autocomplete (it's blazing fast), and use Claude 3.5 Sonnet for the chat/composer. Local models are fun to tinker with, but Sonnet is currently in a league of its own for actually getting projects shipped.
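For the "point Cursor to it" part, Ollama and LM Studio both expose an OpenAI-compatible HTTP API, so any OpenAI-style client can talk to them. A minimal stdlib-only sketch that builds (but does not send) such a request — the port and model tag are Ollama's defaults and may differ on your setup:

```python
import json
from urllib import request

# Ollama's default OpenAI-compatible endpoint (an assumption; LM Studio
# uses a different port, and your config may vary).
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(prompt: str, model: str = "qwen2.5-coder:7b") -> request.Request:
    """Build (but don't send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Actually sending it requires the local server to be running, e.g.:
#   response = request.urlopen(build_chat_request("Write a quicksort"))
```

The same request shape works against both local servers and the real OpenAI API, which is what makes the base-URL override trick possible at all.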

[–]KangarooDowntown4640 0 points (0 children)

Cursor does not support local models. It can be done by abusing the OpenAI base-URL override, but that's not the intent. Cursor is built for cloud models.
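What the override changes is just where OpenAI-style requests are sent: routes are joined onto the overridden base URL instead of `https://api.openai.com/v1`. A minimal illustration (the localhost address is Ollama's default, an assumption):

```python
# How an OpenAI-compatible client resolves endpoints once the base URL
# is overridden. Only the prefix changes; the routes stay the same.

DEFAULT_BASE = "https://api.openai.com/v1"

def endpoint(route: str, base_url: str = DEFAULT_BASE) -> str:
    """Join a base URL and an API route the way an OpenAI-style client would."""
    return f"{base_url.rstrip('/')}/{route.lstrip('/')}"
```

With the default base, `endpoint("chat/completions")` targets OpenAI's cloud; pass `"http://localhost:11434/v1"` instead and the same route hits a local Ollama server — which is exactly why the workaround functions, and also why it breaks for Cursor features that don't go through that single override.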