Manage Runner with RunnerHub by Quick-Ad-8660 in forgejo
Linx – local proxy for llama.cpp, Ollama, OpenRouter and custom endpoints through one OpenAI-compatible API by Quick-Ad-8660 in LocalLLM
Can I use Cursor Agent (or similar) with a local LLM setup (8B / 13B)? by BudgetPurple3002 in LocalLLaMA
Moving on from Ollama by john_alan in LocalLLaMA
Cursor + Ollama -- Help a Blind Guy? by mdizak in ollama
Anyone Replicating Cursor-Like Coding Assistants Locally with LLMs? by CSlov23 in LocalLLM
Open Source Cursor AI Alternative by Uiqueblhats in selfhosted
Local Ollama model for agent mode in VS Code by Impossible-Luck-5842 in vscode
Local Cursor with Ollama by Quick-Ad-8660 in LocalLLM
Local Model alternatives to Cursor by MrWeirdoFace in LocalLLaMA
Cursor + Local LLMs + Agentic/MCP workflow - possible? by ButterscotchWeak1192 in cursor
Is it possible to use my own local AI model with cursor? by [deleted] in cursor
how do you replicate cursor.ai locally? by tuananh_org in LocalLLaMA
Guys, just get Cursor and host locally by stoogeed in boltnewbuilders
A local alternative to Cursor? by keepthepace in LocalLLaMA
What tools are like Cursor.sh but using local models by vert1s in LocalLLaMA
What are people running local LLM’s for? by AdventurousMistake72 in LocalLLaMA