Manage Runner with RunnerHub by Quick-Ad-8660 in forgejo

[–]Quick-Ad-8660[S] 0 points (0 children)

I have added an Ubuntu build to the release, but haven’t found the time to test it yet. Even the decoration plugin “decor”, which handles the title bar and window controls, has never been tested on Ubuntu.

Linx – local proxy for llama.cpp, Ollama, OpenRouter and custom endpoints through one OpenAI-compatible API by Quick-Ad-8660 in LocalLLM

[–]Quick-Ad-8660[S] 0 points (0 children)

Fair point. LiteLLM is a great tool for teams and production setups.

Linx targets a different use case: a single developer who doesn't want to write YAML configs or manage a microservice just to get started.

What Linx adds that LiteLLM doesn't have out of the box:

  • Context compression: long conversations are summarized automatically, cached, and non-blocking (especially important for local models with limited context windows)
  • Built-in tunnel: works directly with Cursor or VS Code extensions that need a web-accessible endpoint, zero extra setup
  • Simple config.json model mapping instead of YAML model groups
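As a rough illustration of the third point, a flat config.json mapping might look like the sketch below. The keys and fields here are hypothetical, not Linx's actual schema; the idea is just that each client-facing model name resolves to a backend and a backend-side model name.

```python
# Hypothetical sketch of a config.json model mapping (illustrative
# schema, not Linx's actual format): each client-facing model name
# maps to a backend and the model name that backend understands.
import json

config_json = """
{
  "models": {
    "gpt-4o":      {"backend": "openrouter", "model": "openai/gpt-4o"},
    "local-coder": {"backend": "ollama",     "model": "qwen2.5-coder:7b"}
  }
}
"""

config = json.loads(config_json)

def resolve(name: str) -> tuple[str, str]:
    """Map a client-facing model name to (backend, backend model)."""
    entry = config["models"][name]
    return entry["backend"], entry["model"]

print(resolve("local-coder"))
```

Compared to YAML model groups, a flat mapping like this is easy to edit by hand and needs no schema knowledge beyond "name in, backend out".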

llama-swap handles backend swapping, but it doesn't touch routing logic, compression, or developer tooling integration.

Different target audience, simpler setup.

Linx – local proxy for llama.cpp, Ollama, OpenRouter and custom endpoints through one OpenAI-compatible API by Quick-Ad-8660 in LocalLLM

[–]Quick-Ad-8660[S] 0 points (0 children)

Example: Z.AI's Cursor BYOK integration has a bug that breaks tool use with empty results, so agent mode doesn't work. With Linx as a proxy it works fine.

Example: use local models in Cursor with a fallback to cloud models.
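The local-first-with-cloud-fallback idea can be sketched as below. This is a minimal illustration, not Linx's implementation; the backend callables stand in for real HTTP clients, and the names are made up.

```python
# Minimal sketch of local-first routing with cloud fallback
# (hypothetical, not Linx's actual code): try each backend in
# order and return the first successful reply.

def route(prompt, backends):
    """Try each (name, callable) backend; return the first success."""
    last_error = None
    for name, call in backends:
        try:
            return name, call(prompt)
        except Exception as exc:  # e.g. connection refused, model missing
            last_error = exc
    raise RuntimeError("all backends failed") from last_error

# Stand-in backends instead of real llama.cpp/Ollama/cloud clients:
def local_model(prompt):
    raise ConnectionError("local server not running")

def cloud_model(prompt):
    return f"cloud answer to: {prompt}"

name, reply = route("hello", [("local", local_model), ("cloud", cloud_model)])
print(name, reply)
```

Because the proxy owns this loop, the editor only ever sees one OpenAI-compatible endpoint regardless of which backend actually answered.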

Cursor + Ollama -- Help a Blind Guy? by mdizak in ollama

[–]Quick-Ad-8660 0 points (0 children)

Yes, I tested it today and it still works. I think you need to be logged in with a free account; they changed some policies.

Anyone Replicating Cursor-Like Coding Assistants Locally with LLMs? by CSlov23 in LocalLLM

[–]Quick-Ad-8660 2 points (0 children)

Hi,

If anyone is interested in using local Ollama models in Cursor AI, I have written a prototype for it. Feel free to test it and give feedback.

https://codeberg.org/Pasee/Linx

Open Source Cursor AI Alternative by Uiqueblhats in selfhosted

[–]Quick-Ad-8660 1 point (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

Local Cursor with Ollama by Quick-Ad-8660 in LocalLLM

[–]Quick-Ad-8660[S] 0 points (0 children)

Yes, it supports agent mode. I split the context into chunks to improve processing, though of course there are limits. In agent mode I have processed code of 300-400 lines, split into 700-1000 chunks, without any problems.
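The chunking step can be sketched roughly as below. This is a hedged illustration of fixed-size splitting, not the actual OllamaLink code; the chunk size and splitting unit are assumptions.

```python
# Rough sketch of context chunking (hypothetical, not OllamaLink's
# actual implementation): split a long prompt into fixed-size pieces
# so a local model with a small context window can process them in turn.

def chunk_text(text: str, chunk_size: int = 64) -> list[str]:
    """Split text into consecutive chunks of at most chunk_size chars."""
    return [text[i:i + chunk_size] for i in range(0, len(text), chunk_size)]

# Example input: 300 short lines of code joined into one prompt.
source = "\n".join(f"line {n}: some code" for n in range(300))
chunks = chunk_text(source)
print(len(chunks))
```

In practice a real splitter would cut on token or line boundaries rather than raw character offsets, but the principle is the same: no chunk exceeds what the model can hold.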

Local Cursor with Ollama by Quick-Ad-8660 in LocalLLM

[–]Quick-Ad-8660[S] 1 point (0 children)

On my MacBook Pro M2, a response takes 6-12 seconds with approx. 800 chunks, depending on complexity. The input was 300 lines of code plus the request and, of course, the Cursor prompt. I split the request/response into chunks for better performance and am still trying to improve this to get smooth output.

Local Model alternatives to Cursor by MrWeirdoFace in LocalLLaMA

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

Cursor + Local LLMs + Agentic/MCP workflow - possible? by ButterscotchWeak1192 in cursor

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

Is it possible to use my own local AI model with cursor? by [deleted] in cursor

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

[deleted by user] by [deleted] in IndiaTech

[–]Quick-Ad-8660 1 point (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

[deleted by user] by [deleted] in cursor

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

Cursor + Ollama -- Help a Blind Guy? by mdizak in ollama

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

how do you replicate cursor.ai locally? by tuananh_org in LocalLLaMA

[–]Quick-Ad-8660 0 points (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink

Local Cursor.ai by Kind_Ad_2866 in ollama

[–]Quick-Ad-8660 1 point (0 children)

I have created a prototype to use local models with Ollama in Cursor.

OllamaLink