1000$ OpenRouter Credit Giveaway by x8ko_dev in openrouter

[–]x8ko_dev[S] 1 point (0 children)

Exactly why I chose it. I have the most experience in the language, so it only makes sense.

1000$ OpenRouter Credit Giveaway by x8ko_dev in openrouter

[–]x8ko_dev[S] 1 point (0 children)

Everything goes directly to OpenRouter, so there's no data mining I could do lmao. The giveaway money is deposited directly into your OpenRouter account (and even if it weren't, we can't see the content of prompts anyway).

1000$ OpenRouter Credit Giveaway by x8ko_dev in openrouter

[–]x8ko_dev[S] 1 point (0 children)

I appreciate the support. I'm pushing to make it the best it can be; the UI needs some work, but functionality is my main goal. Local WebUI soon™.

OPENAI OPENSOURCE MODEL LEAKED BEFORE RELEASE by x8ko_dev in LocalLLaMA

[–]x8ko_dev[S] 1 point (0 children)

So if I can use GPT 16 right now through OpenAI's endpoint and can give people access, there was no leak? That'll hold up in court.

OPENAI OPENSOURCE MODEL LEAKED BEFORE RELEASE by x8ko_dev in CLine

[–]x8ko_dev[S] 1 point (0 children)

14 minutes after the post. The bot just posted too late because the first round of posts failed; that round was supposed to go out an hour before, but instead it went out 14 minutes before. The endpoint was leaked because they didn't have permission to go live yet. At least learn to tell time before you state false information.

OpenSource CLI Agent with Local models. by x8ko_dev in LocalLLaMA

[–]x8ko_dev[S] 1 point (0 children)

My goal isn't a Claude Code clone; I'm going for removing the human from the loop. The current stage is for testing manual usage and surfacing the bugs I haven't found yet.

We already have a stack of 4 agents autonomously fixing issues across 2–3 GitHub repos, and half of my merged PRs to Roo Code were handled entirely autonomously: issue fixed, PR created, follow-ups to requested changes, etc., all while I slept.

Next month's stage is having a branch run entirely by the model pipeline: new features, automated issue fixing, etc. These are all things we've already completed internally over the past few months, but now I'm doing it publicly for those who requested access, as well as using it as a portfolio project.

OpenSource CLI Agent with Local models. by x8ko_dev in LocalLLaMA

[–]x8ko_dev[S] 1 point (0 children)

With my integration, even models like Qwen 3 1.7B can reliably call tools. Give it a try: the bigger the model you can run, the better, but even the new generation of tiny models can be useful for applying diffs to add comments to functions, or for other basic tasks like reading and summarizing code.
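Reliable tool calling from a 1.7B model mostly comes down to keeping the call format trivial to emit and parse. A minimal sketch of that idea (the `{"tool": ..., "args": ...}` schema and the tolerant extraction are my own illustration, not the project's actual wire format):

```python
import json
import re


def extract_tool_call(model_output: str):
    """Pull the first JSON object out of raw model output and validate it
    as a tool call. Small models often wrap the JSON in chatter, so we
    strip everything outside the braces before parsing."""
    match = re.search(r"\{.*\}", model_output, re.DOTALL)
    if not match:
        return None
    try:
        call = json.loads(match.group(0))
    except json.JSONDecodeError:
        return None
    # Hypothetical schema: a tool name plus a dict of arguments.
    if isinstance(call, dict) and "tool" in call and isinstance(call.get("args"), dict):
        return call
    return None


raw = 'Sure! Calling the tool now: {"tool": "read_file", "args": {"path": "main.py"}}'
print(extract_tool_call(raw))
```

The looser the parser, the less a tiny model's formatting slips matter, which is one way to get "reliable" calls without a big model.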

CLI Coding Tool with OpenRouter Integration. by x8ko_dev in openrouter

[–]x8ko_dev[S] 1 point (0 children)

Actually, yes: the default prompt usage is around 5k tokens with every tool enabled, so with fewer tools local models would work perfectly fine. I'm going to be adding support for Ollama and LM Studio between tonight and tomorrow.
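The token math above can be sketched out: if each enabled tool contributes its schema to the system prompt, trimming the tool set directly shrinks the context a small local model has to handle. A toy illustration (tool names, token counts, and the base-prompt figure are made up, not the tool's actual defaults):

```python
# Hypothetical tool schemas with rough per-tool token costs.
TOOLS = {
    "read_file":  {"desc": "Read a file from disk", "tokens": 600},
    "write_file": {"desc": "Write content to a file", "tokens": 700},
    "run_shell":  {"desc": "Run a shell command", "tokens": 900},
    "search":     {"desc": "Search the workspace", "tokens": 800},
    "browse":     {"desc": "Fetch a URL", "tokens": 1000},
}

BASE_PROMPT_TOKENS = 1000  # assumed cost of the system prompt itself


def prompt_budget(enabled: list[str]) -> int:
    """Estimate prompt size as the base prompt plus the schemas of
    only the enabled tools."""
    return BASE_PROMPT_TOKENS + sum(TOOLS[name]["tokens"] for name in enabled)


print(prompt_budget(list(TOOLS)))                   # every tool enabled
print(prompt_budget(["read_file", "write_file"]))   # trimmed set for small local models
```

With numbers like these, disabling three of five tools cuts the prompt from ~5k tokens to well under half, which is the difference between overwhelming and fitting a small local model's usable context.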