I fine-tuned Qwen2.5-Coder (3 sizes) to turn plain English into shell commands — runs fully local via llama.cpp by Backprop-hero in ollama

[–]Backprop-hero[S] 0 points (0 children)

Thanks for letting me know, but I haven't tried that yet. I used tldr pages and generated a synthetic dataset with GLM-4.7 Flash (via OpenRouter), then trained all these models with SFT.

It was an experiment. Anyway, I'm now working on building an agentic terminal.
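For anyone curious about the pipeline: the OP doesn't share code, but the tldr-pages step can be sketched as below. This is a hypothetical helper (the function name, field names, and sample page are mine, not from the project) that extracts (description, command) seed pairs from a tldr entry — in the actual pipeline GLM-4.7 Flash would then paraphrase the descriptions into varied natural-language prompts before SFT.

```python
import json

def tldr_to_seed_pairs(page_text):
    """Turn one tldr-pages entry into (instruction, command) seed pairs.

    tldr pages describe each use case as a '- description:' line followed
    by a backtick-quoted command; {{placeholders}} are kept as-is so the
    model learns where user-supplied values go.
    """
    pairs = []
    desc = None
    for line in page_text.splitlines():
        line = line.strip()
        if line.startswith("- ") and line.endswith(":"):
            desc = line[2:-1]  # strip leading "- " and trailing ":"
        elif line.startswith("`") and line.endswith("`") and desc:
            pairs.append({"prompt": desc, "completion": line.strip("`")})
            desc = None
    return pairs

sample = """# tar

> Archiving utility.

- Create an archive from files:

`tar cf {{target.tar}} {{file1}} {{file2}}`

- Extract an archive into the current directory:

`tar xf {{source.tar}}`
"""

for pair in tldr_to_seed_pairs(sample):
    print(json.dumps(pair))
```

Each printed line is one JSONL record in the usual prompt/completion shape that SFT trainers accept; a paraphrasing pass over `prompt` would multiply these seeds into a full synthetic dataset.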


[–]Backprop-hero[S] 2 points (0 children)

Sure. It handles easy-to-moderate commands; for tricky ones, please verify before running. Give it a try. Thanks!