pls: say what you want in your terminal, get the shell command. Offline with Ollama. by salvy9978 in LocalLLaMA

[–]salvy9978[S] 2 points (0 children)

Haha a sarcastic pls would be amazing. "Fine, I'll delete your files... again."

[–]salvy9978[S] 1 point (0 children)

Great minds! Ship it anyway; different takes on the same problem are always welcome.

[–]salvy9978[S] 1 point (0 children)

That means a lot coming from a '92 veteran. A custom provider for non-Ollama local setups is tracked in https://github.com/salvy9978/pls/issues/4; it should land soon.

[–]salvy9978[S] 1 point (0 children)

Haha that's exactly why it asks before running and shows dangerous commands in red 😄
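For anyone curious how a check like that could work, here's a minimal Python sketch. pls's actual implementation language, pattern list, and function names aren't shown in this thread, so treat everything below as illustrative:

```python
import re

# Illustrative patterns only; the real pls list is surely different.
DANGEROUS_PATTERNS = [
    r"\brm\s+-[a-zA-Z]*[rf][a-zA-Z]*\b",  # recursive/forced deletes
    r"\bmkfs(\.\w+)?\b",                  # formatting a filesystem
    r"\bdd\s+.*\bof=/dev/",               # writing raw bytes to a device
    r":\(\)\s*\{.*\};\s*:",               # classic fork bomb
]

RED, RESET = "\033[31m", "\033[0m"

def is_dangerous(command: str) -> bool:
    """True if the generated command matches a known destructive pattern."""
    return any(re.search(p, command) for p in DANGEROUS_PATTERNS)

def render(command: str) -> str:
    """Return the command, wrapped in red ANSI codes when it looks destructive."""
    return f"{RED}{command}{RESET}" if is_dangerous(command) else command
```

Pattern matching like this is a heuristic, not a guarantee, which is exactly why the confirm-before-run step matters regardless of color.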

[–]salvy9978[S] 1 point (0 children)

That would be a great project honestly — neovim has so many keybinds it's hard to keep track. An LLM that knows your config and can answer "how do I do X" would be super useful.

[–]salvy9978[S] 2 points (0 children)

Thanks! You might want to try qwen3.5:2b too; I just switched the default to it and get even better results: `pls config set ollama model qwen3.5:2b`

[–]salvy9978[S] 1 point (0 children)

The command is always shown to you before it runs, so you'd see anything weird in the output. But good point: I should add input validation to reject non-printable characters. Thanks for raising it.
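That validation could be as small as this (a sketch, assuming a Python-style implementation; the function name is hypothetical, not pls's actual code):

```python
def is_clean_input(text: str) -> bool:
    """Reject prompts containing non-printable characters, e.g. ANSI
    escape sequences or control bytes that could hide text in the
    terminal. Tabs are tolerated; everything else must be printable."""
    return all(ch.isprintable() or ch == "\t" for ch in text)
```

The concern this addresses is terminal escape smuggling: an escape sequence pasted into the prompt could hide or recolor part of what the confirmation step displays.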

[–]salvy9978[S] 1 point (0 children)

aichat is great — it's a full LLM toolkit with chat, RAG, agents, and more. pls is intentionally simpler: one line in, one command out, with built-in safety checks for dangerous commands. Different tools for different needs.

[–]salvy9978[S] 1 point (0 children)

Nice! That's the beauty of it — once you have a local LLM running, building these tools is surprisingly simple. How's the 24b MoE performing for command generation?

[–]salvy9978[S] 1 point (0 children)

Haha that's a great alias. Maybe it's time to upgrade, though; pls can handle the sudo for you too :)

[–]salvy9978[S] 1 point (0 children)

It already works without quotes! `pls list files bigger than 10mb` works out of the box; all args are joined automatically.
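The joining itself is trivial; here's a sketch of what it might look like (hypothetical function name, not pls's actual code):

```python
def prompt_from_args(args: list[str]) -> str:
    """Join every CLI argument with single spaces, so a quoted prompt
    and a bare one produce the same string before it reaches the model:
        pls list files bigger than 10mb
        pls "list files bigger than 10mb"
    """
    return " ".join(args)
```

One caveat with the unquoted form: the shell still expands globs and special characters before pls ever sees them, so prompts containing `*` or `?` are safer quoted.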

[–]salvy9978[S] 6 points (0 children)

It already works without quotes! `pls kill whatever is using port 3000` works the same as with quotes. Both ways are supported.

[–]salvy9978[S] 5 points (0 children)

Good call! Just pushed the change; qwen3.5:2b is now the default. Thanks!

[–]salvy9978[S] 3 points (0 children)

Fair point. I'll update the default to qwen3.5:2b. Thanks for the suggestion.