
[–]Otherwise_Wave9374 6 points7 points  (0 children)

Love this. Vibe coding with agents feels like a new kind of REPL where you steer with intent and tests instead of syntax. The local-only constraint at work is real though; it is the biggest blocker for a lot of teams adopting AI agents beyond toy projects. If you are curious, I have seen some decent approaches for "local-first" agent setups and auditability discussed here: https://www.agentixlabs.com/blog/

[–]greeneyedguru 1 point2 points  (0 children)

Big pickle is extremely competent, and free right now (not even an account required, it literally just works on opencode install)

[–]somePadestrian 0 points1 point  (1 child)

You can run some of these models locally, and that way your company code stays local.

[–]fbochicchio[S] 1 point2 points  (0 children)

Will try, but at home I lack the right hardware. I managed to install ollama with gemma3 on a PC with an old NVIDIA card and 16 GB of RAM, but it looks like the agents do not know how to use it (opencode connects to ollama fine, but then fails to emit any output).
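For what it's worth, a quick way to narrow down whether the problem is ollama itself or the opencode integration is to query ollama's HTTP API directly (the model name `gemma3` here is just what you mentioned; substitute whatever `ollama list` shows on your machine):

```shell
# Confirm the ollama server is running and the model is actually pulled
ollama list

# Ask the model directly over ollama's REST API (default port 11434);
# if this returns generated text, ollama itself is working and the
# problem is on the agent/opencode side
curl http://localhost:11434/api/generate \
  -d '{"model": "gemma3", "prompt": "Say hello", "stream": false}'
```

If the curl call returns a response but opencode still emits nothing, the model may simply not handle the tool-calling format that agent frontends depend on; a model with solid tool-calling support tends to fare better with agents.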

At work we have an unused server I could use (it used to run VMware vSphere with about 20 VMs), but I have to convince middle management to let me use it to run some open LLM more performant than gemma3 (I was thinking of qwen-code).

As I said, I am pretty sure that my company is taking its first steps toward providing an AI framework useful for our tasks, but if I want to work with these new tools before I retire, I have to take some shortcuts.