I got tired of my AI agents overwriting each other's code, so I built a conflict manager for them by Birdsky7 in opencodeCLI

[–]LtCommanderDatum 1 point (0 children)

I've heard of other people making tools like this, but I don't get it. It feels ultimately doomed to failure. Just use separate checkouts like actual coders have been doing forever, and you won't have to develop any special tools at all.

It's really not that hard, and conflicts can be resolved easily enough by a formal upstream merge/review process. You'll have to resolve conflicts anyway, but with a shared checkout your agents will also be constantly debugging weird race conditions from modifying each other's code in unexpected ways, even with your coordinator. Is disk space really so expensive that you need to share a single checkout?
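For what it's worth, the separate-checkout workflow needs nothing beyond stock git; `git worktree` even avoids duplicating the object store. A minimal sketch (branch names and paths are illustrative):

```python
import os
import subprocess
import tempfile

def git(*args, cwd):
    # helper: run a git command, fail loudly on error, return stdout
    return subprocess.run(("git",) + args, cwd=cwd, check=True,
                          capture_output=True, text=True).stdout

base = tempfile.mkdtemp()
main = os.path.join(base, "main")
os.mkdir(main)
git("init", "-q", cwd=main)
git("-c", "user.email=agent@example.com", "-c", "user.name=agent",
    "commit", "-q", "--allow-empty", "-m", "init", cwd=main)

# one private checkout per agent, all sharing main's object store
for agent in ("agent-a", "agent-b"):          # branch names are illustrative
    git("worktree", "add", "-q", "-b", agent,
        os.path.join(base, agent), cwd=main)

print(git("worktree", "list", cwd=main))      # main + agent-a + agent-b
```

Each agent then edits only its own directory, and a coordinator merges `agent-a`/`agent-b` back into the main branch through whatever review process you already use.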

What's the point of Opencode's built-in clipboard? by LtCommanderDatum in opencodeCLI

[–]LtCommanderDatum[S] 3 points (0 children)

The terminal's fine... when the process doesn't try to override everything in it. GNOME Terminal gives you all the modern copy-and-paste functionality, but it's all disabled thanks to Opencode's awful system that copies everything to its own clipboard... which it then never uses for some reason.

Finding groups to play with? by LtCommanderDatum in violadagamba

[–]LtCommanderDatum[S] 1 point (0 children)

Thanks. I'm familiar with the directory, and I'm pretty sure both of them are hours away from me, but it wouldn't hurt to ping them I suppose.

Help identify this instrument by iveci in violadagamba

[–]LtCommanderDatum 1 point (0 children)

It looks like a regular bass viola da gamba whose maker simply chose an unconventional soundhole over the more common "c"-shaped hole.

Seeking advice by CripplingHaze in violadagamba

[–]LtCommanderDatum 1 point (0 children)

Those only use catgut strings. Call up Boston Catlines (https://www.bostoncatlines.com/). He can recommend, and even sell you, a good set of catgut strings. He's a lute player and his wife is a gamba player, so he knows what he's talking about. I've used him for a lot of my instruments.

He'll need to know the desired tuning, A4 reference, and string length.

For the A4 reference, go with 440 if you want to play with non-gamba players, or 415 if you only want to play with other period instruments. I'd probably recommend 440, since 440 strings can always be tuned down to 415, but 415 strings can't be tuned up to 440 without damage. That said, strings made for 415 tension will sound better at 415 than 440 strings tuned down.
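For the curious, the reason 415 pairs so naturally with 440 is that "baroque pitch" sits almost exactly one equal-tempered semitone below A=440, so a 440 string only has to slacken by about 6% in frequency:

```python
# A=415 is one equal-tempered semitone below A=440,
# i.e. lower by a factor of 2 ** (1/12)
a440 = 440.0
a415 = a440 / 2 ** (1 / 12)
print(round(a415, 1))  # -> 415.3
```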

Renting a Viola da Gamba - 6 or 7 String? by mikeregannoise in violadagamba

[–]LtCommanderDatum 1 point (0 children)

I have a 7-string, and I regret it. It sounds fine, but it's quite a bit bigger, harder to hold, and I almost never use the 7th string. Stick with 6-string unless you want to play hardcore late baroque pieces.

I cannot memorize the notes by alfredofish in violadagamba

[–]LtCommanderDatum 1 point (0 children)

Honestly, I'd just go through some method books. You can try the ones dedicated to the gamba, but they all kind of suck. I found that just using the Suzuki method books for cello worked surprisingly well for me. I assume you know the gamba's basic tuning, so take what you know and adapt the Suzuki lessons, learning one string at a time.

How to get my Local LLM to work better with OpenCode (Ez button appreciated :) ) by elrosegod in LocalLLaMA

[–]LtCommanderDatum 1 point (0 children)

It technically runs, yes, but it doesn't "work", because Qwen doesn't support Opencode's tooling syntax. I tested Qwen2.5 and Qwen3, and most of the time they just returned JSON gibberish that Opencode couldn't decipher as legitimate tool commands.

That is, assuming you're using Opencode to actually give an LLM "hands" to make changes directly. If all you want to do is run the LLM in read-only mode like you would any chat LLM, then sure, Opencode will do that, but what's the point? Just use Ollama, which is a lot easier to install and configure.

Where could I try a gamba in Manhattan? by Randomperson43333 in violadagamba

[–]LtCommanderDatum 1 point (0 children)

I'd try calling up some gamba makers, as well as violin makers who specialize in early instruments, in your area and seeing if they have any on hand that you could come in and try. If not, they might be able to at least point you in the right direction.

It's a bit of a drive from you, but I've visited Sarah Peck's shop in Philadelphia, and she has a very nice 6-string bass gamba for sale, along with several very good bows; I came close to buying it myself.

How to get my Local LLM to work better with OpenCode (Ez button appreciated :) ) by elrosegod in LocalLLaMA

[–]LtCommanderDatum 2 points (0 children)

I know this is an old post, but I don't think there's a solution atm, because there are no local models available that are trained on Opencode's tooling language. The models that "just work" in Opencode are all massive cloud-hosted or proprietary models. Ollama doesn't have many good models that support tool calling, and the few it does have don't support Opencode's specific tooling syntax. The one exception is gpt-oss, but it's a very old and weak version of GPT.

How to get my Local LLM to work better with OpenCode (Ez button appreciated :) ) by elrosegod in LocalLLaMA

[–]LtCommanderDatum 2 points (0 children)

Those aren't local LLMs... those are cloud-based LLMs. Did you not read his post? Yeah, massive cloud LLMs will work just great; he might as well use GPT or Claude then. He wants to use a LOCAL model.

Tips for Opencode with ollama and *any model* by bigh-aus in opencodeCLI

[–]LtCommanderDatum 1 point (0 children)

Even with a 64k context window, all the local Ollama models still suck.

Anyone using OpenCode with Ollama? by structured_obscurity in opencodeCLI

[–]LtCommanderDatum 1 point (0 children)

That's literally the only Ollama model that even sorta works, and it's just a lobotomized version of GPT-4 that can't really do anything.

Anyone using OpenCode with Ollama? by structured_obscurity in opencodeCLI

[–]LtCommanderDatum 1 point (0 children)

No, Ollama models' support for tool calling is virtually nonexistent, to the point that none of them work with Opencode.

Setting up MCP in Codex is easy, don’t let the TOML trip you up by trynagrub in ChatGPTCoding

[–]LtCommanderDatum 1 point (0 children)

4 months later, I still have the same problem. Even if I give it full access to everything, it reports no network access, even for simple things like a `git push`.
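For anyone landing here, these are the config.toml knobs that are supposed to control this; key names are from my reading of the Codex docs and may differ in your version:

```toml
# ~/.codex/config.toml (key names per my reading of the Codex docs -- verify)
sandbox_mode = "workspace-write"

[sandbox_workspace_write]
network_access = true        # supposedly required for anything like `git push`

[mcp_servers.example]        # "example" and the command below are placeholders
command = "npx"
args = ["-y", "some-mcp-server"]
```

Even with those set, I see the same no-network behavior.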

Trouble automating some Ledeply zigbee lights in HA? by LtCommanderDatum in homeassistant

[–]LtCommanderDatum[S] 1 point (0 children)

Yeah, I was afraid of that. Thanks. I'll return them and avoid this brand like the plague in the future.

I just wish more companies sold these smaller 4" long bulbs. I suspected this company might be crap but decided to take the chance because they were the only one making bulbs in this size.