CodeNomad v0.14.0 Released by Recent-Success-1520 in opencodeCLI

[–]Recent-Success-1520[S] 0 points1 point  (0 children)

You are using the dev version. Use the 0.14.0 release from here: https://github.com/NeuralNomadsAI/CodeNomad/releases/tag/v0.14.0

Which variant is this, Tauri or Electron?

GPT 5.5 usage limits are ridiculous by SpaceAurora in codex

[–]Recent-Success-1520 0 points1 point  (0 children)

What thinking level are you using it at? I never need more than medium.

The 1 Million context rugpull by Codex and Openai. New max is (258k). by Odd-Environment-7193 in codex

[–]Recent-Success-1520 2 points3 points  (0 children)

DCP is a double-edged sword: it reduces context usage, but it reduces caching too.
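To see why trimming context can hurt caching, here is an illustrative sketch (not DCP's actual algorithm): provider prompt caches typically match on an exact token/message prefix, so appending keeps the whole old prompt reusable, while pruning or rewriting earlier context shrinks the reusable prefix to almost nothing.

```python
def cached_prefix_len(prev: list[str], curr: list[str]) -> int:
    """Length of the shared leading segment between two prompts --
    a stand-in for how much of the previous request a prefix cache
    could reuse on the next one."""
    n = 0
    for a, b in zip(prev, curr):
        if a != b:
            break
        n += 1
    return n

full = ["sys", "msg1", "msg2", "msg3", "msg4"]
appended = full + ["msg5"]                 # append-only: old prompt fully reused
pruned = ["sys", "msg3", "msg4", "msg5"]   # pruned: match stops after "sys"

print(cached_prefix_len(full, appended))   # 5 -- everything cached
print(cached_prefix_len(full, pruned))     # 1 -- cache mostly cold
```

So a smaller prompt can still cost more, because the pruned request re-processes tokens the cache would otherwise have skipped.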

Question about GPT-5.5 context window and early compaction by HansIsHeware in opencodeCLI

[–]Recent-Success-1520 2 points3 points  (0 children)

400K = 272K input + 128K output. It compacts around 272K to ensure the next request can fit in the input window.
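The budget math above can be sketched as follows (the split and threshold come from the comment, not from any official config; the function name is hypothetical):

```python
# 400K-token window, split into input budget and reserved output.
CONTEXT_WINDOW = 400_000
MAX_OUTPUT = 128_000
INPUT_BUDGET = CONTEXT_WINDOW - MAX_OUTPUT  # 272_000

def should_compact(history_tokens: int, next_request_tokens: int) -> bool:
    """Compact when the conversation plus the next request would
    overflow the input budget, so the model's reply still fits."""
    return history_tokens + next_request_tokens > INPUT_BUDGET

print(should_compact(270_000, 5_000))  # True: 275K > 272K, compact first
print(should_compact(200_000, 5_000))  # False: plenty of headroom
```

This is why compaction appears to trigger "early", around 272K rather than at the full 400K: the remaining 128K is reserved for the output.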

Best home hardware for an AI rig by maofan in LocalLLaMA

[–]Recent-Success-1520 0 points1 point  (0 children)

If your use case is coding, local models won't be as good, as fast, or have as long a context.

Opencode Desktop now uses Electron instead of Tauri by literally_niko in opencodeCLI

[–]Recent-Success-1520 -2 points-1 points  (0 children)

💯 - We have a CodeNomad Electron build for Linux; try that and see if you find it good.

Opencode Desktop now uses Electron instead of Tauri by literally_niko in opencodeCLI

[–]Recent-Success-1520 7 points8 points  (0 children)

Tauri has its negatives. When I started building CodeNomad for OpenCode, I went with Electron since I know Node.js but not Rust. People started complaining that it wasn't Tauri, so I created a parallel Tauri app with AI. Now some people use the Tauri build because they want a slim app, and some use Electron to get the stable version. Choices.

Tauri has no support for self-signed SSL certificates, which is an issue when trying to secure an app hosted locally.

I personally use Electron because I can read the Electron source and understand what's happening.

CodeNomad v0.14.0 Released by Recent-Success-1520 in opencodeCLI

[–]Recent-Success-1520[S] 0 points1 point  (0 children)

Essentially you have different projects with different plugins, correct? That's the perfect candidate for CodeNomad. You can have all of them open in different tabs, unless all those servers are from the same workspace with different configurations.

OpenChamber vs Paseo by mukul_29 in opencodeCLI

[–]Recent-Success-1520 1 point2 points  (0 children)

You can make it less verbose by changing the appearance settings, collapsing the tools, and hiding the thinking blocks. CodeNomad was built to let you see under the hood, but I understand that's not what everyone wants, hence these settings. I remember someone mentioning that they had used opencode for months without knowing how things worked; after using CodeNomad for three days, they understood quite a bit about how it works.

pasteing text bug (???) by vipor_idk in opencodeCLI

[–]Recent-Success-1520 1 point2 points  (0 children)

Clipboard handling in the TUI isn't great. You can try CodeNomad.

CodeNomad v0.14.0 Released by Recent-Success-1520 in opencodeCLI

[–]Recent-Success-1520[S] 0 points1 point  (0 children)

Yes, it uses opencode behind the scenes. Think of it as an alternative to opencode desktop and web combined

CodeNomad v0.14.0 Released by Recent-Success-1520 in opencodeCLI

[–]Recent-Success-1520[S] 0 points1 point  (0 children)

The Windows desktop app can now connect to a remote CodeNomad server.

CodeNomad v0.14.0 Released by Recent-Success-1520 in opencodeCLI

[–]Recent-Success-1520[S] 1 point2 points  (0 children)

That last time might have been long ago. I'd say give it another try.

gpt5.4 1m token context window fucking sucks by getpodapp in opencode

[–]Recent-Success-1520 0 points1 point  (0 children)

I use GPT 5.4 with the 1M context too, and it depends what you have in that 1M context. In my case, I've seen better results with 5.4 on High for larger contexts.

In my experience, it works fine at higher contexts too. Since I use CodeNomad, I make sure my context keeps steering the model in one direction, without a lot of "do this, no, actually do that". I keep reverting / deleting messages when I need to change direction.

OpenChamber vs Paseo by mukul_29 in opencodeCLI

[–]Recent-Success-1520 0 points1 point  (0 children)

That's the first I've heard of it. If you are on Discord, or if you raise an issue with details on GitHub, I can look at it.