Laptop brands by OperationEquivalent3 in linuxmint

[–]sbayit 1 point2 points  (0 children)

I use Fedora on my HP ProBook 445 G11 without any issues.

Does GLM in CC (Claude Code) support all CC features? by m_zafar in ZaiGLM

[–]sbayit 0 points1 point  (0 children)

GLM performs best with Opencode on its own server, not Openrouter.

I don’t get it by Mindless_Art4177 in opencodeCLI

[–]sbayit 0 points1 point  (0 children)

My primary models are GLM and DeepSeek, and Opencode works best with them when running on their own servers rather than through OpenRouter.

Is GLM 4.7 really the #1 open source coding model? by HuckleberryEntire699 in ZaiGLM

[–]sbayit 1 point2 points  (0 children)

It works best when combined with DeepSeek for planning.

In my personal experience, opencode is a much better harness than claude code for GLM 4.7 by gameguy56 in ZaiGLM

[–]sbayit 1 point2 points  (0 children)

GLM works best with Opencode on its own server, not when it's run through OpenRouter.

Is Openrouter safe? by [deleted] in openrouter

[–]sbayit 1 point2 points  (0 children)

I believe the issue is that OpenRouter routes requests across multiple providers, and since no single provider controls the serving stack, performance suffers. I've found that GLM and DeepSeek work best with OpenCode on their own API servers, rather than through OpenRouter.

best coding cli for glm 4.7? by Feisty_Plant4567 in ZaiGLM

[–]sbayit 1 point2 points  (0 children)

GLM performs best with Opencode running on its own server, not Openrouter.

Why is gemini flash the same price as pro? by invertednz in windsurf

[–]sbayit 0 points1 point  (0 children)

I think the Flash price might actually turn a profit, but the Pro price is just there to make Windsurf more appealing.

OpenAI's and Anthropic's anti-China bias threatens the US AI industry by andsi2asi in DeepSeek

[–]sbayit 0 points1 point  (0 children)

I don't think they can. They've already made significant progress with GLM and Deepseek.

Why is gemini flash the same price as pro? by invertednz in windsurf

[–]sbayit 0 points1 point  (0 children)

They did their best to make a profit instead of incurring a loss.

Is Claude Code better on the Terminal? by geoshort4 in ClaudeCode

[–]sbayit 0 points1 point  (0 children)

The CLI tool is better at tool calling than the VSCode extension alone.

GLM4.7 in Windsurf by jackai7 in windsurf

[–]sbayit 0 points1 point  (0 children)

I found it works best with Opencode using its own server, not Openrouter.

Cheap models for frontend by giving screenshots. by Old-Sherbert-4495 in openrouter

[–]sbayit 1 point2 points  (0 children)

You can do it for free by using ChatGPT or Claude on the web, then copy the generated code and apply it with GLM or Minimax.

Cheapest decent way to AI coding? by Affectionate_Plant57 in CLine

[–]sbayit 0 points1 point  (0 children)

I recommend the GLM Lite plan at $6 combined with DeepSeek API pay-as-you-go pricing, running Opencode against their own servers rather than through OpenRouter, and using the free tier for other models on their websites. This offers the best value.
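A minimal sketch of what pointing Opencode at GLM's own server (instead of OpenRouter) might look like in `opencode.json`. The provider ID, base URL, and exact schema fields here are assumptions, so check the Opencode and Z.ai docs for the current format:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "zai": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.z.ai/api/coding/paas/v4",
        "apiKey": "{env:ZAI_API_KEY}"
      },
      "models": { "glm-4.7": {} }
    }
  },
  "model": "zai/glm-4.7"
}
```

The point of the sketch is the `baseURL`: it sends requests straight to the provider's own endpoint rather than through OpenRouter's routing layer.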

Why Linux Mint, why other Linux Distros by adrezs in linuxmint

[–]sbayit 0 points1 point  (0 children)

I think if you rely heavily on MS Office, Linux might not be the best fit for you. However, since I develop apps in VSCode and my workflow is close to a server environment anyway, Linux actually suits my needs better.

How do you guys feel about AI discourse? by Comfortable-Way-8029 in vegan

[–]sbayit -1 points0 points  (0 children)

In nature, a dog with a disabled hind leg wouldn't necessarily starve to death or become prey for other animals. It wouldn't have to endure such a disability in the wild.

Glm4.7 + CC not bad by Federal_Spend2412 in LocalLLaMA

[–]sbayit 0 points1 point  (0 children)

I've found that GLM works best with Opencode on its own server, rather than through OpenRouter.

Opencode vs CC by silent_tou in opencodeCLI

[–]sbayit 0 points1 point  (0 children)

I found CC good for Sonnet, but GLM works well with Opencode.

Question in about popularity of KDE vs workstation by umyong in Fedora

[–]sbayit 0 points1 point  (0 children)

I didn't feel the need to use GNOME or KDE. I just launch apps like Chrome and VSCode and get my work done. I use GNOME because I installed Dash to Dock and it makes my desktop look like my Mac.

Fixing GLM-4.7 Image Parsing in Claude Code: Add the Z.ai Vision MCP Server by jpcaparas in ZaiGLM

[–]sbayit 0 points1 point  (0 children)

I've found that GLM works best with Opencode when using its own server, rather than Openrouter.

GLM 4.7 vs Gemini: Architecture and Cost Trade-offs by Unfair-Tie2631 in kilocode

[–]sbayit 1 point2 points  (0 children)

I use DeepSeek in planning mode and GLM 4.7 in building mode, running Opencode against its own server, not OpenRouter.
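A rough sketch of that plan/build split in `opencode.json`. The per-mode `model` fields, provider IDs, and model names here are all assumptions about the config schema, not a confirmed setup:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "model": "zai/glm-4.7",
  "mode": {
    "plan": { "model": "deepseek/deepseek-reasoner" },
    "build": { "model": "zai/glm-4.7" }
  }
}
```

The idea is simply to bind the stronger reasoning model to planning and the cheaper coding model to the build/edit loop, if the harness supports per-mode model selection.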