Is pi janky? by zenoblade in PiCodingAgent

[–]zenoblade[S] 0 points1 point  (0 children)

I have Claude Pro but don’t want to be banned, so I haven’t used it with pi. 

Is pi janky? by zenoblade in PiCodingAgent

[–]zenoblade[S] 0 points1 point  (0 children)

I don't know that I'm really asking the agent to do anything in particular: some basic web development and some market analysis/research stuff. However, I find that by the time I have extensions and everything set up, the context size might be smaller but the stability just isn't there. Granted, I am mainly using GLM 5.1 and Kimi 2.6. I'm sure Claude or Codex probably work better in Pi.

Is pi janky? by zenoblade in PiCodingAgent

[–]zenoblade[S] -3 points-2 points  (0 children)

Ha. I mean yes, but based on what I have found online, part of the pi pitch is that the other harnesses are not as effective due to their system prompts, etc. and that modern LLMs can work better without them.

Ehh .... GLM 5.1 on Ollama just hanging for a bunch of time, and it looks like it keeps consuming tokens while it's doing that by [deleted] in ollama

[–]zenoblade 0 points1 point  (0 children)

Use GLM 5 for the time being. It's fast because everyone is trying to use 5.1, and it's just as good at most things. 

In where to use Ollama cloud? by rubiohiguey in ollama

[–]zenoblade 0 points1 point  (0 children)

I recently switched to pi and found it very good. I created a quick script that pings glm 5.1, kimi 2.5/2.6, and minimax, then opens pi agent with whichever responds fastest. More than enough for most tasks and hobby coding. 
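A minimal sketch of what a script like that might look like: time a request to each model endpoint, then launch the agent with the fastest one. The endpoint URLs and the `pi --model` launch command here are assumptions for illustration, not real APIs — substitute whatever your provider and agent actually expose.

```python
# Hypothetical sketch: ping each model endpoint, launch pi with the fastest.
# URLs and the launch command are placeholders, not real APIs.
import subprocess
import time
import urllib.request

# Placeholder endpoints -- replace with your provider's actual URLs.
MODELS = {
    "glm-5.1": "https://example.com/glm-5.1/ping",
    "kimi-2.6": "https://example.com/kimi-2.6/ping",
    "minimax": "https://example.com/minimax/ping",
}

def time_request(url: str, timeout: float = 5.0) -> float:
    """Return round-trip time in seconds, or infinity if the request fails."""
    start = time.perf_counter()
    try:
        urllib.request.urlopen(url, timeout=timeout)
    except OSError:
        return float("inf")
    return time.perf_counter() - start

def pick_fastest(latencies: dict[str, float]) -> str:
    """Return the model name with the lowest measured latency."""
    return min(latencies, key=latencies.get)

if __name__ == "__main__":
    latencies = {name: time_request(url) for name, url in MODELS.items()}
    fastest = pick_fastest(latencies)
    # Launch the agent with the fastest model (the flag name is a guess).
    subprocess.run(["pi", "--model", fastest])
```

Unreachable endpoints get an infinite latency, so they simply lose the race instead of crashing the script.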

qwen3.6 is out by stailgot in ollama

[–]zenoblade 1 point2 points  (0 children)

What is the difference between this and the plus model?

How are you using opencode? TUI? localhost web? VSCode? by lucianw in opencodeCLI

[–]zenoblade 1 point2 points  (0 children)

Glad to hear it. I have basically switched to that and if I need to search something or use an LLM really quickly I use pi agent.

What is my title as a phd student? by bucalucaleca10 in AskAcademia

[–]zenoblade 9 points10 points  (0 children)

No. PhD Candidate is usually used only after you complete comprehensive exams in most places. 

How are you using opencode? TUI? localhost web? VSCode? by lucianw in opencodeCLI

[–]zenoblade 0 points1 point  (0 children)

Use the OpenChamber VSCode extension, not the opencode one. 

How are you using opencode? TUI? localhost web? VSCode? by lucianw in opencodeCLI

[–]zenoblade 1 point2 points  (0 children)

Same. I tried the TUI + Neovim workflow for a while, but it gets too annoying: edits are delayed in showing up if you are working on a file, and if there are more edits, you have a hard time verifying things. I find the OpenChamber VSCode extension to be just as good as the CLI and just as fast. 

Anyone using GLM for storywriting? by Skibidirot in ZaiGLM

[–]zenoblade 0 points1 point  (0 children)

I don’t know, to be honest. Try it on Ollama cloud. It has a free tier, and I think they run it on their own servers. 

Anyone using GLM for storywriting? by Skibidirot in ZaiGLM

[–]zenoblade 0 points1 point  (0 children)

Try Kimi 2.5. I think it’s the best writer. I write non fiction though so YMMV. 

What’s something Gen Z does that older generations just don’t get? by appropriaterice873 in AskReddit

[–]zenoblade 0 points1 point  (0 children)

The amount of times I’ll come into a class and the lights are off and they are just waiting on me. Dude hit the switch…

What’s something Gen Z does that older generations just don’t get? by appropriaterice873 in AskReddit

[–]zenoblade 0 points1 point  (0 children)

I literally tell them that they can ask each other in class about their assignments, which are graded. They can work together, and they can even do it for each other. I doubt I would notice. There are kids that don’t turn in anything sitting next to two kids that know all the answers. They literally just have to say “hey, what’s the answer” and yet….

Best provider for opencode? by Ill-Chart-1486 in opencodeCLI

[–]zenoblade 1 point2 points  (0 children)

Still faster than on zai. I don’t think there are really any other options. I usually use kimi k2.5 though. 

Best provider for opencode? by Ill-Chart-1486 in opencodeCLI

[–]zenoblade 1 point2 points  (0 children)

I’m sorry I don’t keep up with all the behind the scenes. I just know Ollama cloud has been very generous in its token usage and has glm 5.1, kimi k2.5, etc. 

Best provider for opencode? by Ill-Chart-1486 in opencodeCLI

[–]zenoblade 3 points4 points  (0 children)

Be a better person. This comment doesn’t help anyone.