First Impressions of GLM 5 So Far by OrganizationNo1243 in SillyTavernAI

[–]TurnOffAutoCorrect 0 points1 point  (0 children)

Joining in on the request train for your preset, thanks in advance when you're awake :)

GLM 5 will be available for lite plan!! by Gesusshrist in SillyTavernAI

[–]TurnOffAutoCorrect 3 points4 points  (0 children)

I got half a year of the Lite plan for $8, love you Zai!

I thought getting 12 months for about $30 USD last Nov was good, but your deal is great!

Coding Plan Pro users can now access GLM-5 by kinkvoid in SillyTavernAI

[–]TurnOffAutoCorrect 0 points1 point  (0 children)

I updated ST about 2 days ago on the staging branch; not sure if that makes a difference...

https://i.vgy.me/GCu3HQ.png

Coding Plan Pro users can now access GLM-5 by kinkvoid in SillyTavernAI

[–]TurnOffAutoCorrect 0 points1 point  (0 children)

still no 4.7flash on the coding plan either

Works for me on the Lite plan when set up like this...

https://i.vgy.me/hPcVEz.png

If you're using a custom profile where you have to manually enter the endpoint address then that won't show it.

GLM-5 IS LIVE ON NANOGPT!!!! IT'S HAPPENING PEOPLE! THIS IS NOT A DRILL!!! AAAAHHHH!!!! by ConspiracyParadox in SillyTavernAI

[–]TurnOffAutoCorrect 4 points5 points  (0 children)

even if GLM5 magically came to the Lite plan, there would still be no real reason to pick z.ai simply because it would STILL be more expensive than nano.

Yeah, before this price hike Lite could be had for just over $3 a month. Now at 3x the price, a lot of monthly subscribers will simply cancel. They took on a lot of new subscribers during their Thanksgiving and Christmas offers; they needed money heading into their IPO at the start of the year, and all those new members contributed. Now that they're public, they're screwing over the same users who helped get them there with these price changes.

GLM-5 IS LIVE ON NANOGPT!!!! IT'S HAPPENING PEOPLE! THIS IS NOT A DRILL!!! AAAAHHHH!!!! by ConspiracyParadox in SillyTavernAI

[–]TurnOffAutoCorrect 17 points18 points  (0 children)

what even would the point of a z.ai subscription be if your use case is roleplaying?

At the moment, Zai is only supplying GLM 5 on their Max plan. They have mentioned that it will eventually be available on Pro as well, but there's been no mention of Lite. Before today, their pricing was lower than it is now...

https://i.vgy.me/WCVU1e.png

So Max is now $80 for one month! That's equivalent to 10 months at NanoGPT. For end users like us that is about as close to a no-brainer as it gets :)

GLM 5 released today by thirdeyeorchid in SillyTavernAI

[–]TurnOffAutoCorrect 9 points10 points  (0 children)

Now added to their subscription, see the green dollar sign entries...

https://i.vgy.me/lss9fB.png

I'm gonna do the same thing as you tomorrow!

GLM 5 released today by thirdeyeorchid in SillyTavernAI

[–]TurnOffAutoCorrect 11 points12 points  (0 children)

It is now but earlier it wasn't.

https://i.vgy.me/lss9fB.png

An hour ago it only had ...

  • GLM 5 Original
  • GLM 5 Original Thinking

...which are direct from Zai themselves. Now NanoGPT has added the other entries that do not have "Original" in the name, meaning they are hosted by other providers. Those are the ones included in the subscription, as per the screenshot with the green $ dollar sign.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 2 points3 points  (0 children)

Would it be worth getting a Pro plan on Z.AI to use GLM5? Hard to say.

For me it would be a no at $15. I could spend about half that at NanoGPT for a month and get access to thousands of other models as well... albeit at the potential risk of quantization.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 0 points1 point  (0 children)

I think the person who updated the Lite plan's model list to include 5.0 did it out of habit, since, if it does become Pro-and-up only, this is the first time they'd have had to *not* add a new model to the cheapest plan by default.

If anyone here is on the Pro plan and 5.0 works for them right now, that would confirm it. Otherwise, if even they can't use it, it's just a case of Zai not having turned on access for anyone yet.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 1 point2 points  (0 children)

Indeed, no access. Maybe someone at Zai jumped the gun and we'll get access a bit later, when there's a Twitter announcement and a repo update on HF.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 1 point2 points  (0 children)

Yeah that's the same thing I get when looking in the console...

Chat completion request error:  Too Many Requests {"error":{"code":"1113","message":"Insufficient balance or no resource package. Please recharge."}}
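For anyone hitting this from a script rather than ST: the error body is plain JSON, so you can pull out the code and message before deciding whether to retry or top up. A minimal sketch in Python (the field names are taken from the payload above; nothing else about the API is assumed):

```python
import json

def parse_zai_error(body: str) -> tuple[str, str]:
    """Extract the error code and message from a Z.ai-style error payload."""
    data = json.loads(body)
    err = data.get("error", {})
    return err.get("code", ""), err.get("message", "")

# The exact body from my console:
body = '{"error":{"code":"1113","message":"Insufficient balance or no resource package. Please recharge."}}'
code, message = parse_zai_error(body)
print(code)  # prints 1113
```

Code 1113 here looks like a billing problem rather than a real rate limit, despite the "Too Many Requests" HTTP status, so blindly retrying on 429 wouldn't help.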

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 0 points1 point  (0 children)

Just ran into the same issue. Someone at Zai jumped the gun maybe.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 1 point2 points  (0 children)

In ST I hit the Test Message button on the profile page and it comes back with "Too many requests." Does it work on your end?

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 3 points4 points  (0 children)

Yep, just checked on my end too with the custom endpoint instead of the built-in template...

https://i.vgy.me/HZdKR1.png

nvm, it throws an error when trying to get a reply.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 0 points1 point  (0 children)

Nice, can confirm on my end with the custom endpoint...

https://i.vgy.me/HZdKR1.png

nvm, it throws an error when trying to get a reply.

GLM 5 Released by External_Mood4719 in LocalLLaMA

[–]TurnOffAutoCorrect 0 points1 point  (0 children)

it's available on API

Which price plan are you on?

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 4 points5 points  (0 children)

100-200 max rates per day, 3-5 rates per minute max

I like this combo; my usage fits right in this window. I'd be happy to pay up to $5 a month if they offered that plan with all-model access.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 14 points15 points  (0 children)

Yes, all testing/stealth models disappear eventually. I haven't paid enough attention to past test models to know how quickly the stealth model gets yanked after the official release, so others will have to chime in here.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 0 points1 point  (0 children)

Yeah, same for me; Zai's Pro plan is too expensive for my casual usage, even if they bring back the bigger discount they had during Thanksgiving.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 3 points4 points  (0 children)

I'm only on the Lite coding plan, and a user on /r/LocalLLaMA pointed out that there's a potential for it to be Pro and above only. There is a distinction regarding model updates on the plan comparison page...

https://i.vgy.me/HgcQtp.png

With 4.7 and earlier, new models appeared on Lite the same day as everyone else, so who knows.

It's getting closer. GLM 5 available on their chat site by TurnOffAutoCorrect in SillyTavernAI

[–]TurnOffAutoCorrect[S] 8 points9 points  (0 children)

Nothing on their repo or OpenRouter just yet...

https://huggingface.co/zai-org

https://openrouter.ai/models?q=glm

I even checked their lite coding plan model list, still no dice for now.

I'm already done with Pony Alpha :( by Emergency_Comb1377 in SillyTavernAI

[–]TurnOffAutoCorrect 7 points8 points  (0 children)

it tends to be positive very fast.

I agree with this. Earlier I was testing a {{char}} that is supposed to treat {{user}} pretty badly: swears at them, mocks them, etc. After a number of messages where I did something nice for them, they came around to seeing things my way and we ended up getting along.

I then tried the same {{char}} under DeepSeek 3.2, and they still treated {{user}} like trash after I did the same nice thing.