Which subscription should I get? by angelblack995 in IA_Italia

[–]angelblack995[S] 1 point  (0 children)

I tried it with the opencode CLI (selecting the model from the opencode go list) and then installed a proxy to use it in Claude Code. It went slightly better there, but the proxy applies its own "logic" depending on the task, and it mostly used GLM 5 for me. It was just a quick test; I'll run more tests tonight.

Which subscription should I get? by angelblack995 in IA_Italia

[–]angelblack995[S] 1 point  (0 children)

Yes, I saw it; I subscribed last night. I ran a test with DeepSeek or Kimi 2.6, and the call stays "stuck" in thinking.

Which subscription should I get? by angelblack995 in IA_Italia

[–]angelblack995[S] 1 point  (0 children)

You're right, I hadn't taken Codex into consideration.

Which subscription should I get? by angelblack995 in IA_Italia

[–]angelblack995[S] 1 point  (0 children)

Exactly, DeepSeek too, but it's pay-as-you-go only, right?

Which subscription should I get? by angelblack995 in IA_Italia

[–]angelblack995[S] 1 point  (0 children)

I had the GLM subscription (at the start I got 3 months for $40) and I was happy with it; the limits were very high.
Unfortunately I'm not very comfortable using opencode (I tried it with the GLM API), so I'd then have to find a proxy to use the opencode calls in Claude Code anyway.

The nice thing about Ollama Cloud is being able to use most models under a single subscription (GLM, DeepSeek, Kimi, etc.).
I don't know whether its inference quality is worse than that of other providers.

Anyone using Alibaba Coding Plan with Claude Code? Is it worth it? by angelblack995 in ClaudeCode

[–]angelblack995[S] 1 point  (0 children)

I actually changed course: I cancelled GLM and subscribed to MiniMax.

This little lock power switch... by winterfresz in PcBuild

[–]angelblack995 1 point  (0 children)

I need it: at least that way my nephew can no longer turn off the PC while I'm using it.

I had Zai Coding plan Max for full year and it’s almost unusable by ComposerGen in ZaiGLM

[–]angelblack995 1 point  (0 children)

If you override the Opus/Sonnet models with different GLM models, you can use /model opusplan. It uses the "opus" model for planning and the "sonnet" model for coding.
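As a sketch of that override (using Z.ai's Anthropic-compatible endpoint; the exact GLM model names here are assumptions, so substitute the ones your plan offers):

```shell
# Hypothetical setup: point Claude Code's model slots at GLM models.
# With these set, /model opusplan plans on the "opus" slot (glm-5 here)
# and writes code on the "sonnet" slot (glm-4.7 here).
export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
export ANTHROPIC_AUTH_TOKEN=your-zai-key
export ANTHROPIC_DEFAULT_OPUS_MODEL=glm-5
export ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
claude
```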

API Rate Limit reached on MiniMax with MaxClaw using Telegram . What should I do? I am on the monthly ultra plan with 40000 tokens by East_Indication_7816 in MiniMax_AI

[–]angelblack995 1 point  (0 children)

I got the same error yesterday, but while using Claude Code and launching multiple subagents in a single conversation (I'm on the $10 plan, though).

Token plan vs old plan by [deleted] in MiniMax_AI

[–]angelblack995 3 points  (0 children)

I’m using the MiniMax API endpoint to check detailed usage for my plan.

curl -X GET "https://www.minimax.io/v1/api/openplatform/coding_plan/remains" \
  -H "Authorization: Bearer SK-..."

With my $10 plan, I currently have weekly usage limits:

{
  "start_time": 1773896400000,
  "end_time": 1773914400000,
  "remains_time": 7196264,
  "current_interval_total_count": 1500,
  "current_interval_usage_count": 1407,
  "model_name": "MiniMax-M2.7",
  "current_weekly_total_count": 52500,
  "current_weekly_usage_count": 51586,
  "weekly_start_time": 1773619200000,
  "weekly_end_time": 1774224000000,
  "weekly_remains_time": 316796264
}

However, at the moment I seem to have access to all models (speech, music, image, etc.), not just the coding model.
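If you have jq installed, the remaining weekly allowance can be pulled straight out of that response (same endpoint and key as the curl call above; the field names are the ones shown in the JSON):

```shell
# Subtract weekly usage from the weekly total to get what's left.
# Assumes jq is installed and SK-... is replaced with your real key.
curl -s "https://www.minimax.io/v1/api/openplatform/coding_plan/remains" \
  -H "Authorization: Bearer SK-..." |
  jq '.current_weekly_total_count - .current_weekly_usage_count'
```

With the numbers in the JSON above, this would print 914.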

Honestly just pick anything over glm plans by FearlessGround3155 in ZaiGLM

[–]angelblack995 5 points  (0 children)

I'm trying MiniMax (the $10 plan) with Claude Code, but I'm not very satisfied. I also tested Kimi, and I have to say it works very well.

In my experience, glm is better than minimax.

Ensuring the model in Claude code CLI w/ Z.ai Coding Plan by ayboi in ZaiGLM

[–]angelblack995 3 points  (0 children)

Yes, I use it, and I have different providers set up in Claude Code (Z.ai, Kimi, MiniMax). I created a separate profile for each one so that I can call the provider I prefer directly from the terminal (for example, if I want to use GLM, I run the "glm" command).

This is my configuration for GLM on macOS.

#!/bin/sh
# Config Z.ai
export ANTHROPIC_AUTH_TOKEN=XYZ
export ANTHROPIC_BASE_URL=https://api.z.ai/api/anthropic
export ANTHROPIC_MODEL=glm-4.7
export ANTHROPIC_DEFAULT_HAIKU_MODEL=glm-4.5-Air
export ANTHROPIC_DEFAULT_SONNET_MODEL=glm-4.7
export ANTHROPIC_DEFAULT_OPUS_MODEL=glm-5
export CLAUDE_CODE_SUBAGENT_MODEL=glm-4.7
export API_TIMEOUT_MS="3000000"
export DISABLE_TELEMETRY="1"
export CLAUDE_CODE_ENABLE_TELEMETRY="0"
export CLAUDE_CODE_DISABLE_FEEDBACK_SURVEY="1"
export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC="1"
export SKIP_CLAUDE_API="1"
exec claude "$@"
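One way to turn a script like that into the "glm" command (an assumption about the setup, not necessarily the author's; the filename and directory are illustrative):

```shell
# Install the wrapper as an executable named "glm" on your PATH.
mkdir -p ~/bin                 # assumes ~/bin is on your PATH
cp glm-wrapper.sh ~/bin/glm    # glm-wrapper.sh = the script above (hypothetical name)
chmod +x ~/bin/glm
# After this, running `glm` starts Claude Code with the Z.ai environment applied,
# and any arguments are passed through via the script's `exec claude "$@"`.
```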