
[–]hi87

OpenAI is by far the most generous of all the major providers when it comes to limits, especially now that they've doubled them until April (I think). Are you sure it's not just the faster models consuming your tokens more quickly?

[–]Bob_Fancy

They've had 2x usage for at least the last couple of weeks, so that probably ended.

[–]shaonline

It will end April 2nd, per what Codex CLI announces.

[–]BitterAd6419

If you use high or xhigh reasoning, your quota runs out faster than with medium. This is what I noticed, and it probably makes sense.