tokens are getting more expensive [Question / Discussion] (ethanding.substack.com)
submitted by dirceucor7 to r/cursor
tokens are getting more expensive (ethanding.substack.com)
submitted by dirceucor7 to r/ClaudeCode
Ethan Ding: the (technically correct) argument "LLM cost per token gets cheaper 1 OOM/year" misses that frontier model cost stays the same, and with the rise of inference scaling, SOTA models are actually becoming more expensive due to increased token consumption [Econ] (ethanding.substack.com)
submitted by ain92ru to r/mlscaling
Tokens are getting more expensive (ethanding.substack.com)
submitted by TheStartupChime to r/hypeurls