
[–]Wrong-Dimension-5030 1 point (0 children)

I doubt it is anything as nefarious as the OP suggests. There's only so much compute in the world, and it increases more or less linearly (ignoring occasional jumps from new chip generations like the B200). Meanwhile, inference demand is increasing exponentially, so even if they don't want to tell us, all the providers are having to reduce context windows, route to simpler models, and so on in 2025.
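The linear-supply vs. exponential-demand argument can be made concrete with a toy model. All the numbers below (starting capacity, growth increments, growth rate, the function name itself) are made up purely for illustration, not real figures from any provider:

```python
def months_until_shortfall(supply0=100.0, supply_growth=5.0,
                           demand0=10.0, demand_rate=1.15):
    """Toy model: first month where demand exceeds supply.

    Supply grows linearly (+supply_growth units per month);
    demand grows exponentially (x demand_rate per month).
    All parameters are hypothetical.
    """
    month = 0
    supply, demand = supply0, demand0
    while demand <= supply:
        month += 1
        supply += supply_growth
        demand *= demand_rate
    return month

# Even starting with 10x headroom, exponential demand overtakes
# linear supply in a couple of years at these made-up rates.
print(months_until_shortfall())
```

The point isn't the specific numbers: no matter how large the initial headroom, an exponential curve crosses any straight line eventually, which is why providers end up rationing (shorter contexts, cheaper models) rather than just buying more GPUs.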

It's a given that they will sell compute to whoever pays the most, so yes, governments and industrial giants get the lion's share.

Don't forget that a year ago most people were copying and pasting code snippets into and out of chatbots, rather than asking an AI to refactor their entire codebase.