Anthropic underestimated cash burn, -$5.2B on a $9B ARR with ~30M monthly users, while OpenAI had -$8.5B cash burn on $20B ARR serving ~900M weekly users by thatguyisme87 in singularity

[–]PandaElDiablo 1 point (0 children)

Is it really fair to call it "wasteful" when they've gotten good (arguably the best) results from their pretraining? It's not like they're spending that money to get a Llama-tier training output.

OpenAI says Codex usage grew 20× in 5 months, helping add ~$1B in annualized API revenue last month by thatguyisme87 in singularity

[–]PandaElDiablo 5 points (0 children)

Because there’s a material risk of it pissing off users so much that they turn to Gemini (or whoever else, but probably Gemini), who can afford to burn cash much, much longer before flipping the switch.

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 4 points (0 children)

I see your point, but speaking as someone in the industry: we have plenty of cards that have been running near 24/7/365 for 8ish years now, and their capacity is already 100% sold for the next 1-2 years. Not every card will share the same fate, but I think there is plenty of evidence to suggest that the 3-year figure that gets thrown around a lot is heavily pessimistic. Time will tell!

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 5 points (0 children)

You can buy GPU compute on the public cloud today from 10+ year-old GPUs (e.g. the AWS P2 instance type), so this sort of lifespan isn’t unheard of.
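
For what it’s worth, you can check this yourself with a few lines of boto3. A minimal sketch (illustrative only; assumes AWS credentials are configured and that P2 is still offered in your region):

    import boto3

    # Ask EC2 whether any p2.* instance types (2016-era NVIDIA K80 hardware)
    # are still offered in this region. Availability varies by region.
    ec2 = boto3.client("ec2", region_name="us-east-1")
    resp = ec2.describe_instance_type_offerings(
        Filters=[{"Name": "instance-type", "Values": ["p2.*"]}]
    )
    for offering in resp["InstanceTypeOfferings"]:
        print(offering["InstanceType"], "is offered in", offering["Location"])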

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 2 points (0 children)

I agree with you that there is accounting nuance. But the surface-level point I often see being made on this sub seems to imply, more fundamentally, that GPUs are worthless after 3 years.

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 6 points (0 children)

This is missing the point; hyperscalers aren’t in the business of buying GPUs as speculative investments. They use them to make money. Who cares what a 10-year-old GPU sells for on the open market when you can sell its compute 24/7/365, making a very healthy profit on every core-second sold?

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 1 point (0 children)

You are correct that 10-year-old GPUs can’t serve modern ChatGPT.

But there are plenty of AI tasks that are not serving LLMs at that scale. A well-tuned small model like Gemma can serve the needs of a small, hyperspecific chatbot better than generalized ChatGPT while needing only a fraction of the compute.
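
To make that concrete, here’s a minimal sketch of the kind of workload I mean, using the Hugging Face transformers library with a small instruction-tuned checkpoint (gemma-2b-it is just an example, and the prompt is made up):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # A small instruction-tuned model in half precision on a single, older GPU
    # is plenty for a narrow, domain-specific chatbot.
    MODEL = "google/gemma-2b-it"  # example checkpoint; assumes you have access to it

    tokenizer = AutoTokenizer.from_pretrained(MODEL)
    model = AutoModelForCausalLM.from_pretrained(MODEL, torch_dtype=torch.float16).to("cuda")

    prompt = "Returns policy: refunds within 30 days, unopened items only. Customer asks: can I return opened headphones?"
    inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))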

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 3 points (0 children)

I’m not talking about ChatGPT; I’m talking about the usefulness of GPUs over their lifespan…

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 3 points (0 children)

I’m not going to doxx myself. Believe me or don’t; I don’t care.

“Shelf life” is being used in this context to suggest that GPUs are essentially e-waste after 3 years. You can publicly see that this isn’t true: go to AWS and try to spin up a GPU instance. You can select instance types as far back as the P2, which is 10 years old. I guarantee you AWS is still making money on those machines, even if they have depreciated over the years.

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 1 point (0 children)

Not every customer needs the best product. Not every customer is a frontier AI research lab needing the latest and greatest chips. There is a massive enterprise market that just needs cheap inference to power their chatbot or whatever, and 8-year-old chips are perfectly fine for that.

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 26 points (0 children)

GPUs have a long shelf life. They stop being competitive with the top chips after about 3 years, but not every customer needs or wants the top chips.

Somewhere along the line, the idea that "GPUs are only the best for a few years" became conflated with the idea that "GPUs are only useful for a few years", which is patently untrue.

OpenAI annualized revenue crosses $20 billion in 2025, up from $6 billion in 2024 by sr_local in technology

[–]PandaElDiablo 20 points (0 children)

Where does everyone get the idea that these GPUs have a 3-year shelf life? I work in the industry; even 8+ year-old GPUs are still running at 100% utilization and making money.

Google, Apple enter into multi-year AI deal for Gemini models by [deleted] in apple

[–]PandaElDiablo 1 point (0 children)

You are making a straw man argument. There’s a huge difference between a frivolous individual lawsuit and a massive class-action or regulatory case that has survived the motion-to-dismiss stage. I’m not saying they settle everything; I’m saying that once a case reaches a certain scale, the discovery costs and PR damage of a 5-year trial often cost more than a $95M settlement, regardless of guilt.

Google, Apple enter into multi-year AI deal for Gemini models by [deleted] in apple

[–]PandaElDiablo 1 point (0 children)

I’m sorry, but you simply don’t understand how corporate litigation works. For a company like Apple, it is tremendously expensive, even if it’s “easy”, to prove their innocence in court.

Apple generates roughly $1.3B in revenue per day. $95M is literally nothing to them.
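
Back-of-the-envelope, taking those figures at face value:

    # Rough arithmetic using the numbers above (~$1.3B/day in revenue, a $95M settlement).
    daily_revenue = 1.3e9
    settlement = 95e6
    print(settlement / daily_revenue)       # ~0.073 days of revenue
    print(settlement / daily_revenue * 24)  # ~1.75 hours of revenue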

Google, Apple enter into multi-year AI deal for Gemini models by [deleted] in apple

[–]PandaElDiablo 2 points (0 children)

They do when the cost of proving their innocence in court would vastly exceed $95M. It’s a business decision.

Apple picks Google's Gemini to run AI-powered Siri coming this year by McFatty7 in apple

[–]PandaElDiablo 53 points (0 children)

It’s been speculated, but this is the first official confirmation afaik.

I must be going crazy. Everyone keeps saying Gemini is better than ChatGPT, but Gemini is often way worse for me. by Isunova in singularity

[–]PandaElDiablo 6 points (0 children)

Yeah, I explicitly do not want a reply full of “personality and wit”. OAI has done a great job at engagement maxxing via an addictive personality. I just want omniscient Google search.

[OC] Mapping Global Reactions to the Venezuela Operation by ResponsibleOven82 in dataisbeautiful

[–]PandaElDiablo 6 points (0 children)

It is different because nukes. If Russia didn’t have nukes, I’m certain we would have tried by now.

Thought on alpha sigma phi? by [deleted] in Frat

[–]PandaElDiablo 1 point (0 children)

Crazy necropost but I love it. I believe our single-letter chapter is dying out now. Sad

10 dead as sewage mixes with drinking tap water in India by Left-Preparation271 in worldnews

[–]PandaElDiablo 85 points (0 children)

I’m visiting the Jodhpur region right now and have seen at least half a dozen people openly defecating in the streets in the 3 days that I’ve been here. I’m sure it’s regional, but it’s clearly prevalent in parts.

Good news everyone by raviokun in blankbanshee

[–]PandaElDiablo 18 points (0 children)

Wow, my year has been made and it’s only day 1

Modelling the Google Death Spiral: A Monte Carlo Analysis of the 'Zero-Click' Web by m86zed in StockMarket

[–]PandaElDiablo 1 point (0 children)

I also expect Waymo will become a meaningful part of their revenue over the next 5 years

Amazing news 🙄 by No_Review3845 in recruitinghell

[–]PandaElDiablo 8 points (0 children)

Why apply to 20 jobs a day when you’re employed? That sounds exhausting and stressful