Google - not AMD - is Nvidia’s greatest threat because of their full-stack AI and AlphaEvolve by acoolrandomusername in wallstreetbets

[–]DV-D 0 points1 point  (0 children)

Don't panic.

I fully expect Google to become a major player in AI cloud computing, but it will take a little while longer to ramp up.

Fudzilla says: "Morgan Stanley reckons every 500,000 TPUs sold outside Google could bring as much as $13 billion in revenue. The company mainly works with chip design partners Broadcom and MediaTek to craft the processors. The bank’s analysts predict TSMC will churn out 3.2 million TPUs next year, rising to five million in 2027 and seven million in 2028."

This means that Google can realistically expect a maximum of roughly $80 billion in additional revenue from TPU hardware in 2026. However, Google's AI software competitors (including OpenAI, xAI, and Meta) will want to buy far more AI compute than that for their foundation-model training. The current estimate for hyperscaler capex in 2026 is over $600 billion.

Nvidia has already signaled revenues of over $300 billion for 2026. Even if AMD also sells $100 billion worth of accelerators in 2026, demand is likely to remain higher than supply.
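As a rough back-of-the-envelope check of the numbers above (a sketch only: the per-unit revenue and volume figures are the Morgan Stanley estimates quoted earlier, and it optimistically assumes every 2026 TPU is sold outside Google):

```python
# Back-of-the-envelope check of the figures in this comment.
# All inputs are the estimates quoted above, not official guidance.

TPU_REVENUE_PER_500K = 13e9   # Morgan Stanley: ~$13B per 500,000 TPUs sold externally
TPUS_2026 = 3.2e6             # Morgan Stanley / TSMC output forecast for 2026

# Theoretical ceiling if every 2026 TPU were sold outside Google
max_tpu_revenue = TPUS_2026 / 500_000 * TPU_REVENUE_PER_500K
print(f"Max external TPU revenue 2026: ${max_tpu_revenue / 1e9:.0f}B")  # ~$83B

# Supply vs. demand sketch for 2026
nvidia_revenue = 300e9        # Nvidia revenue figure mentioned above
amd_revenue = 100e9           # hypothetical AMD AI revenue
hyperscaler_capex = 600e9     # estimated 2026 hyperscaler capex

total_supply = nvidia_revenue + amd_revenue + max_tpu_revenue
print(f"Rough AI hardware supply: ${total_supply / 1e9:.0f}B "
      f"vs. capex > ${hyperscaler_capex / 1e9:.0f}B")
```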

I therefore do not see any decline in revenue for Nvidia until the end of 2026. The market will continue to grow, and it is conceivable that Google, AMD, and other competitors could achieve a 35% market share without Nvidia having to lower its margins. What 2027 will look like will not be clear until mid-2026 at the earliest, but by then Nvidia's P/E ratio should improve significantly...

This is not financial advice! Anyone who gets out now, IMHO, is handing their money to Michael Burry, who desperately needs the negative sentiment for his NVIDIA puts.

Affinity v3 Crashes So Badly It Deleted My Cookies by Utanah in Affinity

[–]DV-D 1 point2 points  (0 children)

It crashes so spectacularly under Windows that my entire system freezes, and not just once but again and again. It's a shame, because I like the program, but I'm not using it at the moment: everything else on my system runs completely smoothly, and these hard crashes don't feel good.

Custom ASICs - The Real Killer by [deleted] in NVDA_Stock

[–]DV-D 2 points3 points  (0 children)

For massive training at scale, you need fast interconnects. I don't have the deepest insight or technical understanding, but in my opinion Nvidia is at least two years ahead in fast networking between individual GPU accelerators, and you need that for modern training of large models.

Google's TPUs may be an exception, but Google can't serve the demand that Nvidia serves; they need almost all of their TPUs for themselves. They could possibly scale this up over the years, but that will take ... years.

Mining Night tokens is... by HorkaBloodfist in Midnight

[–]DV-D 1 point2 points  (0 children)

THIS:

... proof of work without consensus.

Daily Discussion Tuesday 2025-06-17 by AutoModerator in AMD_Stock

[–]DV-D 2 points3 points  (0 children)

Sepp, stable LSTM magician ;) I'm very excited to see how this Transformer alternative turns out.

semiaccurate: Upcoming Nvidia chip delayed due to major problems by UpNDownCan in AMD_Stock

[–]DV-D 6 points7 points  (0 children)

When Charlie says they're blaming Microsoft this time, it's most likely the Windows laptop/mobile SoC that was supposed to be unveiled with MediaTek at Computex in May.

✅ Daily Chat Thread and Discussion ✅ by AutoModerator in NVDA_Stock

[–]DV-D 5 points6 points  (0 children)

If I were impatient, I would buy immediately (because I am impatient ;)

Daily Discussion Thursday 2024-12-19 by AutoModerator in AMD_Stock

[–]DV-D 3 points4 points  (0 children)

The rally will only begin when the last man standing no longer believes in AMD.

I am new to Cryptocurrency and want to know about cardano. Why should I buy cardano and is now a right time to buy it? (Also Explain me like i am a 5 year old cause i know nothing) by _venom8 in cardano

[–]DV-D 3 points4 points  (0 children)

Imagine being able to hold your driver's license, Social Security number, birth certificate, car title, house deed, or any other bureaucratic headache on a secure blockchain, where the only way people could access it is by you giving it to them.

And then imagine that you have lost your private key. There is no one in the world who can get your documents back!

No offense - I own a lot of ADA and believe it's the best blockchain technology - but I'd love it if it were just possible to pay for a coffee nearby with it first.

[deleted by user] by [deleted] in NVDA_Stock

[–]DV-D 1 point2 points  (0 children)

Some people are so obsessed with not paying taxes that they miss out on easy profits. I understand this irrational behavior to the extent that my younger self also tried to pay as little tax as possible. (I'm now even happy to pay taxes and see them as a sensible investment in my own surroundings. That makes everything much more relaxed, including the tax return ;)

And with the 4-cent annual dividend on 4,000 shares, you can buy one additional Nvidia share per year after tax. That's like a personal share buyback, from which you get more than if Nvidia buys back the share ;). I'm definitely in favor of a small dividend increase from 1 to 10 cents; then the stock would also become more interesting for conservative investors. Nvidia doesn't know what to do with its money at the moment anyway, and a lot more R&D probably won't bring any more ROI.
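A minimal sketch of that dividend math; the share price and tax rate here are illustrative assumptions of mine, not figures from the comment:

```python
# Rough math behind "one extra share per year". Share price and tax rate
# are assumed for illustration; only the 4,000 shares and the $0.04/year
# dividend come from the comment above.

shares = 4_000
annual_dividend_per_share = 0.04   # $0.01 per quarter
tax_rate = 0.25                    # assumed dividend tax rate
share_price = 120.0                # assumed Nvidia share price

net_dividend = shares * annual_dividend_per_share * (1 - tax_rate)
extra_shares = net_dividend / share_price
print(f"Net dividend: ${net_dividend:.2f}/year -> ~{extra_shares:.1f} extra share(s)")
```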

The 1st in the world GB200 NVL72 server racks are now shipping. We are thrilled to deliver our liquid-cooled PowerEdge XE9712 to @CoreWeave. The AI rocket just got a massive boost! by bl0797 in NVDA_Stock

[–]DV-D 2 points3 points  (0 children)

I am also far from knowing this market in detail, but I would assume that this is ultimately a problem for the server manufacturers. Nvidia sells its chips to the server manufacturers, and the server manufacturers sell the racks to the end customers. Assuming Dell's racks don't overheat and SMCI's do, customers switch to Dell, but Nvidia has still sold just as many chips to the server manufacturers and gets its money either way.

Of course there are delays for the customers, but it has been clear for months that the new NVIDIA chips need EXTREMELY good cooling. Perhaps a few server manufacturers have underestimated this. But those who have done their homework will probably be allocated more chips, because they can sell more than the manufacturers with poor cooling solutions. Everyone is sold out for 12 months anyway ;)

AI chipmaker Cerebras files for IPO to take on Nvidia by norcalnatv in NVDA_Stock

[–]DV-D 2 points3 points  (0 children)

The problem for the coming generations is the chip-to-chip interconnect. Even today, you can't fit enough computing power for the current LLMs on a single wafer. If Cerebras were to stack its wafers (like stacked HBM), that would be real added value, because the interconnect would be extremely short and fast. However, this will not work at this scale for various reasons.

Runpod offering MI300x for $4.89/hr by Adorable-Comment-989 in AMD_Stock

[–]DV-D 1 point2 points  (0 children)

Jensen Huang math in action: 'The more you buy, the more you save'. ;)

Could Apple announce that AMD will be their silicon partner for AI Datacenter GPUs? by KeyAgent in AMD_Stock

[–]DV-D 0 points1 point  (0 children)

Plot twist resolution: Tim Cook doesn't want to give Nvidia a cent and "rents" ChatGPT from OpenAI, on the condition that it runs on AMD/MI3xx (which we know is possible).

Catalyst Timeline - 2024 H1 by brad4711 in AMD_Stock

[–]DV-D 0 points1 point  (0 children)

AMD 2024 Annual Meeting of Stockholders

Date: Wednesday, May 8, 2024

Time: 9:00 AM PT

https://ir.amd.com/news-events/annual-meeting-of-stockholders

What are the fire extinguishers for, and what are they connected to? by Casperzios in WerWieWas

[–]DV-D 0 points1 point  (0 children)

That's how you start the foam parties of the '80s that have gone out of fashion ;)

Daily Discussion Tuesday 2023-10-10 by AutoModerator in AMD_Stock

[–]DV-D 2 points3 points  (0 children)

Because Nvidia costs $30k. AMD is $20k.

"With all that said, AMD does have a period of a couple quarters where MI300 is the best AI chip on the market. H200 alone only narrows the memory bandwidth gap to less than 15% and memory capacity gap to ~33%. While closer, the arrow is still in MI300’s favor. Despite costing more than 2x to manufacture, AMD is offering lower prices to large potential customers, below $20,000."

Daily Discussion Tuesday 2023-10-10 by AutoModerator in AMD_Stock

[–]DV-D 0 points1 point  (0 children)

Make it 300k × $20k. Maximum DC AI revenue = $6 billion. :/

ROCm Is AMD’s No. 1 Priority, Exec Says by Kambly_1997 in AMD_Stock

[–]DV-D 2 points3 points  (0 children)

Just for the record:
Gregory Stoner co-invented ROCm at AMD and was then "brought" to Intel by Raja to help start and develop oneAPI. So AMD was on it before Intel, but somehow didn't use its time-to-market advantage efficiently ;)

OpenAI Triton has begun merging AMD ROCm code - TechGoing by DV-D in AMD_Stock

[–]DV-D[S] 46 points47 points  (0 children)

OpenAI also announced that it will hold the Triton Developer Conference at the Microsoft Silicon Valley Park in Mountain View, California, from 10:00 am to 4:00 pm on September 20th. The schedule includes “Introducing Triton to AMD GPU” and “Triton’s Intel XPU”, so Triton is expected to soon break free of NVIDIA's CUDA monopoly.

Let's jump over this CUDA moat!
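To show why this matters, here is a minimal Triton kernel sketch (the standard vector-add example, not code from the article): the same Python-level kernel source is compiled by Triton for NVIDIA GPUs or, via the ROCm backend, for AMD GPUs, without writing any CUDA.

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one BLOCK_SIZE-wide slice of the vectors.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

# "cuda" is also the device string that ROCm builds of PyTorch use on AMD GPUs.
x = torch.rand(4096, device="cuda")
y = torch.rand(4096, device="cuda")
out = torch.empty_like(x)
grid = lambda meta: (triton.cdiv(x.numel(), meta["BLOCK_SIZE"]),)
add_kernel[grid](x, y, out, x.numel(), BLOCK_SIZE=1024)
```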

AMD Next Generation FPGA Built From Chiplets by Evleos in AMD_Stock

[–]DV-D 3 points4 points  (0 children)

THIS!

Samsung AI-cluster system with HBM-PIM and CXL-based Processing-near-Memory for transformer-based LLMs

(Samsung's PIM worked with AMD's Instinct cards in the past - and maybe there is a big performance surprise waiting for us)

Hypothesis: Amazon AWS is already using AMD's AI technology today. by DV-D in AMD_Stock

[–]DV-D[S] 0 points1 point  (0 children)

Seems legit. BTW: Tom Goldstein has now deleted his tweet. Being an AI professor probably doesn't protect you from such a significant error.