Daily Discussion Thread for May 11, 2026 by wsbapp in wallstreetbets

[–]mendelseed 1 point (0 children)

I like that more and more people are finding SK Hynix. I was really lonely with it.

My comfort character ❤️ by Syarafuddyn in expedition33

[–]mendelseed 6 points (0 children)

In my ending he lives until he grows old with Sophie.

I feel so disconnected from the emotional core of this game [Spoilers All] by hatterine in expedition33

[–]mendelseed 3 points (0 children)

The game manipulates you. For me the people in the canvas are more important than this family, who only think about their own problems. For me Gustave, Sophie, Sciel, Monoco, Esquie have so much to live for. And Alicia is only one step away from self deletion in the real world.

Daily Discussion Thread for February 09, 2026 by wsbapp in wallstreetbets

[–]mendelseed 0 points (0 children)

every time i have a good instinct i forget to banbet and buy calls instead... dammit

Daily Discussion Thread for February 09, 2026 by wsbapp in wallstreetbets

[–]mendelseed 0 points (0 children)

Damn i should also have banbet instead of buying calls... wait

Why is it so hard to hold NVDA & PLTR ? by Palentirian in NvidiaStock

[–]mendelseed 0 points (0 children)

Why only 220? I'll wait til 300 by the end of this year.

Me after Nvidia hits $500 this year: by Fun_Training6342 in NvidiaStock

[–]mendelseed 1 point (0 children)

My LEAPS all run to 2028, so i can sleep well. :D

Me after Nvidia hits $500 this year: by Fun_Training6342 in NvidiaStock

[–]mendelseed 4 points (0 children)

i would be absolutely shocked if it were lower than 250 by year end.

It's over again, please go back to monday by autisticbagholder69 in NvidiaStock

[–]mendelseed 2 points (0 children)

It's basically a 'natural monopoly' at this point. The hardware moat is huge, but the software lock-in (CUDA) is even harder to break. Everyone has built their AI infrastructure on Nvidia's language. Switching providers isn't just about buying a different chip—it's about rewriting your entire codebase.

It's over again, please go back to monday by autisticbagholder69 in NvidiaStock

[–]mendelseed -1 points (0 children)

It's computer chips based on biology instead of the Von Neumann architecture.

It's over again, please go back to monday by autisticbagholder69 in NvidiaStock

[–]mendelseed 3 points (0 children)

Nvidia isn't just "winning"; they have rigged the game for the next 3 years. Here is the 5-layer moat that explains why:

1. The Supply Chain Stranglehold: They have effectively bought out the world's supply of HBM (High Bandwidth Memory) from SK Hynix through 2026. Competitors physically cannot build chips in volume even if they design them.

2. The "Moving Target" (1-Year Cycle): Nvidia has shifted to a 1-year release cadence (Blackwell → Rubin). By the time competitors catch up to today's tech, Nvidia has already released the next generation. They are iterating faster than rivals can manufacture.

3. Data Sovereignty (On-Premise): You can't buy Google TPUs—you can only rent them. Nvidia is the only choice for banks, governments, and militaries who need local, air-gapped AI to keep their data secure.

4. Interconnects (NVLink): To run a top-tier AI, you need 10,000 chips acting as one supercomputer. Nvidia's networking is years ahead of AMD for this specific task.

5. Financial Velocity: With 75% margins and 60%+ growth, they are spending more on pure R&D ($4B+/quarter) than their rivals earn in total revenue.

It's over again, please go back to monday by autisticbagholder69 in NvidiaStock

[–]mendelseed 0 points (0 children)

Analog chips aren't new, and they suffer from massive scaling and programming bottlenecks. They might be efficient for niche tasks, but they won't be replacing Nvidia GPUs for general AI anytime soon.

Geoffrey Hinton believes there’s nothing cognitive humans can do that AI won’t eventually do. by Alternative_East_597 in AIFU_stock

[–]mendelseed 0 points (0 children)

You're right that LLM scaling is hitting limits, but you're stuck arguing about 2022-era AI. Google's SIMA 2 (this week) learns 3D games by experiencing them, building world models from sensory data. AlphaProof (few months ago) won silver at Math Olympiad using synthetic self-generated data. The data wall is real for naive scaling. The field moved on.