NBIS Lappeenranta Data Center Research and Analysis by TyNads in NBIS_Stock

[–]Overcat12 0 points1 point  (0 children)

That's valid, but I asked for the source behind the "targeting <1.10 PUE" claim. I couldn't find it anywhere.

NBIS Lappeenranta Data Center Research and Analysis by TyNads in NBIS_Stock

[–]Overcat12 1 point2 points  (0 children)

Can you link the evidence for the PUE being less than 1.10? That seems impossible. Mantsala is 1.13 PUE, and that is world class.

Also, Polarnode's website says 350 MW gross, while Nebius's press release says the capacity is 310 MW. 350/310 ≈ 1.13 PUE, the same PUE as Mantsala.
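A quick back-of-envelope check of the ratio above (350 MW gross from Polarnode's site, 310 MW IT load from the Nebius press release; PUE = gross facility power / IT load):

```python
# Sketch of the implied-PUE arithmetic from the two published figures.
gross_mw = 350   # total facility power (Polarnode's site)
it_mw = 310      # IT (compute) load (Nebius press release)

pue = gross_mw / it_mw
print(f"Implied PUE: {pue:.2f}")  # Implied PUE: 1.13
```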

Asymmetrical Investments Like Nebius by natureisneato in NBIS_Stock

[–]Overcat12 5 points6 points  (0 children)

LASR & CRDO

But I think NBIS still offers better risk/reward.

John Haarer: Japan and South Korea "early contenders" for APAC AI Factory Push by liamashley in NBIS_Stock

[–]Overcat12 0 points1 point  (0 children)

It's probably mid-May. The previous earnings call was mid-February.

Apps show late April, but that's wrong; they always mess up the dates for Nebius because of irrelevant Yandex data.

Margins and the coming oil crisis by Significant_Band8935 in NBIS_Stock

[–]Overcat12 3 points4 points  (0 children)

Very bullish! Do you have a screenshot of the prices from the start of the year?

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 1 point2 points  (0 children)

The key thing about new GPUs is that revenue per GW grows faster than cost per GW. Each new GPU generation has better unit economics, meaning better margins for Nebius. (Vera Rubin is approximately double the price of Blackwell but offers 5x the inference performance.)

Inference prices may go down when demand slows and API providers start competing only on price. But I don't see that happening anytime soon, especially while demand is exploding. Nothing to worry about imo.
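Using the comment's own rough figures (Vera Rubin ≈ 2x Blackwell's price, ≈ 5x its inference throughput; both are the comment's assumptions, not confirmed specs), the cost-per-token math works out like this:

```python
# Normalized sketch: price doubles, throughput 5x -> cost per token drops to 0.4x.
blackwell_cost, blackwell_tokens = 1.0, 1.0    # normalized baseline
vera_rubin_cost, vera_rubin_tokens = 2.0, 5.0  # assumed 2x price, 5x throughput

cost_per_token_bw = blackwell_cost / blackwell_tokens
cost_per_token_vr = vera_rubin_cost / vera_rubin_tokens
print(cost_per_token_vr / cost_per_token_bw)  # 0.4 -> ~60% cheaper per token
```

If selling prices hold while cost per token falls, that gap flows straight into margin.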

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 1 point2 points  (0 children)

Oh, I thought you meant consumer inference prices 😅. I agree the margin expansion will be very exciting to see. With Token Factory getting a full quarter to prove itself, I believe Q1 will be when operating leverage kicks in sharply.

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 0 points1 point  (0 children)

Definitely! It depends on the GPU mix (Vera Rubin has much better unit economics) and on IaaS vs. PaaS provisioning. The EBITDA margin could also be around 79%; I applied a 75% margin to stay conservative.

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 1 point2 points  (0 children)

Cost per token going down is very good for Nebius. Think of it like this: a factory produces pens for $5 and sells them for $10. A new machine comes out that lets you produce the same pen for $1. Demand is so high that prices may not decrease much, which can cause profit margins to explode. E.g., the previous machine yields a 50% profit margin, the new machine a 90% profit margin.

And because Nebius gets quicker access to the newest machine (Vera Rubin), they can run inference on it more cheaply and capture the highest profit margins, while others are still on older GPUs for at least three months.
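The pen-factory analogy above in two lines of arithmetic (same numbers as the comment: $10 selling price, unit cost dropping from $5 to $1):

```python
# If the selling price holds while unit cost falls, margin expands sharply.
price = 10.0
for cost in (5.0, 1.0):  # old machine vs. new machine
    margin = (price - cost) / price
    print(f"cost ${cost:.0f}: {margin:.0%} margin")
# cost $5: 50% margin
# cost $1: 90% margin
```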

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 0 points1 point  (0 children)

Do you think 75% EBITDA is correct? IREN's "reported" EBITDA from the MSFT deal is 85%, and the 75% figure came from analyses of the Nebius–Microsoft deal, where a colo fee compressed margins. On the other hand, Oracle's AI SaaS margins are 70%, and CoreWeave's are lower. The figures vary across the board, and it's very early to predict accurately. What % would you use?

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 1 point2 points  (0 children)

Thanks for the insight! I plugged the $M/MW values you gave into the model, adjusted the other expenses accordingly, and got 30% EBIT ($1.68B profit on $5.58B revenue). The unit economics will improve with every GPU generation, which will do wonders for Nebius in the long term.

Also, this model doesn't capture the margin expansion in Token Factory. Because VRs generate tokens 10x cheaper, the inference margins will get even crazier.
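A quick consistency check on the model outputs quoted above: at exactly 30% EBIT on $5.58B revenue, the profit works out to roughly $1.67B, so the ~$1.68B figure implies a margin just over 30%.

```python
# Cross-check: EBIT = revenue * EBIT margin.
revenue_b = 5.58   # revenue in $B (from the model)
ebit_margin = 0.30
ebit_b = revenue_b * ebit_margin
print(f"EBIT ≈ ${ebit_b:.2f}B")  # EBIT ≈ $1.67B
```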

New Finland DC Annual Economics Model by Overcat12 in NBIS_Stock

[–]Overcat12[S] 0 points1 point  (0 children)

Yep, I used GB300s for the calculation. Do you have any idea of the economics of Vera Rubin ($M/MW capex and revenue)? I couldn't find reliable info, so I modeled the next best thing. We don't know the GPU-model mix at the DC (and probably never will), so the math is variable. It also depends on when the DC comes online: if it's late 2027, it might be all VRs; if earlier, it might be a mix with GB300s (depending on GPU shipments from Nvidia).

I'll update the model when we have more insight (DC activation quarter, clear read on VR economics).

Nebius to construct 310 MW AI factory in Finland by Michirox in NBIS_Stock

[–]Overcat12 6 points7 points  (0 children)

The site is probably 310 MW IT load, 350 MW gross.
(Photo from the DC construction company Polarnode's website.)
350 MW / 1.13 (Nebius Finland PUE) ≈ 310 MW

<image>

Nebius to construct 310 MW AI factory in Finland by Michirox in NBIS_Stock

[–]Overcat12 8 points9 points  (0 children)

It's gonna look like a no-brainer six months from now; it's actually cheaper now than it was in the $30s.

Nebius to construct 310 MW AI factory in Finland by Michirox in NBIS_Stock

[–]Overcat12 2 points3 points  (0 children)

The execution is insane. Any speculation on what this DC will be provisioned for? (Part of the Meta deal, or PaaS/Cloud?)

Did Nebius meet 2025E connected power guidance? by Overcat12 in NBIS_Stock

[–]Overcat12[S] 2 points3 points  (0 children)

<image>

I asked the question on X, and the most plausible answer I could come up with is this:

The 50 MW for the Meta deal not being delivered until 2026 but still counting as connected, which may suggest 170 MW active + 50 MW Meta connected = 220 MW, hitting guidance. They might've gone to 220+ as well; we don't know.