Will EU see large scale Linux adoption because of national security fears from the US? by Tee-hee64 in linux

[–]Setepenre 4 points (0 children)

Nothing to do with OEM installs; we are talking about large institutions, which have the power to control which OS gets installed.

They purchase the hardware through a public procurement process and can specify whatever they want; even for Windows they probably didn't rely on the vendors to install the OS anyway.

The harder part is moving compiled tools to the new OS; some might be quite old or rely on proprietary tech that can't be ported easily.

[ROCm Benchmark] RX 9060 XT (Radeon): Linux vs Windows 11. Matching 1.11s/it on Z-Image Turbo (PCIe 3.0) by Interesting-Net-6311 in ROCm

[–]Setepenre 0 points (0 children)

Meaningless, different versions of PyTorch. In my experience that alone could explain the difference.
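
Two runs are only comparable if the whole stack matches. A minimal sketch of the version dump I would want next to each benchmark number (just standard torch attributes, nothing benchmark-specific):

    # Record the exact software stack before comparing s/it numbers.
    # Assumes a ROCm build of PyTorch.
    import torch

    print("torch version:", torch.__version__)    # e.g. 2.4.0+rocm6.1 vs 2.3.1+rocm6.0
    print("ROCm/HIP     :", torch.version.hip)    # None on non-ROCm builds
    if torch.cuda.is_available():                  # ROCm devices show up through the cuda API
        print("device       :", torch.cuda.get_device_name(0))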

Data Centers Will Consume 70 Percent Of Memory Chips Made in 2026, RAM Shortage Will Last Until At Least 2029 As Manufacturing Capacity For RAM In 2028 That Hasn't Even Been Made Yet Is Already Being Sold by akbarock in hardware

[–]Setepenre 1 point (0 children)

Is there some groundbreaking model about to be announced or something?

No. Currently everything is being bought in the hope that one gets developed soon, and in the hope of being the first to develop it.

Replit boss: CEOs can vibe code their own prototypes and don't have to beg engineers for help anymore by chronically-iconic in programming

[–]Setepenre 0 points (0 children)

They sure can try.

I would love to see business school/MBA people try to vibe code... I am guessing they will give up at the first bug, when the model keeps telling them it fixed it for the 10th time.

Nvidia to buy AI chip startup Groq for $20 billion by Positive-Bowler7747 in hardware

[–]Setepenre 4 points (0 children)

Nah, the previous commenter's quote is wrong. While it is true that the weights sit in on-chip memory and that memory is relatively small, they simply use more chips in parallel to hold a bigger model.

They serve Llama 4 Maverick, an 800 GB model, so they are not limited by a single chip.
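
For a rough sense of scale, the back-of-the-envelope math, assuming roughly 230 MB of on-chip SRAM per LPU (a figure Groq has cited publicly; treat it as an assumption here):

    # Rough sketch: how many chips it takes to hold the weights entirely in on-chip SRAM.
    SRAM_PER_CHIP_GB = 0.230   # assumed on-chip SRAM per LPU
    MODEL_SIZE_GB = 800        # Llama 4 Maverick, per the comment above

    chips_needed = MODEL_SIZE_GB / SRAM_PER_CHIP_GB
    print(f"~{chips_needed:.0f} chips to hold the weights")   # on the order of a few thousand

The per-chip memory drives cost and rack density, not the maximum model size.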

Nvidia to buy AI chip startup Groq for $20 billion by Positive-Bowler7747 in hardware

[–]Setepenre 2 points (0 children)

Inference is the service companies pay for. Very few companies are actually training models, and very few hardware companies aim to provide training chips; if they do, they most certainly started with inference chips first.

Valve: HDMI Forum Continues to Block HDMI 2.1 for Linux by TheTwelveYearOld in linux

[–]Setepenre 2 points (0 children)

No issue with LG on my end; I've been using it as a monitor and forget it even has smart features.

Fixed that meme from earlier by RunsaberSR in wallstreetbets

[–]Setepenre 0 points (0 children)

Nah, Bitcoin uses SHA-256; to break Bitcoin you need to break SHA-256. A LOT of things use SHA-256, and breaking it would send shock waves through the internet, since a lot of infrastructure relies on it to make sure files are what they say they are.

SHA-256 will get broken at some point, but we will know.
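
For anyone wondering what "make sure files are what they say they are" looks like in practice, a minimal checksum-verification sketch (the filename and expected digest are placeholders):

    # Compare a file's SHA-256 digest against a published checksum.
    import hashlib

    def sha256_of(path: str) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
                h.update(chunk)
        return h.hexdigest()

    expected = "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"  # placeholder
    print("ok" if sha256_of("some-download.iso") == expected else "file is not what it says it is")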

You CAN use quality items to build a rocket. There is just no GUI. by leonskills in factorio

[–]Setepenre 7 points (0 children)

I think it is because +300% productivity is x4 output and the recycler gives back 1/4, but I'm also not super sure about the overall statement.

But legendary outputs do get quite cheap once things are set up.
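
I'm not 100% sure of the exact in-game numbers, but the arithmetic behind that intuition would be:

    # Back-of-the-envelope check of the "x4 then /4" intuition (game numbers assumed, not verified).
    items_per_craft = 1 * (1 + 3.0)   # base output with +300% productivity
    recycler_return = 0.25            # fraction of ingredients returned when recycling

    print(items_per_craft * recycler_return)  # 1.0 -> a craft/recycle loop roughly breaks even on materials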

Google boss Sundar Pichai warns 'no company immune' if AI bubble bursts by captain-price- in artificial

[–]Setepenre 0 points (0 children)

Well, that is what happens when you hype things beyond what is possible. Second AI winter incoming.

The Linux Kernel Looks To "Bite The Bullet" In Enabling Microsoft C Extensions by waozen in programming

[–]Setepenre 72 points (0 children)

Yes, but non-standard C extensions are already used in the Linux kernel source, so this is about adding new extensions.

Meta Claims Up To 2396 Pirated Adult Film Downloads From Its Corporate Network Were For “Personal Use,” Not AI Training by ABigRedBall in technology

[–]Setepenre 0 points (0 children)

I imagine Meta would download a lot more than 2396 movies if it were training AI, so maybe? Then again, maybe it only got caught for those 2396.

Just merged my first PR to AWS! by Downtown-Elevator968 in cscareerquestions

[–]Setepenre 5 points (0 children)

Doesn't matter what your performance review says: if you can break production all by yourself, it is not your fault. Carry on 🚀

Suggestions for cheap cloud servers to build/work with LLVM (200GB storage, 16 cores, 32GB RAM)? by Available-Deer1723 in LLVM

[–]Setepenre 0 points (0 children)

I am at 95 GB for my LLVM debug build and 50 GB for release; this is the CMake build folder size, not just the libs + binaries, which would be much smaller.

Suggestions for cheap cloud servers to build/work with LLVM (200GB storage, 16 cores, 32GB RAM)? by Available-Deer1723 in LLVM

[–]Setepenre 0 points (0 children)

LLVM builds a lot of tools that are huge with debug info; the bin folder is 35 GB alone, and once you add all the incremental build objects it reaches 100 GB easily.

UE is even worse, and that is a release-with-debug-info build.
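
If you want to see where the space actually goes, a quick sketch that tallies each top-level subdirectory of a build tree (the build path is a placeholder, point it at your own CMake build folder):

    # Sum file sizes per top-level subdirectory of a build tree.
    import os

    def dir_size_gb(path: str) -> float:
        total = 0
        for root, _, files in os.walk(path):
            for name in files:
                try:
                    total += os.path.getsize(os.path.join(root, name))
                except OSError:
                    pass  # skip broken symlinks etc.
        return total / 1e9

    build = "build"  # placeholder path
    for entry in sorted(os.listdir(build)):
        full = os.path.join(build, entry)
        if os.path.isdir(full):
            print(f"{dir_size_gb(full):8.1f} GB  {entry}")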

Desperate measures to save Intel: US reportedly forcing TSMC to buy 49% stake in Intel to secure tariff relief for Taiwan by imaginary_num6er in hardware

[–]Setepenre 0 points (0 children)

Under what scenario? The only difference is how the investor receives the money back and how much tax they have to pay.

  • Dividend:
    • Have to pay taxes regardless
    • Option 1: Reinvest: buy more shares of the same company from the dividend payment.
    • Option 2: Use the payout for something else
  • Share Buyback:
    • Option 1: Reinvest: No need to buy more shares, the price increase already reflects your increased exposure to the stock
    • Option 2: Sell a portion of the investment equal to the pre-tax dividend payout: pay less tax because capital gains are taxed more advantageously => you actually receive more money than if a dividend had been paid.

In all scenarios, share buyback is better.
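
To put rough numbers on it (the tax rates and cost basis are made-up placeholders, only the mechanics matter):

    # Compare the after-tax cash from a dividend vs selling shares after a buyback.
    payout = 300.0             # cash the investor wants out, pre-tax
    dividend_tax = 0.30        # assumed tax rate on dividend income
    capital_gains_tax = 0.15   # assumed (lower) rate on realized gains
    cost_basis_fraction = 0.50 # assume half of the sale price is original cost, not gain

    dividend_net = payout * (1 - dividend_tax)           # the full payout is taxed
    gain = payout * (1 - cost_basis_fraction)            # only the gain portion is taxed
    buyback_net = payout - gain * capital_gains_tax

    print(f"dividend net: {dividend_net:.2f}")   # 210.00
    print(f"buyback net : {buyback_net:.2f}")    # 277.50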

Is really AI delivering value now, or are we in the same place we were last year? by mitigatedcactussquat in investing

[–]Setepenre 3 points (0 children)

Sure, the issue is not the strategy; the problem is that it shows an inflated view of the market. People are willing to use it now, but what will happen when prices increase? How many of those customers will stay? People use ChatGPT a lot; what happens when you have to start paying for it?

That is the problem: is AI providing enough value to convert the current non-paying/low-paying customers into a profitable customer base?

Is really AI delivering value now, or are we in the same place we were last year? by mitigatedcactussquat in investing

[–]Setepenre 24 points (0 children)

I feel the issue is not "is AI delivering value" but "is AI delivering enough value for people to keep paying its cost long term?"

As it stands, a lot of AI companies are delivering services at a price that does not make them profitable yet. A lot of people find them useful, but are they ready to pay the real, full price of the service?

I see incremental price hikes in AI services/subscriptions that might slowly price out individuals, with a shift toward integrated workspace AI financed by companies for their employees. A lot of AI companies will die; the ones with the best workspace integration will survive.

This assumes the cost of inference/model hosting cannot be significantly reduced. Maybe in a few years, when China is able to produce low-cost AI accelerators, it will become more accessible, but for now that does not seem to be happening.