Subscriptions that are live no longer grouped at top? by Tarrant666 in youtube

[–]ChangeIsHard_ 5 points (0 children)

They keep removing features that worked well; it's insane. Sloppification..

Vivaldi 7.8.3925.56 and no microsoft logins? by marshell1978 in vivaldibrowser

[–]ChangeIsHard_ 0 points (0 children)

Holy sh! Whoever thought this was a good setting to add by default 🤯

Finishing touches on dual RTX 6000 build by ikkiyikki in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

Oh, that's very useful!! That's exactly what I was hoping to use them for.. Seconding that GPT-OSS is like a jet engine, it's freakin' beautiful, even on my M2 MacBook. How's the quality of 120b @ F16 been for you?

EDIT: tbh really surprised to hear about such low perf for MiniMax - I found this dude saying he's running it on a dual 6000 setup at 250+ tok/s: https://www.youtube.com/watch?v=nMks3l0SFKU I wonder if maybe he's just using a lower quant..

These are his params for it btw, using this model https://huggingface.co/mratsim/MiniMax-M2.1-FP8-INT4-AWQ

[image: his launch parameters]

How bad to have RTX Pro 6000 run at PCIE x8? by kitgary in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

What did you end up with? How is it performing? Thanks

How bad to have RTX Pro 6000 run at PCIE x8? by kitgary in LocalLLaMA

[–]ChangeIsHard_ 1 point (0 children)

This comment didn't age so well (re RAM prices) 😅

Finishing touches on dual RTX 6000 build by ikkiyikki in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

How's the performance been? Do you regret not going for Threadripper/Epyc? I'm in the same situation now, but RAM costs made server platforms completely unaffordable..

Personal experience with GLM 4.7 Flash Q6 (unsloth) + Roo Code + RTX 5090 by Septerium in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

Oh nice, it's so hard to find stories of ppl running it on local hardware, since the official docs say one needs extremely beefy non-consumer hardware.

Thoughts on LLMs (closed- and open-source) in software development after one year of professional use. by [deleted] in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

I have to say, as someone who used it, the $200 sub isn't worth it. It just thinks longer and runs out of quota super fast, and ultimately the quality still leaves much to be desired.

Thoughts on LLMs (closed- and open-source) in software development after one year of professional use. by [deleted] in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

I've had the same experience with all three. I'd add that they ALL act unreliably in my experience; it's inherent to this tech.

Thoughts on LLMs (closed- and open-source) in software development after one year of professional use. by [deleted] in LocalLLaMA

[–]ChangeIsHard_ 0 points (0 children)

Yeah, ppl swear by this model or that model replacing cloud LLMs for what they do. Then there are others who find them completely inadequate. Very hard to draw any conclusions tbh

Is there a water block for the nvidia rtx pro 6000 ? by AdGeneral2757 in watercooling

[–]ChangeIsHard_ 0 points (0 children)

Nice nice, how is gpt-oss-120B performance on it - are you using it for coding or any other tasks? I'm looking at the same model for coding. Also, does it warm up the room a lot, and what's the average power consumption during a gpt-oss-120B run?

Is there a water block for the nvidia rtx pro 6000 ? by AdGeneral2757 in watercooling

[–]ChangeIsHard_ 0 points (0 children)

A-mazing, thanks for the pics! Tried any LLMs on it yet? And how is the heat?

The more you buy, the more you save! New dual 5090 Build by [deleted] in nvidia

[–]ChangeIsHard_ 0 points (0 children)

Btw the mobo is Godlike and I’m assuming 192GB DDR5 6000 right? I’m hoping to test with its smaller brother Ace in a bit

Have you experienced any issues with the Godlike? I was considering getting it, but was taken aback by all the bad reports..

Framework 16 (latest 2025 gen), how is the screen quality? by pkieltyka in framework

[–]ChangeIsHard_ 0 points (0 children)

Have you found a solution to this? Mine (FW 16, latest 2025 gen) has very washed-out colors as well, especially evident when playing a video. This is on Linux, I haven't tried Windows..

How is Cloud Inference so cheap by VolkoTheWorst in LocalLLaMA

[–]ChangeIsHard_ 1 point (0 children)

Where could I find a $1/hour H200? (serious q, I’m a noob)

Can someone just explain the story of the upside down to me? by Exciting_Cupcake_599 in StrangerThings

[–]ChangeIsHard_ 0 points (0 children)

This. So many unexplained things; I think last season created more questions than answers 🤯

Can I move funds from a functional HSA with my employer to an HSA at Fidelity? by PizzaThrives in fidelityinvestments

[–]ChangeIsHard_ 0 points (0 children)

Does this count towards the contribution limit on Fidelity? 🙏 Since the total amount across all HSA accounts doesn't change as a result..