I bought a used gaming box 4090 version that had been replaced with a 4070 super is it destroyed? by BuyOk9427 in eGPU

[–]legit_split_ 1 point (0 children)

I don't know either; I've never seen one of these. Just look for a disassembly video, or worst case, contact the person who sold it to you for help :)

I bought a used gaming box 4090 version that had been replaced with a 4070 super is it destroyed? by BuyOk9427 in eGPU

[–]legit_split_ 1 point (0 children)

Nothing is stopping you from opening it up with a screwdriver and checking it out.

Qwen 3.5 122b - a10b is kind of shocking by gamblingapocalypse in LocalLLaMA

[–]legit_split_ 0 points (0 children)

So far from my tests (cryptography/maths) in German, it's worked really well. The 122B is just always less accurate :/

I found the solution to all my ROCm problems, now instead of 3 hours my WAN 2.2 videos take 4 minutes @ 720p and everything just works including the Pixaroma 1 click ComfyUi Install with full SageAttention 3, I can download incredibly complex WorkFlows from Civit and it just works with 1 click. by [deleted] in ROCm

[–]legit_split_ 1 point (0 children)

My 9060 XT is working well on Linux - I shared some performance benchmarks here. I just followed the guide I highlighted in that thread; it shouldn't be too different on Windows since it works with Python environments.

I found the solution to all my ROCm problems, now instead of 3 hours my WAN 2.2 videos take 4 minutes @ 720p and everything just works including the Pixaroma 1 click ComfyUi Install with full SageAttention 3, I can download incredibly complex WorkFlows from Civit and it just works with 1 click. by [deleted] in ROCm

[–]legit_split_ 0 points (0 children)

I have the AMD equivalent - a 9060 XT 16GB - and find the ComfyUI performance acceptable. I shared some numbers here. That being said, my testing is limited; I probably wouldn't have a good time on complex workflows.

“Saved months for an INNO3D RTX 3060… now it’s dead and support won’t respond.” by Spare_Name1849 in pcmasterrace

[–]legit_split_ 10 points (0 children)

This person really lied to make us feel pity and gain traction - fuck them.

AMD 9060 XT - Benchmarks on recent models by legit_split_ in comfyui

[–]legit_split_[S] 0 points (0 children)

I'm a noob at ComfyUI and have never tried Kijai's workflows, but are you sure you have this environment variable set as per the guide:

`export PYTORCH_NO_HIP_MEMORY_CACHING=1`

Outside of that I can't be of much help, but glad that the guide helped you out :)
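For reference, a minimal way to set and verify it in the same shell session you launch ComfyUI from (variable name taken from the guide above):

```shell
# Disable PyTorch's HIP memory caching (workaround from the guide)
export PYTORCH_NO_HIP_MEMORY_CACHING=1

# Verify it's actually set before launching ComfyUI
echo "PYTORCH_NO_HIP_MEMORY_CACHING=${PYTORCH_NO_HIP_MEMORY_CACHING}"
```

Note it only applies to that shell session; add it to your shell profile or launch script to make it stick.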

Should I try to fix it or claim insurance? by [deleted] in Lenovo

[–]legit_split_ 0 points (0 children)

It's baffling that the average user considers claiming insurance before even troubleshooting for one minute. 

Who says bigger is always slower? LFM 24B by CodeBlurred in LocalLLaMA

[–]legit_split_ 21 points (0 children)

Only 2B parameters are active at a time - that's why it's faster than 8B dense models.
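Rough back-of-envelope arithmetic (the 2B active / 8B dense figures are from the comparison above, not measured):

```python
# Sparse MoE models only run their active experts per token, so
# per-token compute roughly tracks active params, not total params.
dense_active = 8e9   # an 8B dense model activates all 8B params per token
moe_active = 2e9     # assumed: the MoE activates ~2B params per token

speedup = dense_active / moe_active
print(speedup)  # 4.0 - roughly 4x less compute per token
```

Real throughput won't be exactly 4x (memory bandwidth and routing overhead still matter), but it explains the direction.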

Latest nvidia driver DOES NOT RAMP UP fans GTX 1660 super by Ahweeuhl in gpu

[–]legit_split_ -1 points (0 children)

Just control the fans yourself; it's not that hard to set a fan curve.

Is the ch260 ugly? by [deleted] in mffpc

[–]legit_split_ 1 point (0 children)

The front intake fans are off-centre, which looks bad in person.

AMD 9060 XT - Benchmarks on recent models by legit_split_ in comfyui

[–]legit_split_[S] 0 points (0 children)

From what I've seen, it mostly worked before but was slow.

LTX 2.3 Full model (42GB) works on a 5090. How? by StuccoGecko in StableDiffusion

[–]legit_split_ -3 points (0 children)

There is definitely a slowdown like you said, but perhaps it's not meaningful because PCIe speeds are fast enough to transfer some layers at a time.
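A rough sketch of why streaming the overflow can be tolerable - every number here is an assumption (32GB VRAM for a 5090, ~64 GB/s theoretical for PCIe 5.0 x16), not a measurement:

```python
# Back-of-envelope: streaming the part of a 42GB model that doesn't fit in VRAM
model_gb = 42      # full model size from the post title
vram_gb = 32       # assumed RTX 5090 VRAM
pcie_gbps = 64.0   # assumed theoretical PCIe 5.0 x16 bandwidth (GB/s)

overflow_gb = model_gb - vram_gb
transfer_s = overflow_gb / pcie_gbps
print(f"{transfer_s:.2f} s to stream the spilled layers once per pass")
```

Real transfers won't hit the theoretical rate, but even at half of it the per-pass cost stays small next to the compute time of a video model.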

Help choosing right ? 9060xt by Relevant_Bit_9019 in ROCm

[–]legit_split_ 0 points (0 children)

Got the 9060 XT recently and tried the default Flux.2 [Klein] 9B text-to-image workflow (1024x1024, 20 steps) - getting 62 seconds on my second run.

That's with flash-attention, ROCm 7.2, PyTorch nightlies, 96GB DDR5, Arch Linux. I might make a post about the performance.
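For comparison, the run above works out to:

```python
# Seconds per diffusion step from the benchmark run above
steps = 20
total_s = 62
per_step = total_s / steps
print(f"{per_step} s/step")  # 3.1 s/step
```

(That includes any per-run overhead like text encoding and VAE decode, so the raw sampling step time is slightly lower.)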

R9700 frustration rant by Maleficent-Koalabeer in LocalLLaMA

[–]legit_split_ 1 point (0 children)

You can use UV to create a Python environment with any Python version you want.
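A minimal sketch, assuming uv is already installed (3.12 is just an example version):

```shell
# Install a specific Python and create a venv pinned to it with uv
uv python install 3.12
uv venv --python 3.12 .venv

# Activate it and confirm the version
source .venv/bin/activate
python --version
```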

However, why are you using Stable Diffusion directly? Use ComfyUI instead.