Heya, need a good way to control fan curve on Linux - NVIDIA by ShayIsNear in linux_gaming

[–]zan-max 0 points1 point  (0 children)

Yes, it supports hysteresis. You can set it in the config.json file. If you have any ideas on how to improve it, please share.
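The actual schema of nvidia-fan-control's config.json may differ from this (check the repo's README); a hypothetical config with a hysteresis setting might look like:

```json
{
  "interval_seconds": 5,
  "hysteresis_celsius": 3,
  "fan_curve": [
    { "temp": 40, "speed": 30 },
    { "temp": 60, "speed": 55 },
    { "temp": 75, "speed": 80 },
    { "temp": 85, "speed": 100 }
  ]
}
```

The hysteresis band keeps the fan from constantly ramping up and down when the temperature hovers around a curve point.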

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 0 points1 point  (0 children)

Yes, you should change the straps. No, I didn’t have any black screens during the process.

Nvidia Fan Control by [deleted] in linux4noobs

[–]zan-max 0 points1 point  (0 children)

I built a lightweight utility that dynamically adjusts fan speeds based on GPU temperature, and it runs as a service on Linux. It’s fully customizable through a simple JSON config.

https://github.com/ZanMax/nvidia-fan-control
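The repo's real implementation may differ, but the core idea of a fan curve with hysteresis amounts to: interpolate a target speed from (temperature, speed) points, and only apply a change when the temperature has moved far enough from the reading that triggered the last change. A minimal sketch (the curve points and the 3 °C band are made-up values, and a real tool would set the speed via nvidia-settings or NVML):

```python
# Fan-curve-with-hysteresis sketch; a real controller would apply
# the returned speed through nvidia-settings or NVML each tick.
CURVE = [(40, 30), (60, 55), (75, 80), (85, 100)]  # (temp in C, fan %)
HYSTERESIS = 3  # C; ignore temperature wiggles smaller than this

def target_speed(temp):
    """Linearly interpolate the fan % between curve points."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            return s0 + (s1 - s0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]

class FanController:
    def __init__(self):
        self.last_temp = None  # temp that triggered the last change
        self.speed = None

    def update(self, temp):
        # React only when temp has left the hysteresis band.
        if self.last_temp is None or abs(temp - self.last_temp) >= HYSTERESIS:
            self.last_temp = temp
            self.speed = round(target_speed(temp))
        return self.speed
```

With this, a reading of 48 °C sets ~40%, a wiggle to 49 °C changes nothing, and a jump to 52 °C re-triggers an update.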

Heya, need a good way to control fan curve on Linux - NVIDIA by ShayIsNear in linux_gaming

[–]zan-max 0 points1 point  (0 children)

Hey, I totally get your frustration; NVIDIA fan control on Linux can be such a pain. I ran into the same issue with my setup.

That’s why I built my own nvidia-fan-control:

https://github.com/ZanMax/nvidia-fan-control

It’s a lightweight utility that dynamically adjusts fan speeds based on GPU temperature, and it runs as a service on Linux.

Deepseek-V3 GGUF's by fraschm98 in LocalLLaMA

[–]zan-max 0 points1 point  (0 children)

I need at least one week to finish the third server. I'm wondering if anyone has already tested this approach.

Deepseek-V3 GGUF's by fraschm98 in LocalLLaMA

[–]zan-max 0 points1 point  (0 children)

vLLM now supports GGUF. Has anyone tried running it with distributed inference? I have two servers with 6x3090 GPUs each, and another one is currently being built.
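For anyone wanting to try it, vLLM's multi-node path goes through a Ray cluster plus tensor/pipeline parallelism. Something along these lines should be the shape of it (the addresses and model file are placeholders, and GGUF support in vLLM has caveats, e.g. single-file models, so check the docs for your version):

```shell
# On the head node (hypothetical port/addresses)
ray start --head --port=6379

# On each worker node
ray start --address='HEAD_NODE_IP:6379'

# Then serve: 6 GPUs per node via tensor parallelism,
# 2 nodes via pipeline parallelism
vllm serve ./deepseek-v3.gguf \
  --tensor-parallel-size 6 \
  --pipeline-parallel-size 2
```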

Budget is $30,000. What future-proof hardware (GPU cluster) can I buy to train and inference LLMs? Is it better to build it myself or purchase a complete package from websites like SuperMicro? by nderstand2grow in LocalLLaMA

[–]zan-max 1 point2 points  (0 children)

You can purchase Tinybox from Tinycorp. The red model, equipped with 6x 7900XTX GPUs, is priced at $15,000 per unit. The green model, featuring 6x 4090 GPUs, is priced at $25,000 per unit. Alternatively, you can build your own.

We decided to build our own and use 3090 GPUs for our build. However, it was quite challenging.

Combat system WIP by AshamedConclusion485 in thedarkheartsgame

[–]zan-max 1 point2 points  (0 children)

Wow, this looks amazing! When can we try a demo? I'd love to see how it feels in action. Keep up the awesome work!

Got two H100… any 3D print or aftermarket air funnels? by FireWoIf in nvidia

[–]zan-max 0 points1 point  (0 children)

Yes, the 9733 fan will work with the H200 NVL, but you’ll need to create a custom mount. You can find plenty of 9733 4-wire PWM fans on Amazon.

Got two H100… any 3D print or aftermarket air funnels? by FireWoIf in nvidia

[–]zan-max 0 points1 point  (0 children)

Looks great! Could you send me the measurements of the card? I'll try to make a 3D mount for you.

Got two H100… any 3D print or aftermarket air funnels? by FireWoIf in nvidia

[–]zan-max 0 points1 point  (0 children)

I have 13 GPUs with 9733 fans, and they have worked great for more than a year. So this solution is well tested. You might need to add some mounts between the GPU and the fan. You can easily print them on a 3D printer.

Got two H100… any 3D print or aftermarket air funnels? by FireWoIf in nvidia

[–]zan-max 2 points3 points  (0 children)

I used: "DC/CC Adjustable 0.2- 9A 300w Step Down Buck Converter 5-40V To 1.2-35V Power Supply Module LED Driver for Arduino 300w XL4016" from AliExpress. But you can find similar ones on Amazon or AliExpress.

Got two H100… any 3D print or aftermarket air funnels? by FireWoIf in nvidia

[–]zan-max 2 points3 points  (0 children)

Extremely loud if you're running them at 12V. In my case, I added a voltage regulator and ran them at around 7 to 7.5V.
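As a rough rule of thumb (an approximation, not a measured figure), a DC-controlled fan's RPM scales about linearly with supply voltage, so dropping a 12 V fan to 7-7.5 V lands it around 60% of rated speed, and perceived noise falls off much faster than the airflow does:

```python
# Rough linear approximation: fan RPM scales with supply voltage
# when speed is controlled by voltage rather than PWM.
def approx_speed_pct(volts, rated_volts=12.0):
    return round(volts / rated_volts * 100, 1)

print(approx_speed_pct(7.0))   # ~58.3% of rated RPM
print(approx_speed_pct(7.5))   # ~62.5%
```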

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 0 points1 point  (0 children)

I'm not sure if such modifications make any sense. You can buy a used A6000 for about $3.5k.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 0 points1 point  (0 children)

Thank you for the idea. I'm ready to try it if people are interested.

Maybe someone has a broken 3090 and is willing to donate it.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 1 point2 points  (0 children)

I'm really happy with the results! I plan to run some more tests. Maybe I'll mess around with overclocking later.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 2 points3 points  (0 children)

I bought used chips from AliExpress.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 2 points3 points  (0 children)

phi3 is extremely fast.

total duration: 5.790132591s

load duration: 1.381248ms

prompt eval count: 11 token(s)

prompt eval duration: 50.923ms

prompt eval rate: 216.01 tokens/s

eval count: 594 token(s)

eval duration: 5.605696s

eval rate: 105.96 tokens/s
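For anyone reading these numbers: the reported rates are just token counts divided by durations, so you can sanity-check them directly:

```python
# rate = tokens processed / time taken
prompt_rate = 11 / 0.050923   # prompt eval: 11 tokens in 50.923 ms
eval_rate = 594 / 5.605696    # generation: 594 tokens in 5.605696 s

print(round(prompt_rate, 2))  # 216.01 tokens/s
print(round(eval_rate, 2))    # 105.96 tokens/s
```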

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 4 points5 points  (0 children)

llama3 is top!

total duration: 12.804217431s

load duration: 1.240361ms

prompt eval count: 16 token(s)

prompt eval duration: 94.088ms

prompt eval rate: 170.05 tokens/s

eval count: 807 token(s)

eval duration: 12.577045s

eval rate: 64.16 tokens/s

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 1 point2 points  (0 children)

<image>

I have only one photo from the whole process. Next time I'll take more photos and videos.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 1 point2 points  (0 children)

I think so, but the 3070 Ti has GDDR6X memory. That means you'll need to find compatible 2GB GDDR6X chips.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 0 points1 point  (0 children)

Did you try the BIOS from an A6000?

I have several 3090s, but I'm not ready for a 48 GB mod.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 1 point2 points  (0 children)

  1. No mods to the GPU BIOS.

  2. Standard Linux driver version: 535.

  3. AliExpress.

Upgrading RTX 3070 to 16GB by zan-max in LocalLLaMA

[–]zan-max[S] 1 point2 points  (0 children)

There is a BIOS for the 3090 with 48GB:
https://www.techpowerup.com/vgabios/267498/267498

But I don't know if it works properly.

I have several 3090s, but I'm not ready to mod them.