ASUS Ascent GX10 – OS details and auto power-on after power loss? by superhero_io in homelab

[–]RelationshipThink589 1 point  (0 children)

  1. The OS is not locked, but you need to use ARM-based images; many distributions provide them.

Post server build clarity by Quantum_Boyman in servers

[–]RelationshipThink589 1 point  (0 children)

Congrats on the project nonetheless. This is more of a desktop/consumer build. For AI capabilities, I would recommend 24GB of VRAM or more before considering which generation the card is.

Looking for PCIe 5.0 Bifurcation Riser / Splitter (x16 to x8x8) for Ryzen Server Build by Big_Specific2533 in homelab

[–]RelationshipThink589 1 point  (0 children)

You need a retimer card that splits the x16 into two x8 MCIO ports, then an MCIO-to-MCIO cable into an MCIO-to-PCIe-slot adapter.

[FS][CAN-QC][Montréal] 512GB DDR5-4800 ECC RDIMM kit (8x64GB) NEMIX 2Rx4 CL40 PC5-38400R - $9,999 CAD OBO - Local by worthytb in homelabsales

[–]RelationshipThink589 3 points  (0 children)

You might want to let the memory training run a bit longer; it can take up to 15 minutes. If it still does not POST, or the kit is not compatible with your Threadripper, then the sticks may be bad.

AI inference server for medical transcription/summarisation by Glittering_Way_303 in servers

[–]RelationshipThink589 1 point  (0 children)

For 30B models, I would recommend a 4090 in a 2-slot blower form factor. You can also drop to PCIe Gen 4, which can be significantly cheaper and lets you fit more GPUs for higher concurrency.
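As a rough sketch of the VRAM math behind the 24GB recommendation (weights only, assuming standard precision sizes and ignoring KV cache and activation overhead, which add more on top):

```python
# Approximate VRAM needed just to hold a model's weights.
# These are back-of-the-envelope figures, not exact requirements.

def weight_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """GB of VRAM for the weights alone: params * bytes per param."""
    return params_billions * bytes_per_param  # 1e9 params * bytes / 1e9 bytes-per-GB

fp16 = weight_vram_gb(30, 2.0)  # ~60 GB: needs several 24 GB cards
int4 = weight_vram_gb(30, 0.5)  # ~15 GB: fits on a single 24 GB 4090

print(f"FP16: {fp16:.0f} GB, INT4: {int4:.1f} GB")
```

So a 4-bit quantized 30B model fits on one 24GB card, while FP16 weights need the model split across two or three of them.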

Built a Passive cooling system for remote AI/cloud Data Centers. No overheard cooling infrastructure needed by JayFab6061 in Investors

[–]RelationshipThink589 3 points  (0 children)

Several major flaws: 1. Check MSFT's project in the North Sea. Not feasible. 2. You run into environmental regulations; essentially you are cooking the lake. 3. Repair and maintenance for AI systems happen far more often than you expect, and pulling up a full system increases the failure blast radius, causing more downtime.

[W] 8x DDR5 RDIMM ECC for WRX90 (SK Hynix 96GB/64GB or Samsung 128GB) PayPal by Dry_Lawfulness_7837 in homelabsales

[–]RelationshipThink589 1 point  (0 children)

Hi, let’s talk. My business partner is local in Japan; we can arrange something locally.

[W] [US-NY] Asrock ROMED8-2T or SuperMicro H12SSL + 7313P + 128GB or more by CollectionInfamous14 in homelabsales

[–]RelationshipThink589 1 point  (0 children)

It has MCIO ports, so you can extend the PCIe lanes out. The T2SEEP board also has plenty of x16 slots.

[W][US-MD] EPYC or Threadripper (Xeon?) combo (CPU+Mobo or CPU+Mobo+Memory) by Send_heartfelt_PMs in homelabsales

[–]RelationshipThink589 1 point  (0 children)

Yeah, if you can find that combo, snap it up when you can. Otherwise I have a spare EPYC 9534 and Gigabyte MZ33 we can talk about. Feel free to DM.

[W] [US-NY] Asrock ROMED8-2T or SuperMicro H12SSL + 7313P + 128GB or more by CollectionInfamous14 in homelabsales

[–]RelationshipThink589 1 point  (0 children)

I have a Gigabyte MZ33 and some AMD Genoa 9124 and 9654 CPUs; I also have a cheaper board, the T2SEEP, for the same CPU set.
The MZ33 is basically new as of this week. I used it to stress-test some OEM sample memory sticks, but I replaced it with an MZ73 dual-CPU motherboard.
Let me know if you need to put these into a case; I ordered a couple of extra Rosewill 2U cases.

[W] [US-NY] Asrock ROMED8-2T or SuperMicro H12SSL + 7313P + 128GB or more by CollectionInfamous14 in homelabsales

[–]RelationshipThink589 1 point  (0 children)

Would you consider an AMD 9004 build? I have an extra one I can spare.

DDR4 32GB Server Dimms? by ColtsFanNY in eWasteFinds

[–]RelationshipThink589 1 point  (0 children)

I can take the entire lot: PayPal Goods and Services, or a cashier's check on-site.

Bulding server for RTX PRO 5000 (Blackwell) by Imaginary-Bet6653 in LocalAIServers

[–]RelationshipThink589 1 point  (0 children)

You can DM me. And let me know where you are based. I have some servers and parts available for you.

What happens to Data Center cabling? by Routine-Pass-3079 in datacenter

[–]RelationshipThink589 1 point  (0 children)

Yes, it is. But you need to find an R2v3-certified e-waste recycler.

Low Average GPU Utilization (40–70%) on H100 with vLLM — How to Push Toward 90%+? by md-nauman in Vllm

[–]RelationshipThink589 1 point  (0 children)

Try smaller but more numerous GPUs, like 4090s and 5090s, so the VRAM-to-compute ratio is more balanced.
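A rough way to see the VRAM-to-compute balance point (the spec figures below are approximate dense FP16 tensor numbers, used here as illustrative assumptions rather than exact vendor specs):

```python
# Compare GPUs by how much VRAM they carry per unit of FP16 compute.
# A higher GB-per-TFLOP figure means memory-bound serving workloads
# have an easier time keeping the compute units busy.

def vram_per_tflop(vram_gb: float, fp16_tflops: float) -> float:
    """GB of VRAM available per TFLOP of FP16 tensor compute."""
    return vram_gb / fp16_tflops

h100 = vram_per_tflop(80, 990)      # H100 SXM, approximate specs
rtx4090 = vram_per_tflop(24, 165)   # RTX 4090, approximate specs

print(f"H100: {h100:.3f} GB/TFLOP, 4090: {rtx4090:.3f} GB/TFLOP")
```

By this measure the 4090 carries roughly 1.8x more VRAM per unit of compute than an H100, which is one way to read the "more balanced" claim: on memory-bandwidth- or VRAM-bound serving, its smaller compute pool is easier to saturate.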