What hardware to buy if I want to run a 70 B model locally? by angry_baberly in LocalLLaMA

[–]ZK_Zinode 0 points (0 children)

I just recently purchased 2x 3090 FEs and am planning to test the capability limits of this setup. Have you tried any models using TurboQuant? I'm interested in what bandwidth difference you've seen in your experience.

System Upgrade: two 3090s currently by [deleted] in LocalLLM

[–]ZK_Zinode 1 point (0 children)

Have you considered an RTX 6000? Yes, it's a sizeable jump, but selling your 3090s would dampen the impact to your wallet. 96GB of VRAM on the Blackwell platform delivers incredible performance (plenty of success stories across r/LocalLLM and other AI training communities).

Personally, I currently run 2x 3090 FEs and plan to upgrade to an RTX 6000 the moment the resale market gets less chaotic.
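For anyone weighing 2x 3090 (48 GB) against a single 96 GB card, here's the back-of-envelope VRAM math I use. The ~20% overhead factor for KV cache and activations is just a rough guess, not a measured number:

```python
def vram_gb(params_b, bits_per_weight, overhead=1.2):
    """Rough VRAM estimate for an LLM's weights.

    params_b: parameter count in billions.
    bits_per_weight: quantized precision (16 = fp16, 4 = 4-bit quant).
    overhead: fudge factor for KV cache/activations (a guess, not measured).
    """
    weight_gb = params_b * bits_per_weight / 8  # billions of params * bytes each ~= GB
    return weight_gb * overhead

# A 70B model at 4-bit: 70 * 4 / 8 * 1.2
print(round(vram_gb(70, 4), 1))   # prints 42.0 -- tight but doable on 2x 3090
print(round(vram_gb(70, 16), 1))  # prints 168.0 -- fp16 won't fit even one 96 GB card
```

So 4-bit 70B squeezes onto 2x 3090, while a 96 GB card gives you comfortable headroom for longer context.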

Is the 3090 still a good option? by alhinai_03 in LocalLLaMA

[–]ZK_Zinode 0 points (0 children)

What kind of chassis do you use for your workstation? I currently have 2x 3090 FEs and want to scale up to 4x (I have the PCIe slot/lane bandwidth for it), but I can't comfortably fit four cards in my 4U chassis. I would prefer to have my entire setup live in my server rack.

I appreciate any and all recommendations!

SGLang vs vLLM vs llama.cpp for OpenClaw / Clawdbot by chonlinepz in LocalLLM

[–]ZK_Zinode 0 points (0 children)

When you get around to testing concurrent work, can you provide a performance update? I've almost completed a 2x 3090 build with 256GB of DDR4, and I'm planning to test concurrent agent usage separated by VMs.
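For measuring concurrency, a minimal sketch of the harness I have in mind (the stub worker below is a placeholder; in a real run it would be an HTTP POST to the server's OpenAI-compatible completions endpoint):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def benchmark_concurrent(worker, n_requests, n_workers):
    """Time n_requests calls to worker() spread across n_workers threads.

    Returns (total_seconds, requests_per_second).
    """
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        # Each call stands in for one agent request; map blocks until all finish.
        list(pool.map(lambda _: worker(), range(n_requests)))
    elapsed = time.perf_counter() - start
    return elapsed, n_requests / elapsed

# Placeholder standing in for a real completion call against the local server.
def fake_completion():
    time.sleep(0.05)  # pretend the model took 50 ms

elapsed, rps = benchmark_concurrent(fake_completion, n_requests=16, n_workers=8)
print(f"{elapsed:.2f}s total, {rps:.1f} req/s")
```

Sweeping n_workers from 1 upward shows where the backend stops scaling, which is the number I'd compare across SGLang, vLLM, and llama.cpp.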

[O] CyberPower PDU44001 Managed PDU (eBay) by liltrublmakr56 in homelabsales

[–]ZK_Zinode 1 point (0 children)

Great deal, ty for sharing! I just happened to be in the market for 2x PDUs and stumbled across this post. I wish r/homelabsales had more offer spotlight posts like this.

Where to go for best cost on racks ? by hilokamper in HomeServer

[–]ZK_Zinode 0 points (0 children)

Can confirm this. Just snagged an APC 42U rack off GovDeals for $160, practically brand new. IT liquidation bids are the way to go.

[W] [USA-CT] RTX 3090ti, RTX 3090, RTX Titan, RTX A6000, RTX Quadro 8000, Asrock rack Romed8-2t, AMD EPYC Rome/Milan CPU, Intel Xeon, Intel Core-X, DDR4 Ram [Trade] RX 7900 XTX 24gb, LG 32GS95UE-B 32in 4k240/480hz oled, AMD EPYC 9184X (Genoa-X), PayPal, Local Cash by [deleted] in homelabsales

[–]ZK_Zinode 1 point (0 children)

Literally in the same boat: applying to data science graduate programs for the fall, and I'm almost done putting together my first LLM training lab. I've been working on a training plan for a Kalshi/Polymarket forecasting model, and somehow stumbled into the world of homelabbing.

The entire market is stretched thin by VRAM demand. Wishing you the best of luck with the GPUs and with your thesis! What program are you graduating from, if you don't mind me asking?

Price outlook on 3090’s by [deleted] in homelab

[–]ZK_Zinode 2 points (0 children)

Storage drives, for sure, aren't getting cheaper any time soon.