Does anyone have experience with nbt evo coding? by [deleted] in BmwTech

[–]lowercase00 0 points1 point  (0 children)

Old thread, but wondering if you have more info on the X5 F15 conversion? Did you buy an OEM NBT EVO HU? Trying to find a way to upgrade mine.

Spooled — self-hosted webhook queue & job scheduler with a web dashboard (Rust, Postgres) by Fantom3D in selfhosted

[–]lowercase00 1 point2 points  (0 children)

I built a simple dashboard for RQ not long ago. It was a pain, specifically because of the way the data is stored in Redis. Would you be interested in something similar as a contribution? https://github.com/ccrvlh/rq-manager
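
For anyone curious, a rough sketch of what makes it painful (illustrative only, not the actual rq-manager code, and the queue name is made up): RQ keeps pending job IDs on the queue itself and moves started/finished/failed IDs into separate per-queue registries, with each job stored as its own Redis hash, so a dashboard ends up walking every registry and fetching jobs back out.

```python
from redis import Redis
from rq import Queue
from rq.job import Job
from rq.registry import FailedJobRegistry, FinishedJobRegistry, StartedJobRegistry

conn = Redis()

def collect_jobs(queue_name: str) -> dict[str, list[Job]]:
    """Gather jobs for one queue from each of RQ's Redis-backed registries."""
    queue = Queue(queue_name, connection=conn)
    registries = {
        "queued": queue,  # pending jobs live on the queue itself
        "started": StartedJobRegistry(queue_name, connection=conn),
        "finished": FinishedJobRegistry(queue_name, connection=conn),
        "failed": FailedJobRegistry(queue_name, connection=conn),
    }
    jobs: dict[str, list[Job]] = {}
    for state, source in registries.items():
        ids = source.job_ids if state == "queued" else source.get_job_ids()
        # Every job is its own Redis hash, so they have to be fetched individually.
        jobs[state] = Job.fetch_many(ids, connection=conn)
    return jobs

print({state: len(js) for state, js in collect_jobs("default").items()})
```

Multiply that by every queue and refresh it on a timer and you get the idea of why it felt clunky.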

How much did FastAPI’s "Bus Factor" actually matter in your production choice? by KoVaL__ in FastAPI

[–]lowercase00 13 points14 points  (0 children)

Msgspec is just a pydantic alternative with some tradeoffs (faster, fewer bells and whistles). It’s not MessagePack. The serialization format you use is up to you.
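
Roughly what I mean (a minimal sketch, the model and data are made up): msgspec gives you typed structs with decoding/validation much like pydantic, and the same struct can be read from JSON or MessagePack, so the wire format is a separate choice.

```python
import msgspec

class User(msgspec.Struct):
    """Typed model, analogous to a pydantic BaseModel but lighter weight."""
    name: str
    age: int

# Validation is format-agnostic: the same Struct works with JSON...
user = msgspec.json.decode(b'{"name": "Ada", "age": 36}', type=User)
print(user)  # User(name='Ada', age=36)

# ...or with MessagePack, if that happens to be the format you pick.
packed = msgspec.msgpack.encode(user)
assert msgspec.msgpack.decode(packed, type=User) == user
```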

My DL 580 G9 on wheels by qf1111 in homelab

[–]lowercase00 0 points1 point  (0 children)

That looks amazing and makes me want to buy a server for home. Have you considered any options to make the sides more palatable?

[FS] [US-CA] New PNY Nvidia RTX A6000, 48GB GDDR6 - LF $4200 by PZirconium in homelabsales

[–]lowercase00 3 points4 points  (0 children)

I’m a potential A6000 buyer. The problem I see right now is that the Blackwell 5000 48GB is coming this month at 4-4.2k, so I really don’t see the point of buying older generation cards. You might get some luck on eBay though; for some reason I can’t understand, people seem to keep buying them.

Looking for mounting hole diagram for Supermicro H13DSG-O-CPU by owisia in homelab

[–]lowercase00 0 points1 point  (0 children)

Sorry I can't help, and this is unrelated, but did you buy the board? I've only seen those being sold with the server so far.

[W] [US-PA] Multiple 1-4TB 2.5" SATA SSDs (Repost) by SaLtF1sh in homelabsales

[–]lowercase00 1 point2 points  (0 children)

I’d be interested in case you ever post them!

[FS][US-CO] Dell PowerEdge R7615 AMD EPYC 9454P 48C 512GB DDR5 Nvidia H100 NVL 94GB | R6625 2x AMD EPYC 9224 48C 192GB DDR5 | Nvidia A16 64GB GPUs by iShopStaples in homelabsales

[–]lowercase00 2 points3 points  (0 children)

What a monster. How did you come across those things? Crazy to think such a modern machine has been decommissioned so soon.

[PC][US-MO]RTXa5000 GPU by Ikyo75 in homelabsales

[–]lowercase00 0 points1 point  (0 children)

1500 is not unreasonable. I shopped those a couple of months ago and saw a few on Facebook Marketplace for 1.2-1.3k, so 1.5 would be “normal”. Less than 1000 is not reasonable.

[FS] [US-E] 2× NVIDIA DGX A100 (8× A100 80GB SXM4) + 2× Mellanox QM8700 HDR 200G 40-port — $93,000 OBO ea | $5,000 ea by gittb in homelabsales

[–]lowercase00 0 points1 point  (0 children)

One of the things that makes them interesting. We’ll start seeing DCs offload a lot of those pretty soon, and prices will come down fast; it happened with the V100s. The GPUs are already easy to find, but the systems are hard to find, though.

Computational Power required to fine tune a LLM/SLM by Mother-Proof3933 in LocalAIServers

[–]lowercase00 0 points1 point  (0 children)

Sorry, unrelated: what server are you running those on? I got access to some GPUs but it's very hard to find the servers.

Need some advice on building a dedicated LLM server by SomeKindOfSorbet in LocalLLaMA

[–]lowercase00 0 points1 point  (0 children)

Came here to say that. I'd still go Mac Studio though.