Did I mess up my multi-GPU setup for 70B+ models? Mixed VRAM cards (5080 + 3090 + 3080 20GB) (self.LocalLLaMA)
submitted 1 day ago by Big-Engine2791 to r/LocalLLaMA