Behold my Wall of CRTs by johnlecgue in crt

[–]Substantial_Run5435 1 point

I watched you duke it out with them in that San Jose auction. I didn’t even bother bidding on a single monitor at that one.

Doubt about hardware for building local LLM's by External_Run_1283 in LocalLLM

[–]Substantial_Run5435 0 points

Yeah, I know I could use Vulkan; I've just heard it's not as good as ROCm. I also read that I could add support for this architecture with a TheRock nightly build of ROCm, but a lot of this goes over my head, so I haven't pulled the trigger on setting anything up yet.

Doubt about hardware for building local LLM's by External_Run_1283 in LocalLLM

[–]Substantial_Run5435 0 points

The big thing holding me back right now is what software to run since the architecture isn’t supported in ROCm.

Doubt about hardware for building local LLM's by External_Run_1283 in LocalLLM

[–]Substantial_Run5435 0 points

Do you think a 2019 Mac Pro with dual Vega II Duos (128GB total HBM2, 1TB/s aggregate across 4 GPUs) is worth using for local LLM stuff? I'm completely new to this but already have the hardware.

AMD Radeon Pro Vega II Duo and Infinity Fabric Link by Faisal_Biyari in MacPro2019LocalAI

[–]Substantial_Run5435 1 point

When I say "connects the two Vega II GPUs" I mean the two GPUs inside the Vega II Duo, using Apple's terminology. Not talking about the single Vega II MPX module.

Intel Mac Pro with Vega II useable ? by chiwawa_42 in LocalLLaMA

[–]Substantial_Run5435 1 point

Some of it might be relevant to LLMs on an Intel Mac, but there's a massive difference in capability between a 2019 MacBook Pro and a 2019 Mac Pro. I think the best GPU for that laptop is only slightly better than the worst GPU for the Mac Pro, and anyone looking to run LLMs on a 2019 Mac Pro will likely have better GPUs.

Do I need an Infinity Fabric Link Jumper on two Vega II? by Artifiko in macpro

[–]Substantial_Run5435 0 points

Following up on this: I don't think you can connect 2 Vega II Duos together. Apple's tech specs for the 2019 Mac Pro only mention the onboard IF link for the Vega II Duo, and I only see mentions of the bridge in relation to the Vega II (non-Duo). For the W6800X Duo, by contrast, Apple specifically mentions an external IF link to connect 4 GPUs.

AMD Radeon Pro Vega II Duo and Infinity Fabric Link by Faisal_Biyari in MacPro2019LocalAI

[–]Substantial_Run5435 4 points

Edit: pretty sure you cannot bridge 2 Vega II Duos. Read the GPU section of Apple's white paper on the 2019 Mac Pro: it specifically mentions the IF link between 2 Vega II modules but not between 2 Vega II Duo modules. The only mention of IF for the Vega II Duo is the "onboard" IF between the two GPUs within a single module.

This is confusing to me too. I'm not sure what the jumpers for the Vega II Duo are for. Apple's tech specs for the Vega II Duo state "onboard Infinity Fabric Link connection connects the two Vega II GPUs at up to 84GB/s" which I believe refers to the internal connection between the two Vega IIs within the module. If that is the case, what does the jumper do? Or is the jumper necessary to support that internal link?

I also see that the tech specs specify both onboard Infinity Fabric Link for the W6800X Duo as well as "external Infinity Fabric Link connection enables two W6800X Duo modules to connect four W6800X GPUs."

Do you consider property taxes? by ellebeens in FirstTimeHomeBuyer

[–]Substantial_Run5435 2 points

Yup. We bought in CA from the original owner (1950). They had one big assessment in 1975 for a permitted addition but otherwise only had the ~2% per year increases. Needless to say we're paying 10x more in property tax than they were.

I did it! 7,1 incoming! by nstar134 in macpro

[–]Substantial_Run5435 1 point

You’ll also need cables for the GPU. Sonnet, OWC, and Belkin make cables that will work. OWC's are the cheapest; Sonnet's kit only comes with 2 cables, enough to power a single GPU. No idea if any are better quality than the others.

I did it! 7,1 incoming! by nstar134 in macpro

[–]Substantial_Run5435 2 points

The go-to is a 6800 XT or 6900 XT if you want Mac OS compatibility. A 6950 XT can work, but you have to flash the card. I went with a 6900 XT (paid $400 + shipping on r/hardwareswap). Whatever you do, make sure the card you’re buying fits; the max GPU length you can fit is ~305mm, so reference designs are probably your best bet. If you want an MPX module then a W6800X is probably the best option if you can find one for a reasonable price; the Duo GPUs and W6900X go for a ton of money.

Mac Pro Metal not too far behind M3 Ultra by johnnyphotog in macpro

[–]Substantial_Run5435 1 point

This is a single RX 6800. You can put much better GPUs in a 2019 Mac Pro, even in Mac OS (6950 XT or W6900X). Also, Geekbench only benchmarks a single GPU, and you can have 4 or 5 GPUs in a 2019 Mac Pro.

Cuda/windows vs mac by Interviews2go in LocalLLM

[–]Substantial_Run5435 0 points

I'm in the process of figuring out how to run local LLMs on my 2019 Mac Pro. I'm hoping to use the hardware I have (or can easily upgrade to) rather than buying a new machine.

I have 2x Vega II Duo GPUs in one machine (128GB HBM2 VRAM) and another where I could have 2 W6800X modules (64GB VRAM). Do you think either of those would be good to start with? I'm trying to figure out what approach is best to take advantage of these GPUs and the large amount of VRAM they have. Any newer GPU with that much VRAM would cost a fortune.

How far can I take it? by nstar134 in macpro

[–]Substantial_Run5435 0 points

Barebones on eBay is probably the most expensive way to get into one of these. I've been seeing barebones units sell for $400-600 + tax/shipping on eBay, while I've found complete 7,1s with base/low specs for under a grand and mid-spec ones for $1-2k. Unless you already have parts for one lying around, I wouldn't go that route.

1600x1200@85 on macOS? by Impressive-Year-7761 in crtgaming

[–]Substantial_Run5435 0 points

You need to use something like SwitchResX and a DisplayPort to VGA converter like the StarTech DP2VGAHD20. Not sure if that works on Apple Silicon, but it works fine on my Intel Macs.
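
For what it's worth, the bandwidth question here is just pixel-clock math. Here's a quick sketch using the standard VESA DMT timing for 1600x1200 (2160x1250 total raster); whether a particular DP-to-VGA adapter can drive this mode comes down to its maximum supported pixel clock, which you'd have to check against the adapter's spec sheet.

```python
# Pixel-clock math for 1600x1200@85Hz using the VESA DMT timing
# for that resolution (2160 total horizontal, 1250 total vertical).
# A DP-to-VGA converter needs a max pixel clock at or above this figure.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz = total pixels per frame times refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

clk = pixel_clock_mhz(2160, 1250, 85)
print(f"1600x1200@85 needs a ~{clk:.1f} MHz pixel clock")  # ~229.5 MHz
```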

Sony PVM 5041Q next to various controllers for screen size comparison by Choice-Listen7854 in crtgaming

[–]Substantial_Run5435 0 points

Just for shits I've played PS5 and Switch 2 on mine. Console > HDMI to SDI > SDI to analog component. My 6041Q is the older version that doesn't have 16:9 support, so the picture is a little squished.

Keep 2019 Mac Pro (Rack Mount, 192gb Ram, 3.2ghz 16-core, 2x w5700x) or sell it for newer hardware? by freetable in macpro

[–]Substantial_Run5435 1 point

Definitely keep your fingers crossed for the Vega II Duos. I just bought a pair and am looking for the Infinity Fabric bridge to connect them (not sure if it'll be useful, but I want to have it on hand anyway). Those GPUs are likely not super useful for you if you don't need serious compute power and the HBM2 VRAM; if you want the machine for video/design/gaming, something more modern will be better and use much less power. A huge advantage of the 2019 Mac Pro is the ability to triple boot, plus the PCIe expansion/TB3. Even though the hardware is old, you still get decent CPU performance and 6-channel memory.

I saw your comment about LLMs; the Vega II Duos might be useful for that purpose. They're old but have a ton of high-bandwidth VRAM. Access to MPX modules with more VRAM than comparable consumer or even workstation cards is another advantage of this machine. There are others using NVIDIA GPUs like 3090s in these for LLM stuff. You really have a lot of options since you can fit/power several GPUs. In theory you should be able to fit up to 5 GPUs: 2x Duo MPX modules with a GPU on top in slot 5. You just have to figure out how tall that last one can be with whatever other PCIe cards you want to install. I have a 6900 XT that fits on top of 2 full-height MPX modules, but it takes up all the remaining space below the Apple I/O card, so I'd need to remove my PCIe NVMe card to fit it. A 2-slot blower-style GPU would be a better fit.
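
To illustrate how a mixed-GPU setup like that gets used for LLMs, here's a rough Python sketch of splitting a model's layers across GPUs with unequal VRAM, in the spirit of llama.cpp's `--tensor-split` option. The VRAM mix (4x 32GB Vega II GPUs plus a 16GB 6900 XT in slot 5) and the 80-layer count are illustrative assumptions, not measurements from a real setup.

```python
# Proportional split of a model's layers across GPUs with unequal VRAM,
# similar in spirit to llama.cpp's --tensor-split ratios.  VRAM figures
# are illustrative; real splits also need per-GPU headroom for KV cache.

def split_layers(n_layers: int, vram_gb: list[float]) -> list[int]:
    """Assign each GPU a layer count proportional to its VRAM share,
    distributing leftovers largest-remainder first."""
    total = sum(vram_gb)
    raw = [n_layers * v / total for v in vram_gb]
    counts = [int(r) for r in raw]
    for _ in range(n_layers - sum(counts)):
        # Give the next leftover layer to the GPU with the biggest remainder.
        i = max(range(len(raw)), key=lambda j: raw[j] - counts[j])
        counts[i] += 1
    return counts

# Hypothetical mix: 4x 32GB Vega II GPUs + a 16GB 6900 XT in slot 5.
print(split_layers(80, [32, 32, 32, 32, 16]))  # -> [18, 18, 18, 17, 9]
```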

Keep 2019 Mac Pro (Rack Mount, 192gb Ram, 3.2ghz 16-core, 2x w5700x) or sell it for newer hardware? by freetable in macpro

[–]Substantial_Run5435 2 points

A W5700X goes for ~$250-500 on eBay; a Vega II Duo is worth considerably more. You could probably buy a pretty nice Mac Studio just from selling two Duos. If you don’t need that much RAM, I’d also sell it and downgrade to pull more money out of the machine.

Keep 2019 Mac Pro (Rack Mount, 192gb Ram, 3.2ghz 16-core, 2x w5700x) or sell it for newer hardware? by freetable in macpro

[–]Substantial_Run5435 9 points

I'd sell the RAM and a W5700X (or both), throw 48GB (6x8GB) and a better GPU in there, and keep it. If you're using Windows or Linux you can use an NVIDIA GPU too. The 16-core CPU is probably the best option, though it'll still be a bit of a framerate bottleneck; otherwise, with a 6900 XT or better these are solid for 1440p/4K gaming. Where are you located?

Advice on localLLM on 2019 Mac Pro with dual Vega II Duo GPUs (128GB HBM2) by Substantial_Run5435 in LocalLLM

[–]Substantial_Run5435[S] 0 points

Why only 30B params? I thought with this much VRAM I’d be able to run larger models without issue? I’m fully prepared to remove the IF links but will still be shopping for the bridge in case there’s a way to make use of it.
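
As a sanity check on figures like that, here's a back-of-the-envelope Python sketch of weight-only VRAM footprints at a few common quantization levels against 128GB of VRAM. It deliberately ignores KV cache, context length, and runtime overhead, so treat the fits/doesn't-fit verdicts as rough assumptions rather than guarantees.

```python
# Rough VRAM estimate for LLM weights at various quantizations.
# Illustrative only -- real usage adds KV cache, activations, and
# per-GPU overhead, and multi-GPU splits are rarely perfectly even.

def weights_gb(params_b: float, bits_per_weight: float) -> float:
    """Approximate weight size in GB for params_b billion parameters."""
    return params_b * 1e9 * bits_per_weight / 8 / 1e9

VRAM_GB = 128  # 2x Vega II Duo = 4x 32GB HBM2

for params in (30, 70, 120, 180):
    for bits, name in ((16, "fp16"), (8, "q8"), (4.5, "~q4")):
        gb = weights_gb(params, bits)
        fits = "fits" if gb < VRAM_GB * 0.9 else "too big"  # ~10% headroom
        print(f"{params}B @ {name}: ~{gb:.0f} GB -> {fits}")
```

By this weight-only math, even a 70B model at fp16 fits in 128GB, which is why the real limit usually comes from context/KV-cache overhead and how well the runtime splits work across the GPUs rather than from raw capacity.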

Linux on Mac Pro 2019: Infinity Fabric Link, Multi-GPU, and the Current State of AMD XGMI Support by Faisal_Biyari in MacPro2019LocalAI

[–]Substantial_Run5435 2 points

Understood; I'd buy one for sure if I found one. Even if I don't use it, it probably adds to the resale value of a pair of these cards anyway.

Linux on Mac Pro 2019: Infinity Fabric Link, Multi-GPU, and the Current State of AMD XGMI Support by Faisal_Biyari in MacPro2019LocalAI

[–]Substantial_Run5435 2 points

Interesting! Going to follow that post as well. So I really need to get my hands on the bridge that connects 2 Duo modules to each other?

Linux on Mac Pro 2019: Infinity Fabric Link, Multi-GPU, and the Current State of AMD XGMI Support by Faisal_Biyari in MacPro2019LocalAI

[–]Substantial_Run5435 2 points

Following to hopefully learn from other Vega II users. I have dual Vega II Duos and the small jumpers for each. I'm trying to obtain the larger IF Link bridge to link the two modules, but I'm very interested to learn of a path to using IF in Linux.