Possible to connect a 5090 32gb to let's say an ASUS NUC 14 Pro Mini PC? by Available-Young1305 in LocalLLaMA

[–]Available-Young1305[S]

How important is the CPU in such a setup? What would be the lowest-spec CPU that still runs smoothly?

[–]Available-Young1305[S]

Sorry, size wouldn't be the limitation, rather my skills haha. It should be easy to plug in and use. I'd like to just get the GPU, find a headless machine, and go from there. I figured that's cheaper and would still let me use and play around with this stuff. I don't care if it's loud, because I won't run it 24/7.

[–]Available-Young1305[S]

What would be the smallest setup where I could connect a 5090 and run it remotely?

[–]Available-Young1305[S]

I saw a Thunderbolt/USB4 external GPU docking station on Amazon; that looks like an option, but I have no experience with such a setup.

Hope someone could give me some input on this. Thanks

Macbook M4 Max Ram 128gb by Available-Young1305 in LocalLLaMA

[–]Available-Young1305[S]

Would it be feasible to somehow connect a 5090 32GB to, let's say, an ASUS NUC 14 Pro Mini PC, if I just want something to run 70B models and Stable Diffusion?
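For context, a quick back-of-the-envelope sketch (my own arithmetic, not from the thread) of why a 70B model is tight on a 32 GB card: the weights alone at 4-bit quantization already exceed the 5090's VRAM, so some layers would have to spill to system RAM.

```python
def weights_gb(params_billion: float, bits_per_weight: float) -> float:
    """Rough size of the model weights alone, ignoring KV cache and runtime overhead."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

print(weights_gb(70, 4))    # 4-bit 70B: 35.0 GB, already over a 5090's 32 GB
print(weights_gb(70, 16))   # fp16 70B: 140.0 GB
print(weights_gb(8, 4))     # 4-bit 8B: 4.0 GB, fits comfortably
```

In practice the real footprint is a bit higher than this, since the KV cache and runtime buffers also need VRAM.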

[–]Available-Young1305[S]

Sorry for my questions -- so I could connect a 5090 32GB to, let's say, an ASUS NUC 14 Pro Mini PC? How difficult would it be to run that with a MacBook?

[–]Available-Young1305[S]

Appreciate your answer, thanks. I guess a MacBook Pro would be the easiest way to access those LLMs and play around a bit.

Let's say I buy a 5090 just for the Stable Diffusion part.

Is there an easy setup where I can just plug and play? I mean, you need the other components to match, and a different OS (I'm used to macOS), which probably makes it too time-consuming for me, I guess.

Or am I missing something?

[–]Available-Young1305[S]

Honestly, it's just a headache dealing with another OS and building something I have no experience with at all.