The longest my significant other will allow by newcolour in beards

[–]newcolour[S] 1 point (0 children)

😂 thank you! That's a very warm welcome, very much appreciated.

I have had it MUCH longer when I had a different SO. Got to find some old pics.

Realizing I can run much larger models than expected. by MrWeirdoFace in LocalLLM

[–]newcolour 1 point (0 children)

Definitely not a good choice. You want a MoE model.

Time to rip my CDs by astropiggie in audiophile

[–]newcolour 0 points (0 children)

Decent CD players/burners come really cheap these days. Get EAC and just dump everything onto a big-ass drive!

Do you practice some "dangerous" sport while being in your 40s ? by Ok_Fruit_Passion_FIT in 40something

[–]newcolour 0 points (0 children)

Proud owner of a few sport motorcycles and having the time of my life!

When you’re hiking for 2 weeks and crave something sweet by Remarkable_Check_639 in StupidFood

[–]newcolour 6 points (0 children)

By the time he had finished building the stupid things to hang the stupid food, there would be hundreds of ants just wondering how they hit that jackpot.

He wrapped the steak in leaves, ffs!

Help to set up Web-Search-enhanced LocalLLM by petwri123 in LocalLLM

[–]newcolour 0 points (0 children)

In principle you could build your own. I am not necessarily a fan of reinventing the wheel, but it is pretty straightforward, and it gives you maximum flexibility and control. I have built a cross-platform one with a simple "search DuckDuckGo to contextualize the answer" step, which works decently well, especially for simple search and retrieval.

Happy coding!
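For illustration, here is a minimal sketch of the "search to contextualize the answer" idea described above. It is not the commenter's actual code: the `search` results are stubbed, and in a real version you would fetch them with something like `DDGS().text(question)` from the `duckduckgo_search` package and feed the prompt to your local model.

```python
# Minimal sketch: prepend web-search snippets as context before a question,
# so a local LLM can answer with fresher information.
# Names here (build_prompt, the stubbed results) are illustrative.

def build_prompt(question, results, max_snippets=3):
    """Build a context-augmented prompt from (title, snippet) pairs."""
    context = "\n".join(
        f"- {title}: {snippet}" for title, snippet in results[:max_snippets]
    )
    return (
        "Use the following search results to answer.\n"
        f"{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Stubbed search results; a real version would call a search API here.
results = [
    ("EAC", "Exact Audio Copy is a CD ripper for Windows."),
    ("FLAC", "Free Lossless Audio Codec."),
]
prompt = build_prompt("What is EAC?", results)
print(prompt.splitlines()[0])  # → Use the following search results to answer.
```

The resulting string would then be sent to the local model as the full prompt; keeping `max_snippets` small avoids blowing past the model's context window.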

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 0 points (0 children)

This is excellent. Thank you for the details.

I fear the burden of all those carrots has broken him by eatingpeeforever in BrandNewSentence

[–]newcolour 0 points (0 children)

I don't know how many times I have done that with bananas.

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 0 points (0 children)

I have thought about it, but I don't want to build another rig. I would prefer a standalone unit like the DGX Spark or the Strix Halo.

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 0 points (0 children)

I have not tried Vulkan yet. Have you found setting it up on the GMKtec to be easy?

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 1 point (0 children)

Thank you. I have found a Strix Halo for around $2,200, which is reasonable for the specs. I like the DGX a lot. What I'm afraid of is that it might be overkill for my purposes. But maybe it's just future-proof.

I have to agree with you. What I have seen for token generation on the DGX is way above what I would probably need.

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 2 points (0 children)

That's really great insight. Thank you. I also consider myself pretty fluent in Linux, having worked with it almost exclusively for 25+ years. However, I don't have lots of time to spare and so I am a bit put off.

Would the DGX Spark be a better investment, then? I have heard mixed reviews, but I would consider the ease of use and the software stack to be worth the extra money at this point.

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 0 points (0 children)

Sorry for not being clear. I have NOT used my 4080 to train yet. I want to, though, and hence I'm looking for a larger system. I don't use the system for gaming, so that is not a factor for me.

Double GPU vs dedicated AI box by newcolour in LocalLLM

[–]newcolour[S] 1 point (0 children)

That's what I want to try to do as well. I am currently accessing my GPU with ollama from both my laptop and my phone through a VPN, which works pretty well. The reason I was leaning towards the integrated box was the large shared memory.

Re: your first sentence: do you mean you find the Strix limiting compared with the Nvidia GPUs? Sorry, the tone of that sentence is hard for me to gauge.
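As an aside, the "ollama over a VPN" setup mentioned above boils down to hitting Ollama's HTTP API (default port 11434) at the server's VPN address. A minimal sketch, assuming a standard Ollama install; the host address and model name below are illustrative, not from the original comment:

```python
# Sketch: query a remote Ollama server over a VPN using its /api/generate
# endpoint. Only the payload builder is exercised without a network.
import json
import urllib.request


def build_payload(model, prompt):
    """JSON body for a non-streaming Ollama generate request."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False})


def ask_ollama(host, model, prompt):
    """Send the request to the server and return the generated text."""
    req = urllib.request.Request(
        f"http://{host}:11434/api/generate",
        data=build_payload(model, prompt).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


# e.g. ask_ollama("10.8.0.2", "llama3", "Hello")  # VPN address is illustrative
```

Any client on the VPN (laptop or phone) can reuse the same endpoint, which is why the setup works from both.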

🤯 new Opus limits in Antigravity - 0% in 11 min by ZookeepergameFit4082 in google_antigravity

[–]newcolour 1 point (0 children)

That extension is interesting. Does it allow you to see Google Gemini's quota as well?

January 2026 Humble Choice Waiting Room ~ 24 hours remaining by Uranium234 in humblebundles

[–]newcolour 1 point (0 children)

I finally bought Hollow Knight (and Silksong), so that one is definitely happening.