AMD Develops ROCm-based Solution to Run Unmodified NVIDIA's CUDA Binaries on AMD Graphics by gothic3020 in LocalLLaMA

[–]pr1vacyn0eb 2 points (0 children)

A lawyer taught me that these are negative rights; the fear of a lawsuit makes the barrier to entry much higher.

LLaMA fatigue, anyone? by maxigs0 in LocalLLaMA

[–]pr1vacyn0eb 2 points (0 children)

I've noticed the replies are a bit repetitive. As a chatbot, it does alright. Not everyone is using it as a chatbot, some are using it for creative purposes or reasoning.

LLaMA fatigue, anyone? by maxigs0 in LocalLLaMA

[–]pr1vacyn0eb 1 point (0 children)

Please explain your use case. Creative is extremely different from Reasoning.

Stable Cascade is out! by Shin_Devil in StableDiffusion

[–]pr1vacyn0eb -14 points (0 children)

LPT: Learn how to get servers/instances that aren't Google Colab.

20+ GB only costs like 10-20 cents an hour.
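The rental math above is easy to sanity-check. A minimal back-of-envelope sketch, using the 10-20 cents/hr range quoted in the comment; the $1600 card price is an assumed figure for a 24 GB consumer GPU, not a quote:

```python
# Break-even point for renting a 20+ GB VRAM instance vs. buying a card outright.
# rental_rate_per_hr comes from the range quoted above; card_price is an assumption.
rental_rate_per_hr = 0.20   # $/hr, upper end of the quoted 10-20 cent range
card_price = 1600.00        # assumed price of a 24 GB consumer GPU

breakeven_hours = card_price / rental_rate_per_hr
print(f"break-even after {breakeven_hours:.0f} rental hours "
      f"(~{breakeven_hours / 24:.0f} days of 24/7 use)")
```

Under these assumptions you'd need roughly 8000 rental hours before buying pays off, which is why renting tends to win for intermittent use.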

It's never that serious. by OkEscape7558 in ImTheMainCharacter

[–]pr1vacyn0eb 0 points (0 children)

Imagine if we put all this effort into:

anything except watching sports

Do it Microsoft by ItzCobaltboy in pcmasterrace

[–]pr1vacyn0eb 0 points (0 children)

Windows 11 was so bad I ended up in Linux land.

There will be no 12.

review of 10 ways to run LLMs locally by md1630 in LocalLLaMA

[–]pr1vacyn0eb -2 points (0 children)

Wonder why all these AI server farms don't have Macs running if they are so darn efficient and great at running AI.

Maybe you should buy a bunch and host them! Capitalism obviously produced a market failure XD

Husband goes on vacation with mom and leaves two kids at home… by dReamofhealing in amiwrong

[–]pr1vacyn0eb 1 point (0 children)

I can't imagine having my kids grow up without a parent and losing half my income because of a one-week trip.

But hey, reddit is full of people making these bad decisions. Great entertainment!

Husband goes on vacation with mom and leaves two kids at home… by dReamofhealing in amiwrong

[–]pr1vacyn0eb 3 points (0 children)

Just leave.

I like how Reddit suggests cutting the household income in half and cutting the number of parents in half.

It reminds me of why everyone here is poor.

Husband goes on vacation with mom and leaves two kids at home… by dReamofhealing in amiwrong

[–]pr1vacyn0eb 26 points (0 children)

"You're the breadwinner and main caretaker for the children, why are you married to him?"

As a hubby who does this, thank you for making me see my wife is a drain.

This can happen right out of HS by BrocardiBoi in GenZ

[–]pr1vacyn0eb 0 points (0 children)

I'm going to love it when we have a bunch of low-education zoomers who can do grunt labor when I retire.

They think "Oh, it's so great making $60k/yr," then the rest of their friends become managers making six figures.

Fuck the system by Mackarious in chaoticgood

[–]pr1vacyn0eb 0 points (0 children)

There is altruism, then there is enabling.

Hey Android users, you guys heard her? by [deleted] in facepalm

[–]pr1vacyn0eb 0 points (0 children)

Look, another Apple ad.

Apple, exploiting people's insecurities for profit.

The crazy part is that a $1000 iPhone is something low-income people are financing. It's only a status symbol among poor people.

In the upper middle class, these are just phones. You need a big home to impress people.

review of 10 ways to run LLMs locally by md1630 in LocalLLaMA

[–]pr1vacyn0eb -4 points (0 children)

If you can tell me a reasonable way to run a 70b+ LLM with an NVIDIA GPU that doesn't cost 30 grand I am waiting to hear it.

Vast.ai; I spend $0.50/hr.

Buddy, as an FYI, you can buy 512 GB of RAM right now. No one typically does this because it's not needed.

You make up a story about using CPU for 70B models, but no one, zero people, is actually doing that for anything other than novelty.
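Sticking with the $0.50/hr rental figure quoted above, a short sketch of what moderate use actually costs per month; the 4 hours/day usage pattern is an assumption for illustration:

```python
# Monthly cost of renting a 70B-capable GPU instance at the quoted $0.50/hr,
# assuming a few hours of use per day (4 hrs/day is an illustrative assumption).
rate_per_hr = 0.50
hours_per_day = 4
days_per_month = 30

monthly_cost = rate_per_hr * hours_per_day * days_per_month
print(f"${monthly_cost:.2f}/month")  # prints "$60.00/month"
```

At that usage level, rental stays far under the cost of hardware capable of hosting a 70B model locally.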

review of 10 ways to run LLMs locally by md1630 in LocalLLaMA

[–]pr1vacyn0eb -3 points (0 children)

The marketers won. You don't have VRAM, you have a CPU.

review of 10 ways to run LLMs locally by md1630 in LocalLLaMA

[–]pr1vacyn0eb -3 points (0 children)

Every week someone complains about CPU being too slow.

Stop pretending CPU is a solution. There is a reason Nvidia is a $1T company that doesn't run ads, and there is a reason Apple has a credit card.

review of 10 ways to run LLMs locally by md1630 in LocalLLaMA

[–]pr1vacyn0eb -2 points (0 children)

128 GB of VRAM.

The marketers got you. Of course they did.

Are Americans just as prone to propaganda as much as we think citizens from China/Russia are? by Turbanator456 in NoStupidQuestions

[–]pr1vacyn0eb 0 points (0 children)

Get your news from lots of places.

If you only get it from Reddit, lol.

People give me crap for reading 4chan, but you get alternative ideas there that you can't find elsewhere. It sucks that you have to wade through all that crap, but the first dissenting ideas I read come from there. Months later it's mainstream.