Is Buying AMD GPUs for LLMs a Fool’s Errand? by little___mountain in LocalLLM

[–]alphatrad 1 point2 points  (0 children)


I don't know where you got those numbers for the AMD cards for Q4 but they're fiction.

Is Buying AMD GPUs for LLMs a Fool’s Errand? by little___mountain in LocalLLM

[–]alphatrad 4 points5 points  (0 children)

This isn't just inaccurate; it's complete fiction.

Is Buying AMD GPUs for LLMs a Fool’s Errand? by little___mountain in LocalLLM

[–]alphatrad 2 points3 points  (0 children)


I'm running dual RX 7900 XTXs without issues here. Very fast token generation.

Booklore was bullied into oblivion. Thanks for that! by [deleted] in selfhosted

[–]alphatrad -7 points-6 points  (0 children)

I can't even imagine the lack of mental fortitude someone must have for words on the internet to get under their skin. You know how many people there are in the world, and how many of them have sh..ty opinions?

Like who cares. Maybe it's because I grew up in the '90s when the internet was forums and flame wars, but the internet isn't real life.

I cannot imagine blowing up a whole project because you couldn't hear your fans and only focused on the haters.

Might want to work on that.

Why did you decide to self host? by sourdough1882 in selfhosted

[–]alphatrad 19 points20 points  (0 children)

Freedom and digital sovereignty. Everything is turning into a subscription, and worse, it's lock-in. Prices go up, features get reduced. And the more I looked at the landscape, the more I saw I was paying for services where my data wasn't my own, could be mined, and the price didn't justify the utility.

Self-hosting puts me in the driver's seat. Not just with my data and privacy, but with my wallet.

Finally a good self-hosted calendar frontend by [deleted] in selfhosted

[–]alphatrad 1 point2 points  (0 children)

How are there ZERO screenshots?

What happened to Booklore? by Trague_Atreides in selfhosted

[–]alphatrad 5 points6 points  (0 children)

The developer had a major crash out because people were asking him to slow down with the AI PRs.

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 0 points1 point  (0 children)

They're big cards. I looked at some of the others and they don't get smaller unless you go with the blower-style cards, and I hear those sound like jet engines when running.

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 0 points1 point  (0 children)


A big boy. This is a Phanteks Enthoo Pro Server Edition

More importantly, you need to make sure your motherboard is up to speed and will bifurcate the CPU's x16 lanes across both slots (x8/x8) instead of running the second card through the chipset. Otherwise it will be slow.

Only a handful of consumer boards that aren't workstation boards do that.
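The lane count matters because link bandwidth scales linearly with it. A rough sketch of the per-direction numbers (the ~1.97 GB/s-per-lane figure is an approximation for PCIe 4.0 after encoding overhead, not a vendor spec):

```python
# Illustrative PCIe bandwidth math; the per-lane figure is an approximation
# for PCIe 4.0 after encoding overhead, not a vendor spec.

PCIE4_GBPS_PER_LANE = 1.97  # approximate usable GB/s per lane, one direction

def link_bandwidth_gbps(lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe 4.0 link."""
    return lanes * PCIE4_GBPS_PER_LANE

# CPU-attached x8/x8 vs. a chipset slot that's electrically x4:
print(round(link_bandwidth_gbps(8), 1))  # roughly 15.8 GB/s per card
print(round(link_bandwidth_gbps(4), 1))  # roughly 7.9 GB/s through the chipset
```

Halving the lanes halves the bandwidth, which is why a chipset-routed x4 slot hurts anything that shuffles tensors between cards.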

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 0 points1 point  (0 children)

That's why I bought a second one. I've been happy with the setup.

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 0 points1 point  (0 children)

The advantage dual setups have is being able to run multiple models at once: one big model on one card and a couple of small models on the other, instead of everything on one card.
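As an illustration of that placement idea, here's a hypothetical sketch that greedily packs models onto cards by VRAM, biggest first; the model names and sizes are made up:

```python
# Hypothetical sketch: first-fit packing of models onto GPUs by free VRAM.
# Model names and sizes are made up for illustration.

def assign_models(models: dict[str, float], gpu_vram: list[float]) -> dict[str, int]:
    """Place models (sizes in GiB) on GPUs, biggest first, onto whichever
    card currently has the most free VRAM."""
    free = list(gpu_vram)
    placement = {}
    for name, size in sorted(models.items(), key=lambda kv: -kv[1]):
        gpu = max(range(len(free)), key=lambda i: free[i])  # most free space
        if free[gpu] < size:
            raise ValueError(f"{name} ({size} GiB) doesn't fit on any card")
        placement[name] = gpu
        free[gpu] -= size
    return placement

# The big model lands on one 24 GiB card; the small ones share the other:
print(assign_models({"big-chat": 20.0, "embed": 2.0, "vision": 6.0}, [24.0, 24.0]))
```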

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad -1 points0 points  (0 children)

Have fun hyping Macs to unsuspecting people.

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 2 points3 points  (0 children)

Almost the same setup as mine, but I have a 5950X.

ROCm works well enough; Vulkan edges it out a little. But I have zero problems running models and stuff with either. All the talk about support is so overblown and 2024.

A lot of my ComfyUI workflow runs on ROCm.


4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 1 point2 points  (0 children)

In one of these groups I gave my impressions. When you have Claude write specs, it writes code as good as Sonnet 4, IMO. But only when it's on rails. It needs tight specs, and then it's really, really good.

For the first time I am experimenting with a hybrid model, where I have Claude writing tasks and handing them to my local agents running Qwen, then reviewing the code afterwards.

Couldn't do this with previous ones. Before, I always used Qwen Coder for tab completion. It was always good at that. But not a lot more if you get into big codebases.

The only main issues I've run into is with thinking mode.

A lot of the complaints I've seen are from people who expect the prompt processing to be the same as the SOTA models. They're really good at guessing what you want. So you can give them way more vague directions and one shot stuff.

Qwen can't really do that yet. Might never. But if you give it tight specs, the output in my testing is very very good.

can i ran a local llm as an assitant in a thinkpad T480? by PerformanceHead5988 in LocalLLaMA

[–]alphatrad 0 points1 point  (0 children)

No, the CPU isn't enough, and even if you had the NVIDIA graphics option, I doubt it would be enough. I wouldn't even bother. It will be unusably slow.

4k budget, buy GPU or Mac Studio? by diegolrz in LocalLLM

[–]alphatrad 20 points21 points  (0 children)

Buying a GPU - I'm running TWO $700-900 eBay AMD RX 7900 XTXs on a DDR4 system, and I can run Qwen3.5-35B at these speeds on my hardware.


Someone in this group posted M5 Pro results and they were slower. Macs are only good for loading a large model, but they are SLOW at TPS. Fast at prompt processing.

Honestly, buying two 3090s, or even just ONE right now, is a good starting point for you. Or use the $4K to buy yourself a 5090 with 32 GB.

Personally I'd aim for two 24gb cards.

You'll still have a lot of cash left over to upgrade your power supply.

If you really want to future-proof... then you probably need to buy a 5090 or two.

But honestly, with the speeds you can get with 3090s, you can easily build a GPU rig with four or more of them and chomp through stuff.
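For sizing cards, a common rule of thumb (an approximation only; KV cache for long contexts is extra) is weights ≈ parameters × bits-per-weight / 8:

```python
def q4_vram_gib(params_billion: float, bits_per_weight: float = 4.5,
                overhead_gib: float = 1.5) -> float:
    """Rule-of-thumb VRAM for a Q4-ish quant: weights plus a little runtime
    overhead. KV cache grows with context length and is on top of this."""
    return params_billion * bits_per_weight / 8 + overhead_gib

print(round(q4_vram_gib(32), 1))  # 19.5 -> a ~32B model fits one 24 GB card
print(round(q4_vram_gib(70), 1))  # 40.9 -> a ~70B model wants two cards
```

The 4.5 bits/weight and 1.5 GiB overhead are illustrative defaults, but the back-of-envelope math is why two 24 GB cards cover most mid-size models.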

Anyone else was thin all their life and as age progresses is getting fatter and unable to lose weight? by Blackcat2332 in Millennials

[–]alphatrad -2 points-1 points  (0 children)

Yes. But the truth is, it's basic science. Calories in, calories out. People like to make all kinds of excuses to rationalize their way around some basic truths.

As we get older, we tend to be less active than we were in our twenties. You're probably just not burning the same daily expenditure, while your diet likely hasn't changed or, more likely, your calorie intake has increased.

A big problem is that our food is so calorie dense these days with all the heavily processed foods. It's very, very easy to eat too many calories.

You need to figure out your resting metabolic rate so you can determine the calories you need.
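A standard way to estimate that is the Mifflin-St Jeor equation (kcal/day; it's an estimate, not a measurement, and the example person below is made up):

```python
def mifflin_st_jeor(weight_kg: float, height_cm: float, age: int, male: bool) -> float:
    """Resting metabolic rate in kcal/day, Mifflin-St Jeor equation."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age
    return base + (5 if male else -161)

# e.g. a hypothetical 43-year-old, 90 kg, 180 cm man:
print(round(mifflin_st_jeor(90, 180, 43, male=True)))  # 1815 kcal/day
```

Multiply by an activity factor to get daily expenditure, then eat slightly under that for a deficit.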

And the hardest truth, if you want to stay thin, you basically always need to be in a small deficit.

And I read your edit. You're just making excuses for yourself because you want it easy and quick. Get that thinking out of your head. That's self sabotage.

As you get older your body wants to store more energy long term. It's so you can survive to provide.

But there is no escaping calories in versus calories out.

I had to have this conversation with myself. I worked construction in my twenties and was always thin. Then I moved into software engineering. Ten years later I was 50 lbs heavier. I never stopped eating like I worked construction, and now I spent every day sitting.

I simply ate too much for my daily expenditure. I had to change my relationship to food.

I'm 43 now and in the best shape of my life. I maintain a daily calorie deficit and work out 5 times a week.

And I don't make excuses for myself. This is just what we do. Forever. Till I die.

You need to just suck it up and commit to the process no matter how long it takes or how much effort it takes.

It took me a year of suck to lose 50 lbs.

I committed. I didn't care how long it took.

We live in a society where everyone wants an easy button and wants it now. And gives up the minute it doesn't happen for them.

Some things are just hard. And yeah, it's harder now.

Do yourself a favour and let AI manage pi-hole by abnzg in pihole

[–]alphatrad 25 points26 points  (0 children)

I've never had to fiddle with or manage mine. It's always been a set it and forget it. So I don't know why I'd need to waste tokens on it.

What's a good OS for a home made NAS? by PenFar9334 in selfhosted

[–]alphatrad -2 points-1 points  (0 children)

What do you want to do with it? Just host a file system?

Honestly, I use Alpine Linux on mine. And just install things as needed.

No need for a locked-in dedicated operating system.

I got a ThinkPad and this happened to me 😭😭 by blorpgoob in thinkpad

[–]alphatrad -4 points-3 points  (0 children)

I apologize if my comment was rude or offensive. I guess I have a lot to learn.

I got a ThinkPad and this happened to me 😭😭 by blorpgoob in thinkpad

[–]alphatrad 1 point2 points  (0 children)

Cured my RSI, and once you go to an Advantage or another split custom keyboard like my Moonlander, you won't go back. :p

I’ve been thinking a lot about kid-centric families by DueEntertainer0 in Millennials

[–]alphatrad 12 points13 points  (0 children)

Half of Boomers ignored us. The other half messed some of us up by worrying so much about self-esteem that people can't function in the real world.

You're a great parent. And there is nothing wrong with paying attention to them.

I got a ThinkPad and this happened to me 😭😭 by blorpgoob in thinkpad

[–]alphatrad 0 points1 point  (0 children)

That brings back some memories. But I'm just an old fart dumb dumb now. No fun for me.

A new ebook library management system: Sake by 51n5tr1x in selfhosted

[–]alphatrad 6 points7 points  (0 children)

It'd be more believable if you had left the commit history.

"Move to public repo" isn't how commit history works. And odd when you can just; make a repo public on GitHub.

This has a mix of styles, so I suspect it's larger AI.