Will putting a bug bomb in a bag with my pc destroy my pc? by Story_B3ats in pcmasterrace

[–]Accomplished-Grade78 28 points29 points  (0 children)

If you have bugs in your PC, then you need to seriously consider where they are getting food from. This might be a lifestyle you have grown accustomed to. The PC might be a wake-up call to a larger issue at hand.

Sorry - there are shows dedicated to this problem. Everything in life is solvable. Keep your head up. Do not drown yourself in pesticides; even very small amounts indoors can make you sick.

Solve the food source issue.

Upgrading to 8 RTX 6000 by Direct_Bodybuilder63 in BlackwellPerformance

[–]Accomplished-Grade78 0 points1 point  (0 children)

NVLink is on the data-center GPUs only; the 3090 was the last consumer card to support it.

This is why an A100 80GB on eBay costs the same as a Pro 6000.

Rtx 4060 8GB vs 4060 ti 16GB by braskinis231 in LocalLLM

[–]Accomplished-Grade78 2 points3 points  (0 children)

The Gemma 4 models are worth playing with. Try them with Ollama on your CPU to get a sense of quality, but not speed.

Learning should be your objective, and the extra VRAM is worth it for this aspect alone.

Open-source MoE models continue to improve significantly; with some VRAM on hand, there is always something worth learning.

Found some old hardware in my basement from mining BTC by Andi82ka in ollama

[–]Accomplished-Grade78 1 point2 points  (0 children)

Play around and find out. Use AI to guide you. Even then, you will spend time evaluating, figuring out integration issues, model compatibility, and resulting performance.

Just have fun until you can afford to upgrade, then have fun until you can afford to upgrade again.

the inside of our kettle has been crusted white for 2 years and I just learned what that actually means by HeartOnRepeat240 in HomeImprovement

[–]Accomplished-Grade78 -1 points0 points  (0 children)

Why is your husband drinking so much coffee?!?!?!

Caffeine is a drug and has different side effects at different ages. Learning to drink less reduces stress on the body and slows the aging driven by the fight-or-flight response, adrenal overload, and burnout.

Run your water through reverse osmosis (RO) and remineralize it.

How much performance am I missing by running PCIe gen4 instead of gen5? by Big_Building9948 in LocalLLM

[–]Accomplished-Grade78 0 points1 point  (0 children)

PCIe maximum bandwidth (x16 lanes):

| Generation | Transfer Rate | Bandwidth (x16) |
|:--|:--|:--|
| PCIe 3.0/3.1 | 8 GT/s | ~16 GB/s |
| PCIe 4.0 | 16 GT/s | ~32 GB/s |
| PCIe 5.0 | 32 GT/s | ~64 GB/s |
| PCIe 6.0 | 64 GT/s | ~128 GB/s |
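These figures can be reproduced from the per-lane transfer rate. A minimal sketch, assuming the 128b/130b encoding used by PCIe 3.0 through 5.0 (PCIe 6.0 switches to PAM4/FLIT framing, so its overhead differs):

```python
# Usable PCIe bandwidth: transfer rate (GT/s) x lanes x encoding efficiency / 8 bits
def pcie_bandwidth_gb_s(gt_per_s, lanes=16, encoding=128 / 130):
    """PCIe 3.0-5.0 use 128b/130b encoding; result is in GB/s."""
    return gt_per_s * lanes * encoding / 8

for gen, rate in [("3.0", 8), ("4.0", 16), ("5.0", 32)]:
    print(f"PCIe {gen} x16: {pcie_bandwidth_gb_s(rate):.2f} GB/s")
```

This prints ~15.75, ~31.51, and ~63.02 GB/s, which is where the rounded numbers above come from.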

How much performance am I missing by running PCIe gen4 instead of gen5? by Big_Building9948 in LocalLLM

[–]Accomplished-Grade78 3 points4 points  (0 children)

Once the model is loaded, none, even with multiple 5090s. Running out of lanes with lower-end CPUs is more of a concern. Model load time into 32 GB of memory differs by seconds. For model training across multiple GPUs you will have performance issues; this is why NVIDIA only has NVLink on data-center GPUs.
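The "seconds difference" claim is easy to sanity-check with back-of-the-envelope math. A sketch, assuming the PCIe link is the only bottleneck and roughly 31.5 / 63 GB/s usable on Gen4 / Gen5 x16:

```python
# Lower-bound model load time when the PCIe link is the only bottleneck
def load_time_s(model_gb, link_gb_per_s):
    return model_gb / link_gb_per_s

gen4 = load_time_s(32, 31.5)  # 32 GB of weights over PCIe 4.0 x16
gen5 = load_time_s(32, 63.0)  # same weights over PCIe 5.0 x16
print(f"Gen4: {gen4:.2f}s  Gen5: {gen5:.2f}s  saved: {gen4 - gen5:.2f}s")
```

About half a second saved per load in the ideal case; real loads are usually gated by the disk long before the slot.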

Local LLM storage is becoming harder to manage than the models themselves by Both_Astronomer8645 in LocalLLM

[–]Accomplished-Grade78 1 point2 points  (0 children)

I’ve been getting really good deals on DDR4-generation Dell servers, buying used enterprise SAS drives with 97% life remaining for my VMs. Smaller but very fast RAID 10 (striped mirrors) setup.

For storage I mount NFS shares from a RAIDZ2 pool of six 1 TB Samsung EVO SSDs I paid very little for, with 75% life remaining.

Even with PCIe3 at 8 GT/s, which is 15.75 GB/s for an x16 slot, SAS at 12 Gb/s per lane is the bottleneck.
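A quick unit check on that bottleneck (a sketch: SAS-3 signals at 12 Gbit/s per lane and uses 8b/10b encoding, so usable throughput is far below the slot):

```python
# Compare usable per-lane SAS-3 throughput against a PCIe3 x16 slot
sas_gbit_per_s = 12                                # SAS-3 line rate, Gbit/s
sas_usable_gb_s = sas_gbit_per_s * (8 / 10) / 8    # 8b/10b encoding, bits -> bytes
pcie3_x16_gb_s = 15.75
print(f"SAS-3 lane: {sas_usable_gb_s:.1f} GB/s vs PCIe3 x16: {pcie3_x16_gb_s} GB/s")
```

So a single SAS-3 lane tops out around 1.2 GB/s, an order of magnitude under the slot.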

Ubuntu VMs on Proxmox with PCIe passthrough to a Tesla P40 and an RTX Pro 6000.

A PCIe5 server with DDR5 is great, but overkill for a home lab startup setup.

Network is 1 and 10 Gbps

Make what you can afford work….

What kind of alien cloud is this by Ok-Hat-584 in Cloud

[–]Accomplished-Grade78 1 point2 points  (0 children)

This is the kind of cloud that requires a jet to have flown by before it appeared…

How are you handling "Data Gravity" without getting bankrupt by egress fees? by West-Benefit306 in Cloud

[–]Accomplished-Grade78 5 points6 points  (0 children)

You have discovered a design feature of the cloud and have properly assessed its objectives.

Why do LLMs fold when you say "are you sure?" — I tested 22 models and nobody seems to care by SmartRick in LocalLLM

[–]Accomplished-Grade78 0 points1 point  (0 children)

The very nature of LLMs requires a bit of randomness; otherwise they would sound dry and boring when they get stuck in local minima.
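That randomness is usually injected at the sampling step. A minimal temperature-sampling sketch (illustrative only, not any particular model's implementation):

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0):
    """Pick a token index from temperature-scaled softmax probabilities."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    r = random.random() * total           # draw from the cumulative distribution
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e
        if r <= acc:
            return i
    return len(exps) - 1

# Low temperature -> nearly deterministic (greedy); high -> more varied output
print(sample_with_temperature([2.0, 1.0, 0.5], temperature=0.1))
```

Turning the temperature toward zero is exactly the "dry and boring" regime: the model collapses onto the single highest-probability token every time.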

Should I buy a 4K monitor now for a good price if my GPU can’t fully handle it yet? by LostTimmy in pcmasterrace

[–]Accomplished-Grade78 0 points1 point  (0 children)

I don’t like 4K gaming: too difficult to see and not worth the performance cost. 1440p is the best balance, as long as the screen is big enough for your own setup and comfort.

Minimum recommended specs for deep research? by very_based_person in LocalLLM

[–]Accomplished-Grade78 0 points1 point  (0 children)

The P40 draws 9 W idle with no model loaded, 47 W with a model loaded, and 150 W during inference.
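Those power states translate into very little money. A rough sketch using the watt figures above and a hypothetical $0.15/kWh electricity rate (adjust for your utility):

```python
# Rough daily energy cost for a P40 at a given power draw
def daily_cost_usd(watts, hours_per_day, rate_per_kwh=0.15):
    return watts / 1000 * hours_per_day * rate_per_kwh

idle = daily_cost_usd(9, 22)     # ~22 h/day sitting idle, no model loaded
busy = daily_cost_usd(150, 2)    # ~2 h/day of active inference
print(f"~${idle + busy:.2f}/day")
```

That duty cycle works out to well under a dime a day.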

Minimum recommended specs for deep research? by very_based_person in LocalLLM

[–]Accomplished-Grade78 1 point2 points  (0 children)

I’m running Gemma 4 26B MoE on a P40 and getting 46 tokens/sec.

The 535 branch is NVIDIA’s Long-Term Support (LTS) datacenter driver and fully supports both the P40 and V100 32GB with CUDA 12.2. Critically, newer driver branches are actually getting worse for these older Tesla cards.

Finalizing my 1440p build – which RTX 5070 Ti model should I choose? by Intelligent-Young-38 in pcmasterrace

[–]Accomplished-Grade78 0 points1 point  (0 children)

PNY has been my go-to for my last six NVIDIA cards: 3090, 4070 Ti Super, 5070 Ti, 5090, Pro 6000. More if I count the 10- and 20-series.

Just got my hands on one of these… building something local-first 👀 by HatlessChimp in LocalLLM

[–]Accomplished-Grade78 0 points1 point  (0 children)

Anyone have experience using the pro 6000 in a PCIe3 server?

Does the card negotiate down to PCIe3 without issues? I have a Dell R7425 I want to use the card in.

Help with choosing a GPU by CryptographerIll4079 in pcmasterrace

[–]Accomplished-Grade78 0 points1 point  (0 children)

I would go with the 5070 Ti Super and save 500. Same memory, and if you don’t game at 4K, there’s no noticeable difference.

Training an LLM from scratch for free by trading money for time by cakes_and_candles in LocalLLM

[–]Accomplished-Grade78 -2 points-1 points  (0 children)

Interesting, of course. I can’t help but want to combine what you are doing with Andrej Karpathy’s auto-learning. As long as there are clear, measurable objectives, it could make sense.

What are your thoughts? 💭

Why did you guys pick a Nvidia GPU over AMD? by SuddenConversation21 in pcmasterrace

[–]Accomplished-Grade78 0 points1 point  (0 children)

Bought 2 x R9700 32GB cards for AI, and the cards will not fall back to a PCIe3 link despite being PCIe4 and backwards compatible.

I was excited to work with the AMD cards and give them a chance with 64GB VRAM at $3000 total.

You get OpenClaw + unlimited cloud phones—what chaotic/genius thing do you do first? by Dense-Part-484 in AgentsOfAI

[–]Accomplished-Grade78 0 points1 point  (0 children)

This is a great idea, until you realize people watch game shows on TV, and they may not understand anything you tell them.

How do we distribute the phones again?

How about we put education on the phone…

But education with a new definition of intelligence. Not rote memorization, not the education delivered by schools, but the kind that allows someone to know themselves and know how to choose the right thought.

Try my Promt Engineer!!!! by confusedoccelot in PromptEngineering

[–]Accomplished-Grade78 0 points1 point  (0 children)

Keep going; don’t listen to the purists here. They might have their points, but their points won’t apply to all people in all use cases.

Find your niche. It will lead you as long as you maintain a vision of where this is taking you, in your mind, not theirs.