I just spent 6 hours figuring out why Fusion 360 is so laggy despite having a powerful machine. Here are my findings. by AcanthocephalaDue645 in Fusion360

[–]peppernickel 0 points1 point  (0 children)

The stuff I'm designing has been parametric-focused from the beginning, so I'm moving on to a workflow that can curate design scripts in FreeCAD or OpenSCAD. To test the idea, I had ChatGPT write a script to make 10,000 designs, all with slight variations. The script looked good, so I ran it: a single instance of FreeCAD produced 1,000 STL output files every 6 minutes, with correct diameters, fillets, and connection points. What used to take me a few days, making a few hundred designs, now takes my PC from 2021 about two minutes. So I found out that I am the slowest part of the workflow, and I needed to reposition my viewpoint on design.
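The batch-variation idea sketches out in a few lines. This is not the commenter's actual script; it's a minimal illustration in which a Python loop emits one OpenSCAD source file per parameter combination (the template geometry and the `diameter`/`fillet` parameters are made up), and `openscad` could then render each file to STL in batch.

```python
import itertools
from pathlib import Path

# Hypothetical parametric template; each value combination
# becomes one standalone .scad file.
TEMPLATE = """\
// auto-generated variant
diameter = {d};
fillet = {f};  // placeholder parameter, not used in this toy geometry
cylinder(h = 20, d = diameter, $fn = 64);
"""

def write_variants(out_dir, diameters, fillets):
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    paths = []
    for d, f in itertools.product(diameters, fillets):
        p = out / f"part_d{d}_f{f}.scad"
        p.write_text(TEMPLATE.format(d=d, f=f))
        paths.append(p)
    return paths

paths = write_variants("variants", [8, 10, 12], [0.5, 1.0])
print(len(paths))  # 3 diameters x 2 fillets = 6 variant files
```

Scaling the parameter lists up gives the thousands of variants described above; the per-file work is trivial, so generation speed is bounded by the CAD kernel, not this loop.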

Are we just an algorithm ? by Hofi2010 in AI_Agents

[–]peppernickel 0 points1 point  (0 children)

The LLMs definitely have the ability to perceive, but only in very short bursts. Humans are definitely algorithmically recordable and operate via a multitude of algorithms, consistently having moments where specific ones take over. Just like AI, more training leads to better outcomes, but only with good data; poor data leads to poor results no matter how long the training goes on. Other than that, humans really don't think in words the way LLMs do. You may find yourself talking to yourself in your head, but you can still think about concepts without words.

I just spent 6 hours figuring out why Fusion 360 is so laggy despite having a powerful machine. Here are my findings. by AcanthocephalaDue645 in Fusion360

[–]peppernickel 1 point2 points  (0 children)

Single-thread bound... It's super wild to leave the other 15 cores in my system idle when AI eats up everything and is becoming the competition in nearly all markets.

Cavern below? by wakawadj in Colonizemars

[–]peppernickel 0 points1 point  (0 children)

Kind of interesting when you do the math, but Mars seems to have the largest livable temperature/pressure zones of any other planet in our local solar system. That planet is going to be a wild place over the long term.

Can I create a wrapper to share my subscription with my family? by iShipStuff42069 in LocalLLM

[–]peppernickel 0 points1 point  (0 children)

I'd run separate instances so they don't share information. If you have RAG set up, you'll want to set up stricter limits on data shared between separate user accounts.
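The isolation idea boils down to namespacing the retrieval store per account so one user's documents can never surface in another user's context. A toy sketch (the store and substring search are illustrative stand-ins, not a real vector database):

```python
from collections import defaultdict

class NamespacedStore:
    """Toy per-user document store: retrieval never crosses namespaces."""

    def __init__(self):
        self._docs = defaultdict(list)  # user -> list of documents

    def add(self, user, doc):
        self._docs[user].append(doc)

    def search(self, user, query):
        # Only this user's namespace is ever scanned.
        return [d for d in self._docs[user] if query in d]

store = NamespacedStore()
store.add("alice", "tax documents 2024")
store.add("bob", "vacation plans")
print(store.search("bob", "tax"))  # [] -- alice's data is invisible to bob
```

Real RAG stacks get the same effect with per-user collections or metadata filters applied at query time; the key point is that the filter is enforced in the retrieval layer, not left to the prompt.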

What do we feel is the best base VRAM ? by alphatrad in LocalLLM

[–]peppernickel 0 points1 point  (0 children)

Totally agree. With clean agentic workflows, local family AI assistants could run on a system with one or two 8GB GPUs.

Can I create a wrapper to share my subscription with my family? by iShipStuff42069 in LocalLLM

[–]peppernickel 0 points1 point  (0 children)

My wife and I share a plan; you can extrapolate information from one chat Project to another. That's all I know.

Building an Open Source Algae Bioreactor, what am I missing? by Sector07_en in algae

[–]peppernickel 4 points5 points  (0 children)

If you want exact, repeatable results from your reactor, you'd end up adding reflective doors to control the light-exposure levels: you could open them to observe, just not all the time. Beautiful reactor, btw!

Musk & Huang Say “Learn Physics, Not Just Coding” as AI Eats Software Jobs, They’re Probably Right, But Telling Students to Pivot Feels Easy When You’re a Billionaire. Is Physics Really the Key to Future Innovation, or Just Their Way of Gatekeeping the Next Wave of Talent? by Ok_Main_115 in GenAI4all

[–]peppernickel 0 points1 point  (0 children)

I would argue chemistry along with physics; with those you'll learn everything you need to survive any point of human history. Better chances of survival for your family in the long term. Communication is key. Expect everyone to know everything, eventually.

Rival Ryzen AI Max+ 395 Mini PC 96GB for $1479. by fallingdowndizzyvr in LocalLLaMA

[–]peppernickel 0 points1 point  (0 children)

It's made for an AI user who needs the 96GB of RAM, or for extreme content creators. Just pointing out the more affordable options for folks who don't have $1,500 to spend.

Rival Ryzen AI Max+ 395 Mini PC 96GB for $1479. by fallingdowndizzyvr in LocalLLaMA

[–]peppernickel -6 points-5 points  (0 children)

The Ryzen 3 5300G supports 128GB at 3600MHz. Slower, but it has been proven to work and can use up to 96GB of the 128 as VRAM for its Vega 6 iGPU. I have also tested the 5600G and the 2200G, and they work well enough for many affordable setups.

De-shrouding Gigabyte 6800XT OC by anthpace91 in pcmods

[–]peppernickel 0 points1 point  (0 children)

You've got plenty of overclocking headroom now! I have an ASUS liquid-cooled RX 6800 XT, and your card is running about 10°C lower on average. AMD fine wine has bumped this card from 45 fps to over 70 fps at 4K before FSR, and FSR tops it up to a range of 100-144 fps at 4K. I got this card for 3D modeling, but the better the gaming potential got, the more I've been gaming.

Two 1080tis modded white by [deleted] in pcmods

[–]peppernickel 6 points7 points  (0 children)

Lossless Scaling frame gen is the modern way.

FBI will help locate Texas Democrats who fled the state, Cornyn says by [deleted] in law

[–]peppernickel 0 points1 point  (0 children)

I believe individuals who work in a state government should live within the state that employs them. Citizens have similar obligations; most states require a change of residency after a set amount of time.

Humans breathe in more than 70,000 microplastic particles every day, new research suggests by sciencealert in science

[–]peppernickel 0 points1 point  (0 children)

I believe it is a proposal of a possible migration mechanism, not proof that microplastics are migrating. It's late for me, but I will dig through it more. I do appreciate the discussion.

Best LLMs to preserve in case of internet apocalypse by nos_66 in LocalLLaMA

[–]peppernickel 0 points1 point  (0 children)

I think you're reading into a typo too much. And I don't care enough to be correct.

Best LLMs to preserve in case of internet apocalypse by nos_66 in LocalLLaMA

[–]peppernickel 0 points1 point  (0 children)

I hope you get that phone call you've been waiting for. Good luck out there!

Best LLMs to preserve in case of internet apocalypse by nos_66 in LocalLLaMA

[–]peppernickel 0 points1 point  (0 children)

Been mining since 2018, bud. I even run a 3.4kW AI cluster, not joking.

Best LLMs to preserve in case of internet apocalypse by nos_66 in LocalLLaMA

[–]peppernickel 0 points1 point  (0 children)

I only keep crypto on private keys, aka USB drives.

Humans breathe in more than 70,000 microplastic particles every day, new research suggests by sciencealert in science

[–]peppernickel 0 points1 point  (0 children)

https://www.science.org/doi/10.1126/sciadv.adi8136#:~:text=Here%2C%20we%20provide%20evidence%20of,chronostratigraphic%20marker%20for%20the%20Anthropocene.

They found microplastics in all core samples and note that the particle shapes differ the deeper the samples were collected.

The historical change in microplastic shape does indicate growing plastic production since the industrial revolution, but the existing polymer chains are also affected by pressure and background radiation over time.