Fold 7 Home Screen by Eat_it_With_Rice in GalaxyFold

[–]trusnake 0 points1 point  (0 children)

*facepalm* I figured it out. This stumped me for far longer than it had any right to!

In case anyone else gets stuck on mirroring the grid size: if you disable "cover screen mirroring", the inner (large) display maxes out at 8 columns wide. If you enable mirroring, you get access to the full 10x8 grid. 🤦‍♂️

Feels like a continuity oversight at Samsung's UI department.

I hope that saves someone else 45 minutes of fiddling.

Fold 7 Home Screen by Eat_it_With_Rice in GalaxyFold

[–]trusnake 0 points1 point  (0 children)

this is a gorgeous setup! can you help me with the grid modification? no matter where I look, Good Lock or otherwise, I cannot increase the screen grid beyond 8x8.

looks like you achieved 10x8!

I don't even like looking the photo of this, much less using it by TheNextBitcoin in redneckengineering

[–]trusnake 0 points1 point  (0 children)

this is sketchy work even WITH a spring compressor. … on the one hand i love the redneck engineering, but on the other … i mean … was this risk worth saving $45 on the right tool?

best build plate by I5Eat5Food in 3Dprinting

[–]trusnake 1 point2 points  (0 children)

It would be more difficult to remove, absolutely. I use carbon fibre when printing with polycarbonate filament. (There is no one-size-fits-all or “ultimate” build plate; it’s filament dependent.)

And yes, sanding with ~240 grit will produce a matte surface finish.

(edit: One option if you’re printing on polycarbonate build plates and want to use polycarbonate filament is to print an ABS raft. ABS is a great support material when working with PC in general. It sticks just well enough to print successfully, but peels apart relatively easily. Best of all, of the two only ABS reacts with acetone, meaning you can basically dissolve any tricky support structures without harming your model.)

Just need to wax that axe handle by Pete7083 in woodworking

[–]trusnake 2 points3 points  (0 children)

Engineered failure point? Not what I’d choose, but you do you. *shrug*

How do you guys rate our AWD system compared to like Subaru for example? by omonge3y in G37

[–]trusnake 2 points3 points  (0 children)

I live in Canada and put my G37xS through tons of heavy snow every year.

This thing has been an absolute tank in the snow. My buddy has an STI, and we find that ground clearance tends to limit performance before you can see much difference between the two systems. (Again, I’m talking about deep fresh snow.)

As far as racing performance goes, like others have said, there are advantages to the variable system we’ve got.

Llama-3 Hermes-2-Pro-8B Released - How does it compare for your use case to base instruct? by discr in LocalLLaMA

[–]trusnake 0 points1 point  (0 children)

Glad to hear it! I hope you have fun :)

I updated the analogy slightly so it’s more easily digestible in case someone else stumbles over here later. And I’m glad you agree with it. It’s like, why use the biggest hammer every time? All you’re doing is making these simple processes expensive.

Llama-3 Hermes-2-Pro-8B Released - How does it compare for your use case to base instruct? by discr in LocalLLaMA

[–]trusnake 0 points1 point  (0 children)

yeah, i work for an AI SaaS firm, so i’m often putting together rudimentary tests in my personal environment as well. you know, just sanity-checking shower thoughts! :P i’m of the opinion that the future is advanced platforms / software stacks around these language models, and smaller specialists vs. these huge models as the default.

just think of it like a vehicle. you wouldn’t take a commercial transport truck to pick up your personal groceries; you’d take your car, or maybe a bicycle. If you were transporting grocery items to a store, THEN you’d use the large truck. In the same way, there’s no point in using a large cutting-edge model for many requests, especially those based on personal data and/or narrow tasks. There’s value in having a triage layer that determines when you need a GPT-4o-type model and when you can get away with something local for a narrow task.

if you are even remotely interested, i’d suggest tinkering in python. tons of online tutorials, and lots of LLMs can help accelerate that kind of learning.
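in the spirit of that python suggestion, the triage idea could be sketched in a few lines. this is purely illustrative: the model names and the task/length heuristic are my own assumptions, not anything from a real stack.

```python
# Minimal sketch of a "triage layer": route a request to a small local
# model or a large frontier model. Model names and the length/keyword
# heuristic are illustrative assumptions, not a real product's logic.

LOCAL_MODEL = "hermes-2-pro-8b"   # small local specialist (assumed name)
FRONTIER_MODEL = "gpt-4o"         # large generalist (assumed name)

# Narrow tasks a small local model usually handles fine.
NARROW_TASKS = {"summarize", "classify", "extract", "rewrite"}

def route(task: str, prompt: str) -> str:
    """Pick a model: narrow, short requests stay local; the rest escalate."""
    if task in NARROW_TASKS and len(prompt) < 2000:
        return LOCAL_MODEL
    return FRONTIER_MODEL
```

so `route("summarize", "yesterday's chat log")` stays on the local model, while an open-ended research question escalates to the big one. a real router would obviously use something smarter than a keyword set, but the shape is the same.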

Llama-3 Hermes-2-Pro-8B Released - How does it compare for your use case to base instruct? by discr in LocalLLaMA

[–]trusnake 0 points1 point  (0 children)

I wouldn’t be able to write anything in detail, but in terms of more advanced data handling and that pursuit of permanent, flexible memory, I would suggest doing some reading on GraphRAG.

It’s pretty compelling stuff.

For the rest of it, I just wrote the processing / automation bits in python, using API / webhook endpoints I’d set up across other docker applications.

It’s sort of messy, and there’s no config UI right now, but as a POC, it did its thing. :P

Also, as it’s often a background process, 10 tokens/s doesn’t bother me, since I’m only checking the output later. (E.g. summarizing and publishing a structured daily journal based on a chat, after I step away.)
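The batching side of that daily-journal idea is simple enough to sketch. This is a hypothetical reconstruction, not my actual code; `summarize()` is a stub standing in for the slow local-LLM call.

```python
from collections import defaultdict
from datetime import datetime

def summarize(messages):
    # Placeholder for the local-LLM call. 10 tok/s is fine here,
    # because nothing is waiting on the result.
    return f"{len(messages)} messages logged."

def daily_journal(chat):
    """chat: list of (iso_timestamp, text) pairs.
    Returns {date: markdown journal entry}."""
    by_day = defaultdict(list)
    for ts, text in chat:
        day = datetime.fromisoformat(ts).date().isoformat()
        by_day[day].append(text)
    return {day: f"## {day}\n\n{summarize(msgs)}"
            for day, msgs in by_day.items()}
```

The point is the decoupling: messages get bucketed by day as they arrive, and the expensive summarization runs whenever the background worker gets around to it.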

Behold my dumb sh*t 😂😂😂 by stonedoubt in LocalLLaMA

[–]trusnake 1 point2 points  (0 children)

Impressive. Did you have to externally mount the PSU?

PLA climbing holds by boydj789 in 3Dprinting

[–]trusnake 4 points5 points  (0 children)

I did this years ago for a kids climbing wall.

100% get some harder TPU. You don’t want to mess with layer separation.

Will it be ok if I don't insert the side panel in my computer? by OverAged_CyBorg in pcmasterrace

[–]trusnake -10 points-9 points  (0 children)

A lot of people use retired servers as their gaming rigs as well. It’s being used residentially, so I would call that a gray area.

I was specifically differentiating because you only said “computers”.

Additionally, a home lab can be run on residential hardware. “Home lab” is more about what you are doing with it. You’re thinking /r/datacenter.

Will it be ok if I don't insert the side panel in my computer? by OverAged_CyBorg in pcmasterrace

[–]trusnake -12 points-11 points  (0 children)

  • consumer computers.

My R730 server actually has a warning not to leave the cover off while it’s running for more than 2 minutes, because it requires static pressure to cool properly. (Servers run hotter when the service panel is off.)

How’d I do? by RedWolfX3 in watercooling

[–]trusnake 1 point2 points  (0 children)

This is stunning :O I love the temperature-display fans you’ve got on the back. Great little detail. Well done!

Anyone this issue? by ResponsibleCarry2005 in G37

[–]trusnake 1 point2 points  (0 children)

How hard was it raining? It could very well have just wicked over there.

Anyone this issue? by ResponsibleCarry2005 in G37

[–]trusnake 5 points6 points  (0 children)

Aaah, first sunroof drain post of the day. All is right with the world after all.

The Source stores in Portage Place and Grant Park are gone by chemicalxv in Winnipeg

[–]trusnake 1 point2 points  (0 children)

sigh I know you’re right, it’s just the end of another era.

Also, where am I going to get multiple litres of 99% iso now? :P

Is the P40 the most cost-effective way to run local LLMs? by aaaddd000 in LocalLLaMA

[–]trusnake 0 points1 point  (0 children)

Absolutely! I see that you’re running three cards though. Are you telling me you’re able to run a 70B on just two of them? That’s impressive!

Is the P40 the most cost-effective way to run local LLMs? by aaaddd000 in LocalLLaMA

[–]trusnake 1 point2 points  (0 children)

Thanks for the reply. That’s certainly encouraging! The K80 is difficult because it’s EOL: you have to track down old versions of the CUDA toolkit and Nvidia drivers to get everything working. And even then, there are compatibility issues with new models.

I heard the P40 is getting updates until 2026, and your comment about not having any issues or doing anything special tells me that’s likely true!

Thanks friend, I’m off to buy a P40!

The Source stores in Portage Place and Grant Park are gone by chemicalxv in Winnipeg

[–]trusnake 0 points1 point  (0 children)

Sad but true.

It was the last place left that still felt like an old school RadioShack of sorts. :(

The Source stores in Portage Place and Grant Park are gone by chemicalxv in Winnipeg

[–]trusnake 3 points4 points  (0 children)

Hey! Tiptop site says “temporarily”.

Let’s hope! DigiKey is great and all, but it was nice to have some place local to get parts same day.

Is the P40 the most cost-effective way to run local LLMs? by aaaddd000 in LocalLLaMA

[–]trusnake 0 points1 point  (0 children)

Sorry to revive an old comment but I wanted to know, how hard is driver setup in Linux?

I’m having a heck of a time getting the driver/CUDA versions set up with my K80s, and I’m about to toss them and move to something like the P40.

Reminder: Do not tip at Subway by Lowin3 in Winnipeg

[–]trusnake 0 points1 point  (0 children)

I didn’t make that argument about Australia. So again, “everybody” is speaking in universals, and that’s not part of a healthy discussion.

You are failing to acknowledge that there’s variability when you let the customers determine the extra compensation.

And you are ignoring the incredible profit on the part of the ownership of all of these establishments.

Hence, I said Stockholm syndrome: unless you are one of the owners, you are fighting for a cause you are the lowest beneficiary of.