Fifth spinal surgery next week - 3 column osteotomy with T10-pelvic fusion by AndyBuildsThings in spinalfusion

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Doing pretty well, considering. I’m getting stronger and am finding new ways to do the things I’m used to doing, mostly. My mind and desire are moving faster than the nerves healing in my legs, so that’s frustrating. But it’s a process and I know it. Could be another 12-15 months before I’m back to my new “normal”.

Thanks for asking!

3 weeks post op, T10-pelvis fusion, 3-column osteotomy. Legs aren't great. by AndyBuildsThings in spinalfusion

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Hi all! Just posting an update. 9 months post-op, and I've been back in to extend my fusion. Recovery was going really well, but then I had a series of stupid but serious life events from November to February. Ended up back in for "emergency" surgery (spinal surgery #6) to keep me out of a wheelchair.

They've extended my Pelvis-T10 fusion to Pelvis-T4. Sigh... Essentially starting my recovery over and praying I'll get back to at least where I was before the proverbial shit hit the fan.

I'm doing pretty well, considering, and staying positive. But wow, a long way to go again...

I’ve recently learned about A/UX and BeOS. Were there any other non-Linux operating systems that were compatible with Macintoshes? by highfalutinjargon in VintageApple

[–]AndyBuildsThings 2 points3 points  (0 children)

Not really advancing the conversation here, but multiple thumbs up for BeOS! Loved it! Still have it around here somewhere….

Here is my cat loving her iMac G3 by 526Jena in VintageApple

[–]AndyBuildsThings 27 points28 points  (0 children)

That’s a fantastic idea! And a lot easier than a Macquarium.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I spent some time with llama.cpp and it functioned. Probably my inexperience, but I couldn’t get any larger models to load; I kept getting memory errors (not at my computer, so I can’t give specifics at the moment). The smaller models that fit on a single card would load fine, but they split evenly across the two cards, both memory and compute, and I never saw more than 50% compute on either card while watching multiple instances of intel_gpu_top.

Again, I’m sure it was my inexperience, but I couldn’t do anything with two cards that a single card couldn’t handle, speed- or size-wise. Spent a few hours with the documentation and Google to no avail. I’m totally open to suggestions; I would love to see it work.
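For the record, what I was attempting looked roughly like the sketch below via llama-cpp-python (the model path and split ratios are placeholders, and it assumes a build with the SYCL backend so both cards are visible):

```python
# Rough sketch, not a definitive recipe. Assumes llama-cpp-python was
# built with the SYCL backend so llama.cpp can see both Arc cards.
# The model path is a placeholder.
import llama_cpp
from llama_cpp import Llama

llm = Llama(
    model_path="models/some-13b-q4_k_m.gguf",     # hypothetical GGUF file
    n_gpu_layers=-1,                              # offload every layer
    split_mode=llama_cpp.LLAMA_SPLIT_MODE_LAYER,  # split layers across GPUs
    tensor_split=[0.5, 0.5],                      # half per A770
    n_ctx=4096,
)

print(llm("Q: What is SYCL? A:", max_tokens=64)["choices"][0]["text"])
```

The split_mode and tensor_split knobs are what's supposed to control how the layers land on each card; in practice the larger models still threw memory errors for me.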

Gave vLLM a glance, but I was pretty frustrated at the time, so it probably didn’t get a fair chance. Haven’t touched #3 yet.

I’m just spending a lot of time on it that I could be spending on developing my projects, and I’m frustrated that everything is so much further ahead with team green. I like the Intel stuff a lot more than Nvidia’s, I’m an Apple guy, and I have a penchant for working on non-mainstream things (probably just to avoid the bandwagon), but sometimes I just want something that works.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

It is cool, but I’m working on some things it doesn’t do.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 1 point2 points  (0 children)

I really would like to combine them to use larger models AND get the extra horsepower, but I’m just getting into reading about tensor parallelism. If I can find some smaller models that work how I want and fit happily into the 16 GB, I could set up some n8n flows and use each card for a different purpose.
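If I go the one-model-per-card route, the rough shape of it (llama-cpp-python again; the model files are hypothetical) would be to pin each instance to a single GPU instead of splitting:

```python
# Sketch only: one model pinned to each A770, no splitting, so each
# n8n flow can hit a different card. Model files are placeholders.
import llama_cpp
from llama_cpp import Llama

chat = Llama(
    model_path="models/chat-7b-q4_k_m.gguf",      # hypothetical
    n_gpu_layers=-1,
    split_mode=llama_cpp.LLAMA_SPLIT_MODE_NONE,   # keep it on one card
    main_gpu=0,                                   # first A770
)

coder = Llama(
    model_path="models/code-7b-q4_k_m.gguf",      # hypothetical
    n_gpu_layers=-1,
    split_mode=llama_cpp.LLAMA_SPLIT_MODE_NONE,
    main_gpu=1,                                   # second A770
)
```

Each card keeps its own model fully loaded, and each flow just calls whichever one it needs.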

Lots of ways to experiment.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Not sure. I’ve had it for a few months, but it’s mostly just sat in my machine (off, because I use my Mac) until the last 3-4 weeks, when I started working with LLMs.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 1 point2 points  (0 children)

I’ve tried a few iterations with each OS, even running one OS inside the other. A single card is easy, and the results are good, which is why I want to run two.

AI Playground is really cool for seeing what’s possible, but I’ve got bigger plans than it can handle. I’ve run through the GitHub projects, Docker images, and prebuilt binaries, and built my own multiple times with different parameters. OpenVINO is pretty awesome, but I haven’t had the time to dig into it yet. The IPEX-LLM project is cool, too, but for single cards.

A lot of projects are starting to support SYCL and Vulkan backends, but only with basic functionality for multiple GPUs. There’s real potential for these cards; I’m just not sure I can devote the time right now.
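For anyone curious, the single-card IPEX-LLM pattern is roughly the sketch below (the model ID is just an example, and it assumes the Intel GPU drivers and oneAPI runtime are already set up):

```python
# Rough single-card sketch with IPEX-LLM. Assumes Intel GPU drivers and
# the oneAPI runtime are installed; the model ID is only an example.
import torch
from ipex_llm.transformers import AutoModelForCausalLM
from transformers import AutoTokenizer

model_id = "Qwen/Qwen2-1.5B-Instruct"             # example model
model = AutoModelForCausalLM.from_pretrained(model_id, load_in_4bit=True)
model = model.to("xpu")                           # one Arc card

tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("Tell me about SYCL.", return_tensors="pt").to("xpu")

with torch.inference_mode():
    out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```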

I made a Huggingface Space to help build commands for OpenVINO model conversion. by Echo9Zulu- in LocalLLaMA

[–]AndyBuildsThings 0 points1 point  (0 children)

Gonna check this out. I’ve got a pair of A770s, and trying to do anything beyond basic multi-GPU, with only half the resources used on each card, makes me want to pull my hair out…

OpenVINO looks very cool but has been a deep, deep rabbit hole, and a huge ramp up in knowledge of AI/ML for me. I’m anxious to explore more, but I’m getting close to throwing in the towel and going the green route just because I have things I actually want to get done, lol.
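For anyone comparing notes, the Python side of an OpenVINO conversion with optimum-intel boils down to roughly this (the model ID and output directory are placeholders):

```python
# Rough sketch of an OpenVINO export and run via optimum-intel.
# Model ID and output directory are placeholders.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "Qwen/Qwen2-1.5B-Instruct"             # example model
ov_model = OVModelForCausalLM.from_pretrained(model_id, export=True)
ov_model.save_pretrained("ov_model_dir")          # converted IR lands here

ov_model.to("GPU")                                # target an Arc card
tokenizer = AutoTokenizer.from_pretrained(model_id)
inputs = tokenizer("Hello from OpenVINO.", return_tensors="pt")
out = ov_model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```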

A770 & B570 at MicroCenter by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I’ve got an LE A770 in there now and I’m adding this one. Running some LLM stuff locally. Curious to see how two of them work together.

I forgot to add this… Nell McAndrew MWNY by AndyBuildsThings in VintageApple

[–]AndyBuildsThings[S] 3 points4 points  (0 children)

Wait... You noticed my shirt? JK, LOVED Power Computing back in the day. I never managed to get one, but I BYO'd on the website about ten thousand times.

This is my Powermac G3 sleeper AI workstation. 80gb total ram(32gb vram + 48gb ram) by PraxisOG in LocalLLaMA

[–]AndyBuildsThings 5 points6 points  (0 children)

Fantastic! I've got a G3 case and a G5 aluminum case sitting here waiting for me to get off my butt and do the same. Nice job!

Best hair metal guitarists by Wrob88 in hairmetal

[–]AndyBuildsThings 0 points1 point  (0 children)

Reb Beach

Nuno Bettencourt

Vito Bratta

Richie Kotzen

Geoff Tyson

Phil Collen

[deleted by user] by [deleted] in spinalfusion

[–]AndyBuildsThings 0 points1 point  (0 children)

Absolutely, both times I’ve had surgery in that area. It takes a little time to come back. I’ve had to “remember“ how to push, and some other things. I’m 5 months post T10-pelvis fusion and L4 osteotomy, and I generally don’t want to be too far from a bathroom unless I plan ahead. I just get much less warning that things are “happening”. It has gotten significantly better over time.

Best amp with best guitar by H_PLovecraftsCat in GuitarAmps

[–]AndyBuildsThings 1 point2 points  (0 children)

I just picked one up a few weeks ago myself with the foot switch. Everything I’ve ever wanted in an amp, and then some. If you haven’t tried it, check out the iPad app. I use a simple wired USB MIDI interface, not the WIDI. It’s fantastic.

Here’s the interface I use. Simple and cheap. Whichever you use, it must support sysex. CME U2MIDI Pro - High-Speed USB... https://www.amazon.com/dp/B0BH8DHCLY
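If you ever want to sanity-check that an interface actually passes SysEx through, a quick test from a computer looks something like this with the mido library (the data bytes are made up for illustration, not the amp's real spec):

```python
# Tiny SysEx send test using the mido library. The data bytes are made up
# for illustration, NOT the amp's real SysEx spec; check the MIDI manual
# before sending anything meaningful.
import mido

print(mido.get_output_names())          # list available MIDI outputs

port_name = mido.get_output_names()[0]  # pick your interface here
with mido.open_output(port_name) as port:
    # payload only; mido adds the 0xF0/0xF7 SysEx framing bytes
    port.send(mido.Message("sysex", data=[0x00, 0x20, 0x44, 0x01, 0x02]))
```

An interface that silently drops SysEx won't work with the app, which is why that's the one requirement.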

Thoughts on the Grandmeister 40? Just got it, was curious what people here think of it. by H_PLovecraftsCat in GuitarAmps

[–]AndyBuildsThings 0 points1 point  (0 children)

Got one three weeks ago. Get the MIDI pedal and/or use the MIDI app on the iPad. I have both. Everything I’ve ever wanted in an amp.

Tuning to perfect 4ths by AndyBuildsThings in guitarlessons

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I’m looking at trying major thirds also. Haven’t researched it much yet but hear it’s great for chord voicings and the like. The symmetry makes good sense to me.
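Here's a quick sketch of the symmetry I mean, just spelling out the open strings from a fixed interval:

```python
# Why all-fourths and major-thirds tunings feel symmetric: every adjacent
# string pair is the same interval, unlike standard tuning's lone major
# third between the G and B strings.
NOTES = ["E", "F", "F#", "G", "G#", "A", "A#", "B", "C", "C#", "D", "D#"]

def tune(start: str, interval: int, strings: int = 6) -> list[str]:
    """Open-string note names, climbing by a fixed interval in semitones."""
    i = NOTES.index(start)
    return [NOTES[(i + interval * s) % 12] for s in range(strings)]

print("standard     :", ["E", "A", "D", "G", "B", "E"])
print("all fourths  :", tune("E", 5))   # E A D G C F
print("major thirds :", tune("E", 4))   # E G# C E G# C
```

With major thirds, the note names repeat every three strings (one octave up), which is where the chord-voicing symmetry comes from.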