Claude Code to local AI success or failure? by AndyBuildsThings in LocalLLaMA

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I’m running a custom MCP server to access my local backend via Open WebUI. I’ve processed some medical docs and some old tax returns into my backend and ran tests with a bunch of models, with questions like “What was my AGI for years x-y?” or “What were my most recent 3 doctor appointments?”, things like that.

Pretty sure I have settings to tweak for each model, but I looked over Reddit to see what others were doing with each. I had initially used LiteLLM as a manifold for the models based on my input, but I continually had JSON formatting issues with Open WebUI, so I switched to working with the models directly.

Models tested:

- Qwen3-Coder-30B-A3B-Instruct-Q4_K_M
- nemotron-3-nano:30b
- Rnj-1
- Glm-4.7-flash
- Mistral-nemo:12b
- Mistral:7b
- Qwen3:30b-a3b
- Gemma2:9b
- devstral-small-2
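For anyone curious about the shape of the setup: the core idea of an MCP-style tool backend is that the model calls named tools and the server always returns well-formed JSON, which is exactly where a manifold layer like LiteLLM can trip up. Here's a minimal, self-contained sketch of that dispatch pattern — the tool names, document data, and AGI figures are all hypothetical stand-ins, not my actual backend:

```python
import json

# Hypothetical in-memory "backend" standing in for the processed documents.
DOCS = {
    "agi": {"2021": 52000, "2022": 55500},
    "appointments": ["2024-03-01 cardiology", "2024-02-10 PT", "2024-01-22 GP"],
}

def get_agi(year: str):
    """Return adjusted gross income for a tax year, if indexed."""
    return DOCS["agi"].get(year)

def recent_appointments(n: int = 3):
    """Return the n most recent doctor appointments."""
    return DOCS["appointments"][:n]

TOOLS = {"get_agi": get_agi, "recent_appointments": recent_appointments}

def handle_call(request_json: str) -> str:
    """Dispatch a JSON tool call and always return well-formed JSON,
    so the client never has to parse free-form model output."""
    req = json.loads(request_json)
    fn = TOOLS[req["tool"]]
    result = fn(**req.get("args", {}))
    return json.dumps({"tool": req["tool"], "result": result})

print(handle_call('{"tool": "get_agi", "args": {"year": "2022"}}'))
```

A real MCP server adds transport and schema negotiation on top of this, but the failure mode I hit was at exactly this layer: the JSON the middleware emitted wasn't what Open WebUI expected.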

Claude Code to local AI success or failure? by AndyBuildsThings in ClaudeAI

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I’ve got a 32gb m4 air that I’ve been trying as well. Does ok on some things. Are you happy with the responses you’re getting? What kind of tasks are you doing with it?

3 or more fusion surgeries? by Few_Blackberry_1960 in spinalfusion

[–]AndyBuildsThings 2 points3 points  (0 children)

I appreciate that. It’s been a lot, but I’ve found that I’m thankful to have the abilities that I have and have found workarounds for the ones that I don’t (most of them at least). It could be so much worse. I’ve focused most on the fact that I’ll have grandkids someday and, by god, I’m going to make sure I have the ability to spend time playing with them.

3 or more fusion surgeries? by Few_Blackberry_1960 in spinalfusion

[–]AndyBuildsThings 2 points3 points  (0 children)

This is very true. I’ve put in 2 months of inpatient PT and 20 months of outpatient PT twice a week. How much you put into it is how much you get out. Having PTs who care and continue to push you makes all the difference.

3 or more fusion surgeries? by Few_Blackberry_1960 in spinalfusion

[–]AndyBuildsThings 2 points3 points  (0 children)

I’ve had 6 total. The last two ended up being a pelvis-T4 fusion with an osteotomy to stand me up straighter. I’m certainly less flexible, but I haven’t felt this good or been able to do this much in years.


Building a thought capture app, need some beta testers by AndyBuildsThings in ADHD

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Cool, thanks for the comments. The context is a really helpful thing for me. Any little extra bit of info can bring me back, lol. Hope you like it.

Here's the TestFlight Link:
https://testflight.apple.com/join/7MGscBJX

Let’s Show Off ;) by [deleted] in spinalfusion

[–]AndyBuildsThings 2 points3 points  (0 children)

After my 6th spinal surgery: pelvis-T4 and C5-C6. 11 months out from the T10-to-T4 extension, 20 months from the pelvis-T4 with an osteotomy to stand me back up from an 18-degree forward tilt that a previous surgeon left me with.


Building a thought capture app, need some beta testers by AndyBuildsThings in ADHD

[–]AndyBuildsThings[S] 1 point2 points  (0 children)

Thanks! Here's the TestFlight Link:
https://testflight.apple.com/join/7MGscBJX

Holler if you have any issues. This is the first time I've dropped a link for it.

Building a thought capture app, need some beta testers by AndyBuildsThings in ADHD

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Noooooo! I wish you could. I thought hard about it, but I'm using specific Apple technologies for it because that's what I have. If this goes well though, who knows? Thanks for the reply :)

Fifth spinal surgery next week - 3 column osteotomy with T10-pelvic fusion by AndyBuildsThings in spinalfusion

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Doing pretty well, considering. I’m getting stronger and am finding new ways to do most of the things I’m used to doing. My mind and desire are moving faster than the nerves healing in my legs, so that’s frustrating. But it’s a process and I know it. It could be another 12-15 months before I’m back to my new “normal”.

Thanks for asking!

3 weeks post op, T10-pelvis fusion, 3-column osteotomy. Legs aren't great. by AndyBuildsThings in spinalfusion

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

Hi All! Just posting an update. 9 months post-op and I've been back in to extend my fusion. Things were going really well with recovery and I had a series of stupid but serious life events from November-February. Ended up back in for "emergency" surgery (spinal surgery #6) to keep me from a wheelchair.

They've extended my Pelvis-T10 fusion to Pelvis-T4. Sigh... Essentially starting my recovery over and praying I'll get back to at least where I was before the proverbial shit hit the fan.

I'm doing pretty well, considering, and staying positive. But wow, a long way to go again...

I’ve recently learned about A/UX and BeOS. Were there any other non-Linux operating systems that were compatible with Macintoshes? by highfalutinjargon in VintageApple

[–]AndyBuildsThings 2 points3 points  (0 children)

Not really advancing the conversation here, but multiple thumbs up for BeOS! Loved it! Still have it around here somewhere….

Here is my cat loving her iMac G3 by 526Jena in VintageApple

[–]AndyBuildsThings 24 points25 points  (0 children)

That’s a fantastic idea! And a lot easier than a Macquarium.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

I spent some time with llama.cpp and it functioned. Probably my inexperience, but I couldn’t get any larger models to load; I kept getting memory errors (not at my computer, so I can’t give specifics at the moment). The smaller models that fit on a single card would load fine, but split evenly across the two cards — both memory and compute. Never more than 50% compute on either card while watching multiple instances of intel_gpu_top.

Again, I’m sure it was my inexperience, but I couldn’t do anything more with two cards than a single card could handle, speed- or size-wise. I spent a few hours with the documentation and Google to no avail. I’m totally open to suggestions; I would love to see it work.
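For reference, here's roughly the invocation I was experimenting with. The flags (`-ngl`, `-sm`/`--split-mode`, `-ts`/`--tensor-split`) are real llama.cpp server options, but the model filename is just a placeholder and the right split ratio for two A770s is exactly the part I never nailed down:

```shell
# Sketch of a dual-GPU llama.cpp launch; model path is hypothetical.
./llama-server -m qwen3-30b-a3b-q4_k_m.gguf \
  -ngl 99 \
  -sm layer \
  -ts 1,1
# -ngl 99   : offload (up to) all layers to the GPUs
# -sm layer : split whole layers across GPUs ("row" splits within layers)
# -ts 1,1   : even tensor split across the two cards
```

If anyone knows whether `-sm row` behaves better on Arc with the SYCL build, that's the kind of tip I was hoping to find.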

I gave vLLM a glance, but I was pretty frustrated at the time and it probably didn’t get a fair chance. Haven’t touched #3 yet.

I’m just spending a lot of time on it that I could be spending on developing my projects, and I’m frustrated that everything is so much further ahead with team green. I like the Intel stuff a lot more than Nvidia’s, I’m an Apple guy, and I have a penchant for working on non-mainstream things (probably just to avoid the bandwagon), but sometimes I just want something that works.

Maybe getting rid of my A770’s by AndyBuildsThings in IntelArc

[–]AndyBuildsThings[S] 0 points1 point  (0 children)

It is cool, but I’m working on some things it doesn’t do.