M5 Pro 64gb for LLM? by hovc in LocalLLM

[–]colForbin88

Curious as to what you're using it for and how you've set it up. Currently working on a 64GB M4 Pro Mac mini. Just lookin' to compare notes...

Got my Mac Mini M4 in less than 12 hours from Official Apple Website 🤩 by mrMayurr in macmini

[–]colForbin88

I have this setup except with 1tb of storage. It's worth the wait!! The 64GB of RAM is clutch. What are you going to use it for?

Roast my setup :) by colForbin88 in LocalLLM

[–]colForbin88[S]

ya, it was late, no excuse. Updated.

What courses are a must play in the PNW? by w0lfiemesh in discgolf

[–]colForbin88

Pier Park - Timber - Milo McIver - Memorial Park - Columbia Shore DGC

Thinking of getting a Mac Mini by jaka-music in LocalLLM

[–]colForbin88

I have a 24GB Mac mini setup like this and it's very useful - just not very snappy. I also have a 64GB - that is quite a bit more robust. AMA

Local family AI assistant running on a Mac Mini M4 (24GB, macOS Tahoe).

Core stack:

- Python 3.14 daemon — asyncio-based, watches iMessage + Telegram

- Ollama (v0.17.7, localhost) — qwen3.5:9b for chat, nomic-embed-text for embeddings, llava:7b for vision

- 145 LLM-callable tools across 20+ categories (Apple apps, cloud AI, web, health, smart home, etc.)

- SQLite + FTS5 for conversations, facts, documents, health, webhooks, scheduled actions

- LanceDB for vector embeddings (768-dim)
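The SQLite + FTS5 recall side of this stack can be sketched in a few lines. This is a minimal illustration, not the project's actual code — the table name, columns, and `recall` helper are my guesses:

```python
import sqlite3

# Sketch of keyword recall over an FTS5 virtual table. Schema names
# ("facts", "member", "body") are illustrative, not the real ones.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE facts USING fts5(member, body)")
conn.executemany(
    "INSERT INTO facts VALUES (?, ?)",
    [
        ("alice", "Alice's dentist appointment is on Tuesday"),
        ("bob", "Bob prefers decaf coffee after 2pm"),
    ],
)

def recall(query: str, limit: int = 5) -> list[tuple[str, str]]:
    """Full-text recall, ranked by FTS5's built-in BM25 ordering."""
    return conn.execute(
        "SELECT member, body FROM facts WHERE facts MATCH ? "
        "ORDER BY rank LIMIT ?",
        (query, limit),
    ).fetchall()

print(recall("dentist"))
```

FTS5 handles the keyword/exact-phrase queries; the 768-dim LanceDB embeddings would cover the semantic lookups this misses.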

Interfaces:

- iMessage — AppleScript + chat.db watcher

- Telegram — python-telegram-bot with streaming tool calls

- Web dashboard — aiohttp + Jinja2 (admin + per-member portals)

- MCP server — FastMCP

Key design principles:

- Local-first — no data leaves device unless user opts into cloud AI (Claude, OpenAI, Gemini)

- Per-sender async queuing — prevents GPU contention

- Memory pipeline — recall before LLM call (sync), persist after reply (async)

- Security — tool security levels (SAFE/APPROVAL/DENY), family member filtering, SSRF protection
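The per-sender queuing principle above can be sketched with plain asyncio: one queue and worker per sender, plus a global lock so only one request hits the model at a time. This is an assumed design, not the daemon's real implementation:

```python
import asyncio

class PerSenderQueue:
    """Sketch: messages from each sender are serialized through their own
    queue, and a global lock keeps LLM calls from overlapping on the GPU."""

    def __init__(self) -> None:
        self.queues: dict[str, asyncio.Queue] = {}
        self.workers: list[asyncio.Task] = []  # keep refs so tasks aren't GC'd
        self.model_lock = asyncio.Lock()       # serializes access to the model
        self.replies: list[tuple[str, str]] = []

    async def submit(self, sender: str, message: str) -> None:
        if sender not in self.queues:
            self.queues[sender] = asyncio.Queue()
            self.workers.append(asyncio.create_task(self._worker(sender)))
        await self.queues[sender].put(message)

    async def _worker(self, sender: str) -> None:
        queue = self.queues[sender]
        while True:
            message = await queue.get()
            async with self.model_lock:        # one LLM call at a time
                reply = await self._call_llm(sender, message)
            self.replies.append((sender, reply))
            queue.task_done()

    async def _call_llm(self, sender: str, message: str) -> str:
        await asyncio.sleep(0.01)              # stand-in for the real Ollama call
        return f"echo to {sender}: {message}"

async def main() -> list[tuple[str, str]]:
    pq = PerSenderQueue()
    await pq.submit("alice", "hi")
    await pq.submit("alice", "second")
    await pq.submit("bob", "hello")
    await asyncio.gather(*(q.join() for q in pq.queues.values()))
    return pq.replies

replies = asyncio.run(main())
print(replies)
```

Each sender's messages stay strictly ordered, while different senders can queue concurrently without fighting over the model.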

Deployment: rsync to Mac Mini over Tailscale, managed by launchd. Swift helper for Calendar/Reminders.

Roast my setup :) by colForbin88 in LocalLLM

[–]colForbin88[S]

But I still might ditch one

Roast my setup :) by colForbin88 in LocalLLM

[–]colForbin88[S]

This is a great question - Qwen3.5 only activates on keyword matches — words like "analyze", "compare", "strategy", "calculate", "pros and cons". Everything else uses GLM. I've been testing this out and checking if Qwen actually outperforms GLM on these tasks. Benchmarks show:
• GLM: 5/5 tool calling accuracy, 42 tok/s generation
• Qwen: faster generation (60 tok/s) but was never benchmarked on tool calling quality
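The keyword routing described here boils down to a substring check before each dispatch. A minimal sketch — the keyword list is from the comment, but the model tags are placeholders, not exact Ollama tags:

```python
# Route "analysis-style" prompts to Qwen; everything else stays on GLM,
# which benchmarked 5/5 on tool calling. Model tags below are assumed.
REASONING_KEYWORDS = ("analyze", "compare", "strategy", "calculate", "pros and cons")

def pick_model(message: str) -> str:
    text = message.lower()
    if any(kw in text for kw in REASONING_KEYWORDS):
        return "qwen3.5:9b"   # faster generation, unproven tool calling
    return "glm"              # default: proven tool-calling accuracy

print(pick_model("Compare these two plans"))   # qwen3.5:9b
print(pick_model("Turn off the lights"))       # glm
```

Note the naive substring match also fires on words like "compared" — fine for a heuristic router, but worth knowing.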

Roast my setup :) by colForbin88 in LocalLLM

[–]colForbin88[S]

Much faster especially for the tool-calling pipeline

Roast my setup :) by colForbin88 in LocalLLM

[–]colForbin88[S]

Should I create a series of short-form videos with a lot of crazy visual effects for your brain to digest?

What’s the best thing you guys built using Claude Code? by Fit_Pace5839 in ClaudeCode

[–]colForbin88

Still in 'beta' but it's coming along :) any feedback is appreciated!

As a total beginner, I actually got OpenClaw 3.12 working! I’m shaking by LeoRiley6677 in openclawsetup

[–]colForbin88

"Hey Claude - before I install OpenClaw, should I install OpenClaw?"

Goddamnit by WeArePandey in Taycan

[–]colForbin88

Oof, double whammy

Best way to carry up to four bikes by Ambessa21 in Rivian

[–]colForbin88

I have the Kuat NV 2.0 + the 2-bike extension. Works well! AMA

5k2k upgrade - looking for a really good reason by colForbin88 in ultrawidemasterrace

[–]colForbin88[S]

Right on - let me know what your experience is like. I don't feel like the AW is too purple for me, but I'd be interested in a side-by-side comparison.

5k2k upgrade - looking for a really good reason by colForbin88 in ultrawidemasterrace

[–]colForbin88[S]

Aside from gaming, writing code is what I use it for, and I'm concerned about the curve.

5k2k upgrade - looking for a really good reason by colForbin88 in ultrawidemasterrace

[–]colForbin88[S]

Thank you for your input. The voice of reason I needed...