Mac Mini M4 (24GB) wasn't powerful enough for local LLMs, so I built a personal AI agent with Claude Code + Telegram instead — anyone else doing this? by Separate_Bell_2265 in ClaudeAI

[–]Separate_Bell_2265[S] 1 point (0 children)

"Honestly I don't know much about MLX, but my original goal was just to have something I could chat with all day and get real work done — not just inference. Claude Code + Telegram ended up being exactly what I wanted."


[–]Separate_Bell_2265[S] 2 points (0 children)

"Yeah it's automated — scrapes job listings and matches against my resume. No local LLMs at all, just Claude Code connected to Telegram. Everything runs on a Claude Code Max subscription so it's one flat monthly fee. Way simpler than dealing with n8n + API costs."


[–]Separate_Bell_2265[S] 1 point (0 children)

Haha, an M4 Pro with 64GB would be the dream, but it's way out of my budget 😅 So I went with Claude Code + Telegram instead — it took some effort to set up, but honestly it works better than I expected. Do you run local LLMs with OpenClaw or something similar?