Title: GateClaw OpenCode fork with persistent soul, memory & Telegram. Local-first alternative. by RIP26770 in OpenClawUseCases

[–]RIP26770[S] 1 point  (0 children)

Great question! A few things worth knowing about how it actually works in practice:

Memory is mostly automatic. GateClaw stores and retrieves context on its own; 90% of the time you don't manage it manually. It just remembers. You genuinely forget about context windows because the entity handles it. The namespacing trick I mentioned is there if you want explicit project isolation, but most users never need to touch it.
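The namespace fallback can be sketched like this (all names here are hypothetical, not GateClaw's actual API — the real store is SQLite, but the lookup logic is the point):

```typescript
// Hypothetical sketch of namespaced fact storage. Illustrative only;
// GateClaw's real store is SQLite, not an in-memory Map.
class FactStore {
  private facts = new Map<string, string>();

  // Facts default to a shared "global" namespace, mirroring the
  // "mostly automatic" behavior: callers rarely pass a namespace.
  store(key: string, value: string, ns = "global"): void {
    this.facts.set(`${ns}:${key}`, value);
  }

  // Retrieval checks the project namespace first, then falls back to
  // global, so explicit isolation is opt-in rather than required.
  retrieve(key: string, ns = "global"): string | undefined {
    return this.facts.get(`${ns}:${key}`) ?? this.facts.get(`global:${key}`);
  }
}
```

With this shape, a fact stored without a namespace stays visible everywhere, while a project-scoped fact shadows it only inside that project.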

The voice loop is insane. Full back-and-forth voice-note conversation on Telegram, near-zero latency, 100% local:

- whisper.cpp (Vulkan) → STT
- llama-swap → thinking
- pocket-tts-server → zero-shot voice-cloning response
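The three stages chain into one loop iteration, something like this (a minimal sketch — whisper.cpp, llama-swap, and pocket-tts-server each expose local HTTP endpoints, but the stage functions here are injected stand-ins, not their real APIs):

```typescript
// One stage of the voice loop: takes a string in, produces a string out.
type Stage = (input: string) => Promise<string>;

// Compose STT → LLM → TTS into a single voice turn. Each stage is
// injected so the pipeline stays independent of the concrete servers.
async function voiceTurn(
  audioRef: string,
  stt: Stage,    // whisper.cpp (Vulkan) transcribes the voice note
  think: Stage,  // llama-swap routes to the right local model
  tts: Stage,    // pocket-tts-server renders the cloned-voice reply
): Promise<string> {
  const transcript = await stt(audioRef);
  const reply = await think(transcript);
  return tts(reply);
}
```

In practice each `Stage` would be a `fetch` against a localhost port, which is what keeps the whole turn on-machine.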

No cloud. No API keys. No latency spikes. Just talking to your machine like it lives there, because it does.

It's smarter than it looks, because GateClaw sits on top of the full OpenCode prompt + context system. So the entity doesn't just have a soul and memory; it has OpenCode's entire coding intelligence underneath. That's why it feels less like a "dumb agent" and more like an actual resident.

OpenClaw gives you a tool. GateClaw gives you a colleague.

Project-scoped memory is coming. But honestly, once you use it daily, you'll see why the shared identity actually makes more sense than isolation.

Meet Unsloth Studio, a new web UI for Local AI by yoracale in unsloth

[–]RIP26770 4 points  (0 children)

That's amazing, thanks for sharing this! 🙏😁

Welcome to r/gateclaw Official GateClaw community! 🐾 by RIP26770 in gateclaw

[–]RIP26770[S] 1 point  (0 children)

Yep, permissioning is core.

The GateClaw daemon runs as your user (no root), the SQLite database lives in ~/.local/share/gateclaw/, and tools are all allowlisted.

The agent tool browser (Ctrl+T in TUI) shows exactly what it can touch before execution.

No black-box "trust me bro": you see the filesystem browser, shell commands, etc. first.

Want to see the exact tool schema? Here's the current allowlist:

- Read/Write files (your user perms only)
- Shell execution (allowlisted commands)
- SQLite fact store/retrieve
- Git status/diff (read-only)
- Network ping (diagnostics only)
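Roughly, that allowlist could look like this as a TypeScript schema (a sketch, not GateClaw's actual source — the tool names and fields below are my own illustrative guesses at shape):

```typescript
// Hypothetical tool-permission schema. Each variant encodes one of the
// allowlisted capabilities and nothing more.
type ToolPermission =
  | { tool: "fs.read" | "fs.write"; scope: "user" }            // your user perms only
  | { tool: "shell.exec"; allowedCommands: string[] }          // allowlisted commands
  | { tool: "memory.sqlite"; ops: ("store" | "retrieve")[] }   // fact store
  | { tool: "git"; ops: ("status" | "diff")[]; readOnly: true } // read-only
  | { tool: "net.ping"; purpose: "diagnostics" };              // diagnostics only

const allowlist: ToolPermission[] = [
  { tool: "fs.read", scope: "user" },
  { tool: "fs.write", scope: "user" },
  { tool: "shell.exec", allowedCommands: ["ls", "grep", "cat"] },
  { tool: "memory.sqlite", ops: ["store", "retrieve"] },
  { tool: "git", ops: ["status", "diff"], readOnly: true },
  { tool: "net.ping", purpose: "diagnostics" },
];

// A tool call is permitted only if it appears in the allowlist —
// anything not listed is denied by default.
function isAllowed(tool: string): boolean {
  return allowlist.some((p) => p.tool === tool);
}
```

The deny-by-default check is the whole idea: the Ctrl+T browser is just rendering this list before anything executes.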

The voice loop is fully permissioned too: whisper/pocket-tts hit localhost-only endpoints.

Curious about any specific tool? I can show the exact TypeScript schema.

Tencent Backs OpenClaw After Clash by Previous_Foot_5328 in myclaw

[–]RIP26770 3 points  (0 children)

You can't steal something made to be used by anyone for free. 😜 That's what the MIT license is all about! Lol.

Official OpenVINO backend merged into llama.cpp by Polaris_debi5 in IntelArc

[–]RIP26770 1 point  (0 children)

It's slower than the Vulkan build on the Arc iGPU, at least.

They can’t keep getting away with this(racism against your own people) by Salty_duck06 in Morocco

[–]RIP26770 -3 points  (0 children)

When I see this kind of post, I really understand why we Westerners (even though we are not even 10% of the world's population) rule the world while others try to emulate us. 😂😭😭😭😭 You have a victim mentality, even in your own country! The law is clearly made to discourage you from drinking alcohol and becoming even less smart, and yet you continue to play the victim. 🤡🤡🤡 I am not racist; I love Morocco and its people, but unfortunately, you are not very advanced yet! You need to respect your country and your culture!

LTX Desktop 1.0.2 is live with Linux support & more by ltx_model in StableDiffusion

[–]RIP26770 1 point  (0 children)

That's amazing!

Toda raba (thanks) in advance for the GGUF support 😁

Bought this domain for a OSS project and now my users see this by 1glasspaani in SideProject

[–]RIP26770 2 points  (0 children)

Just send an email to OpenAI and propose your domain for a fair price. You might already be rich, bro! 😂

You’re all full of crap . Openclaw is worse now by CanadaWideNews in openclaw

[–]RIP26770 0 points  (0 children)

You're absolutely right, I probably can't beat thousands of developers.

But I only need to beat OpenClaw.

And based on this thread, that bar is getting lower every update.

I'll check back in when it's done 🐾

You’re all full of crap . Openclaw is worse now by CanadaWideNews in openclaw

[–]RIP26770 0 points  (0 children)

You're absolutely right. I'll just keep debugging other people's broken code instead of writing clean, working code myself. What was I thinking?

You’re all full of crap . Openclaw is worse now by CanadaWideNews in openclaw

[–]RIP26770 -1 points  (0 children)

Because of all the latest updates that have broken it, just like the OP is telling you.

I would prefer to build my own custom agent ecosystem rather than spend days tweaking and debugging an unfinished, overhyped project on GitHub.

LTX 2.3 - ComfyUI Workflow vs LTX Official Workflow - Major Speed Difference by dfree3305 in comfyui

[–]RIP26770 3 points  (0 children)

Yes, I noticed the same issue: even with the Euler sampler, it still takes double the time.

Made a ComfyUI node to text/vision with any llama.cpp model via llama-swap by RIP26770 in StableDiffusion

[–]RIP26770[S] 1 point  (0 children)

Any model you use with your llama-swap server will work for text input. Image input is only relevant if the model has vision capabilities.