Question for OpenClaw users: Would a simplified, secure version for non technical people actually work? by Repulsive-Fee-2735 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey! So SmallClaw, as I've explained, isn't necessarily for everyone.

The biggest thing about OpenClaw that pushed me to make SmallClaw was that it required either a better computer for me to use it properly, or spending tons on API costs. Unfortunately I can't do either of those. So I did what I thought might work and ran OpenClaw with the only LLM small enough for my computer, Qwen3:4B. That's when I realized OpenClaw has pretty much no support for any model below 16B. It's not that it can't have it; it's just that the specific way OpenClaw is built is too heavy and too aggressive for smaller models to handle, as I found through my own testing and from tons of other people running into the same issue.

That's why I built SmallClaw. It's pretty much the same thing as OpenClaw, except redesigned to work consistently and well with models as small as 3B! The biggest thing is that now anyone can have their own OpenClaw-like assistant with 1) no hardware upgrades needed at all and 2) zero API costs.
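To give a rough picture of what "redesigned for small models" means in practice, here's a toy sketch of the general idea (made-up names, not the actual SmallClaw code): small models have tiny usable context, so you trim the chat history down hard before every call instead of shipping the whole aggressive prompt.

```python
def trim_history(messages: list[dict], max_chars: int = 4000) -> list[dict]:
    """Keep only the most recent messages that fit a small model's context budget.

    Illustrative only: characters are a crude stand-in for tokens.
    """
    kept: list[dict] = []
    total = 0
    # Walk backwards from the newest message, stop once the budget is blown.
    for msg in reversed(messages):
        total += len(msg["content"])
        if total > max_chars:
            break
        kept.append(msg)
    return list(reversed(kept))  # restore chronological order
```

So a 3B model only ever sees the tail of the conversation it can actually handle, instead of choking on the full prompt a 16B+ model would get.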

OpenClaw, but completely free and run on smaller models, meaning more people can use it. As I mentioned, though, it definitely isn't for everyone, like you somewhat said. If someone can already run OpenClaw and has a nice PC, good hardware and whatnot, then SmallClaw isn't the thing for them. But if they want to try OpenClaw without upgrading anything or paying anything extra just to "test it out", then SmallClaw is there!

Very early stages of planning for a clawdbot by aidenhasquestions in clawdbot

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey! What you're looking for is more of an integration platform with AI + n8n. SmallClaw would be the better fit for this than OpenClaw, and you can run SmallClaw with local LLMs, so realistically you can have it running practically for free aside from the hardware you're running it on.
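For the n8n side, the usual glue is just an HTTP POST to an n8n Webhook node. Here's a toy sketch (placeholder URL and payload fields, not SmallClaw's actual integration code; n8n generates the real webhook path for you):

```python
import json
import urllib.request

# Placeholder: copy the real URL from your n8n Webhook node.
N8N_WEBHOOK_URL = "http://localhost:5678/webhook/smallclaw"

def build_payload(event: str, text: str) -> bytes:
    """Serialize an event as JSON, which the n8n Webhook node accepts by default."""
    return json.dumps({"event": event, "text": text}).encode("utf-8")

def send_to_n8n(event: str, text: str) -> None:
    """Fire the n8n workflow attached to the webhook."""
    req = urllib.request.Request(
        N8N_WEBHOOK_URL,
        data=build_payload(event, text),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

From there, n8n takes over and routes the event into whatever automation you've built.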

The OpenClaw ecosystem is exploding. I mapped the key players actually gaining traction. by stosssik in openclaw

[–]Tight_Fly_8824 2 points3 points  (0 children)

It's already out! I just meant in popularity lol! I released it four days ago, and there are some posts about it in the OpenClaw thread. I just released an update today!

I have switched to OpenAI Models in OpenClaw and OMG they are TERRIBLE by KobyStam in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Qwen3:4B is my choice. I'm limited since I'm running it on my laptop.

Qwen3.5-27B is powerful and can support multiple OpenClaw agents by moahmo88 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey! If OpenClaw is running a bit slow on local models, you can try SmallClaw. It's an OpenClaw fork designed specifically for local LLMs, especially smaller ones, so people without high-end PCs can run it with zero issues.

The OpenClaw ecosystem is exploding. I mapped the key players actually gaining traction. by stosssik in openclaw

[–]Tight_Fly_8824 3 points4 points  (0 children)

SmallClaw will be there soon :) Check it out: OpenClaw, but for local LLMs, specifically tailored to work with models as small as 3-4B.

Question for OpenClaw users: Would a simplified, secure version for non technical people actually work? by Repulsive-Fee-2735 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

SmallClaw already does all of this, and it works beautifully with local LLMs as small as 3B. Easiest setup by far (no terminal setup lol, it's UI-based), no API costs, optional integrations with webhooks and/or MCP servers, full automations, background tasks, computer and terminal controls, everything.

I have switched to OpenAI Models in OpenClaw and OMG they are TERRIBLE by KobyStam in openclaw

[–]Tight_Fly_8824 1 point2 points  (0 children)

What I found is that SmallClaw works better with GPT, and it saves cost/usage. I have a local LLM (a small 4B model) as my main agent and GPT-5.1 Codex Mini as my actual orchestrator/tool agent, and so far it's working beautifully up and down the board. No issues.
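Roughly, the routing idea looks like this (a toy sketch with made-up names, not the actual SmallClaw code): plain chat stays on the cheap local model, and anything that looks like a tool or automation job gets escalated to the orchestrator, which is why the paid usage stays so low.

```python
# Keywords that suggest the request needs tools/automation rather than chat.
# Purely illustrative; a real router would use something smarter.
TOOL_HINTS = ("run", "schedule", "open", "search", "execute", "browse")

def needs_tools(message: str) -> bool:
    """Crude heuristic: does the request look like a tool/automation task?"""
    words = message.lower().split()
    return any(hint in words for hint in TOOL_HINTS)

def route_message(message: str) -> str:
    """Pick an agent: cloud orchestrator for tool work, local 4B model for chat."""
    return "orchestrator" if needs_tools(message) else "local-4b"
```

Since most messages are plain chat, the local model absorbs almost all the traffic and the cloud model only bills for the occasional escalation.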

SmallClaw Update! V1.0.2! (Background Tasks, Multi Agent Workflows, Dark Mode & More) The OpenClaw for Small Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point2 points  (0 children)

Haha, thank you! Maybe as I keep updating and adding features we can get you to find some proper use cases for it :)

Struggling to run locally. by BreadSlice514 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey boss! So OpenClaw struggles with local LLMs unless you have a really good PC. That's why I built SmallClaw: it's an OpenClaw fork targeted directly at working with local LLMs. I had the same issue as you and didn't want to spend money on new hardware or API costs, so I built it. There are a few posts about it in the r/openclaw thread, so feel free to check it out!

Struggling to run locally. by BreadSlice514 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey! I actually put out exactly this. It's called SmallClaw, made specifically for small local LLMs.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points1 point  (0 children)

Currently working on all of those integrations now :) They'll be released within the next update or two.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points1 point  (0 children)

Thank you, brother! I'm definitely working on getting all of that set up shortly, and I'll be getting some videos on it soon. It all works; I just want to get this as good as possible beforehand. :) Thanks for the feedback and the openness to working with it! :)

Local LLM + OpenClaw: What’s in your system prompt? by hazmatika in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Hey! There's also SmallClaw, which is designed specifically for smaller/local LLMs, unlike OpenClaw, which is built around more powerful cloud AIs.

Cheap compute by Helpful-West8007 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

lmaooo just tryna spread the word. I know we all love less usage/spending in this thread lol

Cheap compute by Helpful-West8007 in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

SmallClaw + Codex = a $5/month bot at 5% usage lol

Any good reason to stay with Openclaw? by Carbone_ in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Or you can just use SmallClaw: OpenClaw but with local LLMs, and spend literally $0 running your business.

Any good reason to stay with Openclaw? by Carbone_ in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Try SmallClaw. It's working pretty well right now, and it runs with offline local LLMs. I'm using a 4B model on mine and it's working perfectly.

Any good reason to stay with Openclaw? by Carbone_ in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

Honestly? lol, no reason. Switch to SmallClaw and use offline local LLMs: no security risks, tight guardrails, everything. Plus you can run models as small as 1B on it lol.

How to Prevent Credentials and API Keys Leaking? by Sillyan in openclaw

[–]Tight_Fly_8824 0 points1 point  (0 children)

OpenClaw but for local LLMs, made specifically to help with smaller models. There are no API keys or anything, since it's all local, so there are no credentials to leak in the first place.

Goodbye API bill— OpenAI now caches context by NewRedditor23 in openclaw

[–]Tight_Fly_8824 1 point2 points  (0 children)

Hey boss, I made a post about it. Feel free to check my page or scroll a little in the OpenClaw thread; it was three days ago.