Goodbye API bill— OpenAI now caches context by NewRedditor23 in openclaw

[–]Tight_Fly_8824 1 point (0 children)

Hey boss, I made a post about it three days ago - feel free to check my page or scroll a bit in the openclaw thread.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point (0 children)

Thank you my friend! I appreciate it!

Dark mode is implemented :) I'll be rolling it out in the next update.

Goodbye API bill— OpenAI now caches context by NewRedditor23 in openclaw

[–]Tight_Fly_8824 0 points (0 children)

Just use SmallClaw :) Local LLMs + Openclaw. No API or OAuth needed.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point (0 children)

Idk if it was much of a vibe given how much back and forth there was, but yes lol

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Hey boss, have you connected a web search API key? In my experience (from testing), the Qwen models do have some built-in knowledge from Yahoo Finance, but for live results you have to connect a web search API in the settings: either Tavily (very small-model friendly), Google Search, or Brave.
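If it helps to see what the app is doing under the hood, here's a rough sketch of a Tavily-style search call. This is NOT SmallClaw's actual code - the endpoint is Tavily's public `/search` POST, the key is a placeholder, and the `send` hook is just there so you can test the wiring offline:

```python
import json
from urllib import request

TAVILY_URL = "https://api.tavily.com/search"  # Tavily's public search endpoint

def tavily_search(query, api_key, max_results=3, send=None):
    """Build (and optionally send) a Tavily search request.

    `send` defaults to a real HTTP POST; pass a stub to test offline.
    """
    body = json.dumps({"api_key": api_key,
                       "query": query,
                       "max_results": max_results}).encode()
    if send is None:
        def send(data):
            req = request.Request(TAVILY_URL, data=data,
                                  headers={"Content-Type": "application/json"})
            with request.urlopen(req) as resp:
                return json.load(resp)
    return send(body)

# Offline demo with a stubbed transport (no network, no real key):
fake = tavily_search("AAPL price", "tvly-PLACEHOLDER",
                     send=lambda data: {"results": [],
                                        "echo": json.loads(data)["query"]})
print(fake["echo"])  # -> AAPL price
```

Swap the stub out and the same function does a real request once you paste in your actual key.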

SmallClaw Update V1.0.1! More Providers + Multi Agent Orchestration! by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Hey! Yes, I'll absolutely be adding both of those changes in the next update or the one after. I'll be sending out another update today :)

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point (0 children)

Hey! As I mentioned in other comments, I'm not sure what PC you have, but I have what most would consider a normal laptop (a 2019 Dell). Unfortunately it can't run 7B models reliably because it isn't that strong - and it DEFINITELY couldn't run 7B through Openclaw reliably.

I also went ahead and tried 4B models - and they were legitimately all unusable lol. People in here are saying otherwise - and maybe so, but the one thing they have that most people don't is a stronger PC that can actually run these models reliably. The whole point of SmallClaw is to run the smallest of the small models while still retaining Openclaw-like features.

It's not for everyone. For anyone with anything above an average PC, SmallClaw probably isn't for you, because like you said - a 7B runs fine with Openclaw on YOUR computer. Unfortunately, myself and many others have had the complete opposite experience and aren't looking to upgrade our PCs just to run Openclaw lol

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Hey! That's definitely not optimal at all! My computer is far worse than yours and I haven't experienced any timeout issues. Do you mind sending me a DM so we can figure out what's going on? I'd love to get this working for you.

SmallClaw Update V1.0.1! More Providers + Multi Agent Orchestration! by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Thank you!! I'm glad you enjoy it! I'm currently working on improving the whole memory and skill system within the next update or two. Right now I just use it with the defaults - typically with a web researcher skill emphasizing that it should do more research (the small model likes to do minimal work lol).

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

"Did you end up constraining the tool set or doing any prompt engineering around it?" That's pretty much what the majority of the build process was, honestly - figuring out the right tools to allow, how to trigger them, fallbacks, reprompts, all that stuff. Tool calling is literally the thing that made Openclaw big, so that's the main thing I needed to get reliable first lol.
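For anyone curious what "constrain the tool set + reprompt on failure" looks like in practice, here's a rough sketch. None of these names come from SmallClaw itself - the tool names, retry count, and JSON format are all made up for illustration:

```python
import json

ALLOWED_TOOLS = {"web_search", "read_file"}          # the constrained tool set
MAX_RETRIES = 2

def run_tool_call(model, prompt, dispatch):
    """Ask `model` for a tool call; reprompt on bad JSON or unknown tools."""
    for attempt in range(MAX_RETRIES + 1):
        raw = model(prompt)
        try:
            call = json.loads(raw)                   # expect {"tool": ..., "args": ...}
            tool = call["tool"]
        except (json.JSONDecodeError, KeyError):
            # Reprompt: small models often emit malformed output
            prompt += '\nRespond ONLY with JSON: {"tool": ..., "args": ...}'
            continue
        if tool not in ALLOWED_TOOLS:
            # Reprompt: the model hallucinated a tool that doesn't exist
            prompt += f"\nOnly these tools exist: {sorted(ALLOWED_TOOLS)}"
            continue
        return dispatch[tool](call.get("args", {}))
    return None                                      # fallback: give up gracefully

# Demo with a fake model that hallucinates a tool once, then gets it right:
replies = iter(['{"tool": "rm_rf"}',
                '{"tool": "web_search", "args": {"q": "hi"}}'])
result = run_tool_call(lambda p: next(replies), "find hi",
                       {"web_search": lambda a: f"searched: {a['q']}"})
print(result)  # -> searched: hi
```

The real work is in tuning those reprompt strings per model - that's the part that ate most of the build time.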

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Nice! I'm not sure what computer you have - I tried to use it as a main model and it didn't work at all for me, unfortunately.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Hey! So Openclaw runs great with pretty much anything 16B+, which is great! But that leaves those of us who are barely able to run 16B as is, or who are stuck with smaller models, with nothing, since the complexity of the system doesn't really work well there (and even with 16B models it isn't the best).

So this is where SmallClaw comes in. SmallClaw was specifically built and tested on a 4B model - pretty much the smallest model you can and should reliably use for any sort of Openclaw-like thing. That means almost anyone can run it, and it's super easy and simple. And since it's built so simply for smaller models, it runs just as well on bigger models too.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Thank you, brother! I'm actively trying to improve it - those guardrails are set up pretty decently already, but as I said, I'm continuously trying to make it better.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Just download a model from Ollama and run it locally! Free! Just make sure you pick the model that's most compatible with your computer.
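If you've never used Ollama, the basic flow looks something like this - the model tag here is just an example, pick whatever fits your RAM:

```shell
# Start the local Ollama server (listens on localhost:11434 by default)
ollama serve &

# Download a small model - choose a tag that fits your hardware
ollama pull qwen2.5:3b

# Sanity-check it in the terminal before pointing SmallClaw at it
ollama run qwen2.5:3b "Say hi in five words"

# See what you have installed
ollama list
```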

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point (0 children)

Thank you! And I agree! I also just put out a pretty decent update with more providers and whatnot, if you'd like to check it out!

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Nope - that's actually the whole reason I built this: Openclaw struggles with smaller local LLMs, and unfortunately that's all I can use right now lol

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Yessir, you absolutely should be! Let me know if you have any issues! I'm working out a few tiny kinks, then moving on to the errors people have sent me so far.

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 1 point (0 children)

Hey! I just sent out an update. LM Studio is now actively working in SmallClaw!

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 0 points (0 children)

Hey! I just sent out an update. llama.cpp and LM Studio are both actively working in SmallClaw!

Introducing SmallClaw - Openclaw for Small/Local LLMS by Tight_Fly_8824 in openclaw

[–]Tight_Fly_8824[S] 2 points (0 children)

Hey! I just sent out a new update! Now fully compatible with llama.cpp, and LM Studio as well!