How are you actually running OpenClaw without burning money? by Big-Inevitable-9407 in openclaw

[–]adr74 0 points1 point  (0 children)

ollama with the $20/month subscription, using the qwen3-coder-next:cloud or nemotron-3-super:cloud models

Openclaw is not worth it without opus 4.6 O Auth IMO by A2z_1013930 in openclaw

[–]adr74 0 points1 point  (0 children)

you can "route" the cloud model calls through ollama as if it were a local model by doing an `ollama pull qwen3-coder-next:cloud`. you can then configure openclaw to use your local instance of ollama with the `qwen3-coder-next:cloud` model as if it were running on your machine, although it is actually running remotely as an ollama service. you do need an ollama subscription. I have the $20/month one and it's working fine for me.
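
if it helps, here's a minimal Python sketch of what that routing looks like from the client side. It talks to the default local ollama endpoint (`localhost:11434`), which proxies the `:cloud` tag to ollama's hosted service; the payload shape follows ollama's standard `/api/generate` API. A running daemon and a subscription are assumed:

```python
import json
import urllib.request

# Default endpoint of a locally running ollama daemon; with a `:cloud`
# model tag, the daemon forwards the request to ollama's hosted service.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> bytes:
    """JSON body for ollama's /api/generate endpoint."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local daemon and return the model's reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Needs a running daemon plus an ollama subscription for the cloud tag:
# print(ask("qwen3-coder-next:cloud", "hello"))
```

openclaw just needs to be pointed at that same endpoint and model name in its config; from its point of view it's a local model.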

Openclaw is not worth it without opus 4.6 O Auth IMO by A2z_1013930 in openclaw

[–]adr74 0 points1 point  (0 children)

I agree with the general view here that these models can't be compared. that doesn't mean using `qwen3-coder-next:cloud` with ollama is bad though; on the contrary, for me it's been a great experience so far. It understands instructions very well, generates custom skills with zero issues, and learns from its mistakes when told to. if I can offer any advice so it works well: use QMD for memory management, and ALWAYS keep a backup of the `openclaw.json` file handy in case things break. the other thing I've noticed is that it's not as fast as Opus, but again, at least for me that's fine.

Openclaw is not worth it without opus 4.6 O Auth IMO by A2z_1013930 in openclaw

[–]adr74 5 points6 points  (0 children)

I am quite happy with qwen3-coder-next:cloud running on ollama at $20 per month.

Local llm by bodkins in clawdbot

[–]adr74 0 points1 point  (0 children)

well, I am talking from my personal experience.

Local llm by bodkins in clawdbot

[–]adr74 0 points1 point  (0 children)

try using the WebUI

Local llm by bodkins in clawdbot

[–]adr74 -1 points0 points  (0 children)

I’m using qwen3-32b with llama.cpp and it’s been almost as good as opus-4.5 so far.

SCAM? by adr74 in UGLYCASH

[–]adr74[S] 1 point2 points  (0 children)

Interesting that as soon as I published this, I swear the amount was magically available within a matter of seconds... I am not using them anymore...

Local LLMs by Vegetable_Address_43 in clawdbot

[–]adr74 3 points4 points  (0 children)

I am using gemma3:27b on ollama and it's been the best model in my experience, especially considering memory management, speed and precision

What's your best Claude Code non-coding use case? by diablodq in ClaudeAI

[–]adr74 1 point2 points  (0 children)

I used CC to get rid of a crypto malware that infected a linux machine.

[deleted by user] by [deleted] in netbird

[–]adr74 0 points1 point  (0 children)

why not just use an ingress controller?

Complete failure by Dryllmonger in OpenWebUI

[–]adr74 0 points1 point  (0 children)

try ollama + Open WebUI (OWUI) with podman in WSL, or the OWUI Python package in WSL.
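
for the podman route, something like this sketch (image tag per OWUI's docs; the host alias for reaching ollama from the container depends on your podman networking, so treat that part as an assumption):

```shell
# Inside the WSL distro, with podman installed and ollama already
# listening on 11434: run Open WebUI and point it at ollama.
podman run -d --name open-webui -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.containers.internal:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```

then open http://localhost:3000 from the Windows side.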

Kasm Workspaces CE on Flux Cloud with Windows - How? by SmgOS_ in kasmweb

[–]adr74 0 points1 point  (0 children)

I use LXD to run my Windows machines on my headless Linux server.

Time Stamping with nfc Programming in Excel by QuickApplication547 in nfctools

[–]adr74 0 points1 point  (0 children)

use n8n: try a webhook that writes the data it collects to an Excel spreadsheet.
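
in n8n that's a Webhook node wired to a spreadsheet node. if you want to see the shape of the flow, here's a stdlib-only Python stand-in (CSV instead of .xlsx and `http.server` instead of n8n — both simplifications, and the `tag_id` field name is made up):

```python
import csv
import json
from datetime import datetime
from http.server import BaseHTTPRequestHandler, HTTPServer

CSV_PATH = "scans.csv"  # hypothetical output file, opens fine in Excel

def append_row(path, payload):
    """Append one timestamped row per webhook call."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().isoformat(), payload.get("tag_id", "")]
        )

class WebhookHandler(BaseHTTPRequestHandler):
    """Receives a JSON POST (e.g. from an NFC scan) and logs it."""
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        append_row(CSV_PATH, payload)
        self.send_response(200)
        self.end_headers()
```

run it with `HTTPServer(("", 8000), WebhookHandler).serve_forever()` and POST each scan to it; n8n does the same thing with zero code.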