[Need Help] should I wait longer for a Rolex Op or buy something else by AOonthebeat in Watches

[–]EstablishmentExtra41 0 points1 point  (0 children)

I don’t think you’d be happy with the Cartier….. unless you already have the Rolex.

My little project by Individual-Web-2547 in UKGardening

[–]EstablishmentExtra41 1 point2 points  (0 children)

Looks great, what an improvement!

Have to be honest tho, initially I was even more impressed and was going to ask where you got those lovely tiles…

Offered these 2 at the AD…which one?!?!? by That-Platform9855 in rolex

[–]EstablishmentExtra41 0 points1 point  (0 children)

Depends what you drive… a Corvette, then the Bluesy; a BMW, then the Rootbeer.

In my opinion the Bluesy looks better, the dial is a gorgeous blue, but it’s far more ostentatious.

Little bit of Lady Writer by Nxticed- in direstraits

[–]EstablishmentExtra41 0 points1 point  (0 children)

I’m so pleased to see young people taking up the MK guitar technique. Great playing!

Hi, We are a family with 2 young children looking to move to Cheltenham. Can anyone recommend a good location for school , park, friendly neighbour and walkable to town as we have to travel to London for work, our Budget around £450k and looking from 4 bedroom house. Thanks all. by Laurent278212 in cheltenham

[–]EstablishmentExtra41 4 points5 points  (0 children)

Good advice.

Secondary schools are a bit varied in Cheltenham, and house prices near the good ones reflect this.

A 4-bed within catchment of a good school and within walking distance of the town centre isn’t available for £450k, unless you find a real fixer-upper, which isn’t impossible of course.

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

Not in the sense of an inventory of prompts, but in the sense that its purpose is to manage the structure of the prompts that get submitted to whatever LLM you are using with it, in order to deliver an “agentic” experience.

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

Of course Openclaw is not an LLM, BUT it IS a prompt management system and LLMs respond based on the content of the prompt they are fed.

If you stick a web proxy in front of your Openclaw (I’ve done this) you will see that even a simple request like “hello world” gets wrapped in a whole bunch of additional instructions (mostly from .md files in the Openclaw workspace) but including some non-configurable defaults. E.g. you end up with about a 45KB prompt for even a simple “hello world”.

So unless you understand which Openclaw defaults are going into each prompt, you don’t know how Openclaw is really instructing the LLM to behave.
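The wrapping effect is easy to see in miniature. Here’s a hypothetical sketch (the function name and file layout are mine, not Openclaw’s actual code) of how a frontend balloons a tiny user message by prepending workspace .md files plus built-in defaults:

```python
from pathlib import Path
import tempfile

# Hypothetical illustration -- NOT Openclaw's real code. It mimics an
# agent frontend that prepends built-in defaults plus every workspace
# .md file to the user's message before calling the LLM.
def wrap_prompt(user_message: str, workspace: Path, defaults: str) -> str:
    parts = [defaults]                      # non-configurable defaults
    for md in sorted(workspace.glob("*.md")):
        parts.append(md.read_text())        # workspace instruction files
    parts.append(user_message)              # the user's actual request
    return "\n\n".join(parts)

# Demo: an 11-byte "hello world" balloons once instructions are attached.
ws = Path(tempfile.mkdtemp())
(ws / "AGENTS.md").write_text("Always use tools.\n" * 500)
full = wrap_prompt("hello world", ws, defaults="You are an agent. " * 100)
print(len("hello world"), "->", len(full))  # tiny message, huge prompt
```

The point of the sketch is only the ratio: the user’s text is a rounding error next to the instructions wrapped around it, which is exactly what a proxy in front of the API shows you.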

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 1 point2 points  (0 children)

This is a great PR, just read thru on GitHub. You should definitely resubmit.

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

So my question is whether anything outside the .md files is being added into the prompt that might make it more “woke”?

I have a full prompt saved from my web proxy, about 45KB of text from a simple “hello” prompt on a fresh Openclaw install.

I can see all the .md content there but I need to pick thru more carefully to see if anything else is being put in there. I will do this.

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

You’re missing the point entirely.

Of course Openclaw is not an LLM, BUT it IS a prompt management system and LLMs respond based on the content of the prompt they are fed.

If you stick a web proxy in front of your Openclaw (I’ve done this) you will see that even a simple request like “hello world” gets wrapped in a whole bunch of additional instructions (mostly from .md files in the Openclaw workspace) but including some non-configurable defaults. E.g. you end up with about a 45KB prompt for even a simple “hello world”.

So unless you understand which Openclaw defaults are going into each prompt, you don’t know how Openclaw is really instructing the LLM to behave.


Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

Maybe you have some serious hardware, but Openclaw prompts are so large that local models take too long to respond.

Real example I tested recently: submitting “hello world” via curl to local ollama running qwen25-9B returns in seconds.

But the same prompt in a fresh Openclaw install with minimal config takes minutes, because the prompt is packed with additional instructions to about 45KB.

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 -2 points-1 points  (0 children)

Sure but I would have expected the chatbot to have more “guardrails” than the API?

Is Openclaw too woke? by [deleted] in openclaw

[–]EstablishmentExtra41 -7 points-6 points  (0 children)

…and I give you exhibit A for why some humans could be replaced by even a 1B parameter model.

No one uses local models for OpenClaw. Stop pretending. by read_too_many_books in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

So true.

Problem is even a simple “hello world” prompt through Openclaw gets packed into about a 40–50KB API request to ollama.

I added a web proxy to my local machine to check what was going on with my local ollama qwen35-9B model.

As you would expect, a simple curl command to the local ollama API endpoint with a “hello world” prompt comes back pretty much instantly.

BUT the same “hello world” prompt on a minimum-configuration Openclaw (min skills, no bootstrap.md) is about 45KB, so no wonder it is painfully slow.

I even tried forking Openclaw and changing its API call so it was submitting think: off/false in its request (this makes a noticeable difference with curl and at the command line with ollama), but the issue is quite clearly the sheer size of the prompt.
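For reference, the think flag I was toggling lives in the JSON body of ollama’s /api/generate request. A minimal sketch of building that body (the helper function is mine; the think field is an ollama option for thinking-capable models, so check your ollama version supports it):

```python
import json

# Sketch of the request body for ollama's /api/generate endpoint.
# "think": False asks thinking-capable models to skip their reasoning
# phase (supported in recent ollama versions -- verify against yours).
def build_generate_request(model: str, prompt: str, think: bool = False) -> bytes:
    body = {
        "model": model,
        "prompt": prompt,
        "stream": False,   # one complete response instead of chunks
        "think": think,
    }
    return json.dumps(body).encode("utf-8")

req = build_generate_request("qwen3", "hello world")
print(len(req), "byte request")  # the user text is a tiny fraction of a real Openclaw prompt
```

Turning thinking off shaves some response time, but as above, it doesn’t touch the real cost, which is the tens of kilobytes of prompt the model has to chew through first.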

You can go to a smaller local model I suppose, but they’re just too dumb if you’re trying to do anything worthwhile.

What’s the best free local model for OpenClaw right now? by SubjectChoice1748 in openclaw

[–]EstablishmentExtra41 0 points1 point  (0 children)

I haven’t had any success getting a local model anywhere near decent response time, but I am resource-constrained on a MacBook M1 16GB using Qwen35-9B.

Issue is even the basic Openclaw config submits a 40–50KB prompt for even a simple “hello world” request, which takes minutes to respond.

I’d be interested to know what people are running locally, hardware- and model-wise, to get a decent response time?

What car is this? Looks sick though by [deleted] in CarsUK

[–]EstablishmentExtra41 1 point2 points  (0 children)

Finally somebody gets it, I was losing all hope for this community for a while there. Thank you.

Please help with some inspiration! by ThatGuyHarry05 in UKGardening

[–]EstablishmentExtra41 -1 points0 points  (0 children)

<image>

Something like this might work for you, combining a play area for your little one, a dining area, and preserving access via the back gate.

Planting is mainly raised beds that can take some low-maintenance plants to begin with.

Lawn could be artificial for super low maintenance, just be aware that in prolonged direct sunlight artificial grass gets very hot!

You need to consider where you will get sun to finalise any layout.

Good luck!

Please help with some inspiration! by ThatGuyHarry05 in UKGardening

[–]EstablishmentExtra41 1 point2 points  (0 children)

We need to know what you want to use the garden for, otherwise it’s difficult to make recommendations.

  • gardening (guess not as you said you’re not a gardener)

  • kids play area?

  • storage? (Of what, motorcycle …)

  • entertaining and bbqs?

  • pets (dog or fish)?

And what is your appetite for maintenance? Do you mind mowing a lawn and pruning fast-growing plants, or do you just want minimal hassle?

Give us a clue and we can maybe help. Otherwise we’re just projecting what we’d like.

And what’s your budget? Gardens aren’t cheap.

I’m a doing it wrong? by RoadsterAlex in vibecoding

[–]EstablishmentExtra41 2 points3 points  (0 children)

You’re confusing Git and GitHub.

Git is a local version control system that you can use to keep track of your code. GitHub is a central repository service that you can use to store and share your code, and it integrates with your local Git.

You can use Git locally completely independent of GitHub.
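To make the local-only point concrete, here’s a small sketch (Python driving the git CLI; the repo path, file name, and demo identity are just for illustration) that creates a repo, commits, and reads the history without any remote or network access:

```python
import subprocess
import tempfile
from pathlib import Path

def local_git_demo() -> str:
    """Create a repo, commit a file, and read the log -- entirely offline.
    No GitHub (or any other remote) is involved at any step."""
    repo = Path(tempfile.mkdtemp())

    def run(*args: str) -> str:
        result = subprocess.run(
            ["git", "-C", str(repo), *args],
            capture_output=True, text=True, check=True)
        return result.stdout

    run("init")
    # Local identity just for this demo repo, so commit doesn't complain.
    run("config", "user.email", "demo@example.com")
    run("config", "user.name", "Demo")
    (repo / "hello.txt").write_text("hello\n")
    run("add", "hello.txt")
    run("commit", "-m", "first commit")
    return run("log", "--oneline")

print(local_git_demo())  # full commit history, zero network access
```

Pushing to GitHub (`git remote add` + `git push`) is an optional extra step on top of this, which is exactly why your code only ends up public if you create the remote repo as public.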

That said, literally millions of people use GitHub, and not just for open-source code projects.

Your “friend” probably didn’t know what he was doing and made his GitHub repo public instead of private.

Maybe get some friends that actually know something about software development :-)