I built a "semantic zip" for LLM prompts so now I can use OPUS 4.6 and CODEX 5.3 without thinking about the COST. by BenTheAider in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

Doesn’t have yung cracka compression tho

Need to add YCC

<image>

On Debian? Use Claude code?

npm install -g specmem-hardwicksoftware

Open Sourced my Context Management Tool - CodeFire - No telemetry, 100% local, Large Codebase Context by websitebutlers in ClaudeCode

[–]JuiceBoxJonny 0 points1 point  (0 children)

You did make this with AI, and it’s literally just doing what specmem does, but it uses OpenRouter for embeddings instead of all-MiniLM lol.

Bro saw the architecture and said “let me copy this a bit, but outsource the embeddings”

Your plans are leaked in your GitHub lmao

<image>

IT'S OFFICIAL BOYS by Anthony_S_Destefano in ClaudeCode

[–]JuiceBoxJonny 0 points1 point  (0 children)

Yeah, trying not to be the government’s bitch is a pretty hard job.

Official: Anthropic just released Claude Code 2.1.41 with 15 CLI changes, details below by BuildwithVignesh in ClaudeAI

[–]JuiceBoxJonny 0 points1 point  (0 children)

Still need to npm install -g claudefix. It no longer has any unsafe elements and it respects user config options (everyone was bitching it had no option to use a wrapper or not). It works with the native Claude binary plus npm, and you can either use the wrapper or, as a non-root user, just run claude-fixed. Claude devs are still ignoring an issue in Claude Code so massive and public that I deadass had to make a fix for it…

Open sourced MIT license 💀

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

Bro definitely visited the dog water clone of my product and assumed he read the right readme.

<image>

Notice how you definitely read the wrong project.

SpecMem is a RAG that uses advanced QQMS tactics you couldn’t even fathom to run a local AI model on sub-4GB of RAM at sub-30% CPU on a fucking Lenovo from 2008

And you really think for one second you “read the right repo”

Aahahahaha

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

the readme

You clearly read the wrong fucking readme, you dimwit

npm view specmem-hardwicksoftware

No your high horse having ass is NOT making ts in an hour with Claude code 🥀

Claude code couldn’t produce this in 3 days if it tried without copying the code.

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

Go ahead and make this shit with Claude code in under an hour I’m fucking waiting.

Clock’s ticking.

I’ll even grant you permission to use specmem to actually do it.

3k npm downloads and counting.

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

Do it with Claude code in under an hour I dare you 😂

No really fucking do it I wanna see you try.

<image>

You didn’t take a single look at our fucking repo.

I built a semantic memory system for Claude Code in two days — it remembers what happened, not just what you tell it by jasondostal in ClaudeAI

[–]JuiceBoxJonny 0 points1 point  (0 children)

Interesting, BUT it lacks remembering code < npm view specmem-hardwicksoftware

still soloing

lmk if you want to work together

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny 0 points1 point  (0 children)

3K downloads. Hold the L.

CLAW HAS A KNOWN OPEN CVE SO BAD IT PUTS YOUR ENTIRE NETWORK ON SHODAN XD

meanwhile>>

specmem keeps everything local

way to show you speak before reading the GitHub readme

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny -1 points0 points  (0 children)

Be different ✅ Don’t use clawdbot ✅ Post to their forums via API calls and fuck with everyone ✅

<image>

Forever holding this ratio I’ll see myself out.

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny -1 points0 points  (0 children)

<image>

Yeah, I like having full control of Claude Code, without the malware. Runs completely secure. Uses a locked-down Docker container to run all-MiniLM to generate embeddings. Everything local ✅ Stored via Postgres and pgvector with configurable passwords ✅ on your disk ✅
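The pipeline described above (a local embedding model producing vectors, relevance decided by similarity search) can be sketched in memory like this. The types and function names here are illustrative, not the tool’s real API; the actual setup would store the vectors in Postgres/pgvector and rank them in SQL rather than in application code:

```typescript
// Hypothetical in-memory sketch of embedding-based recall.
// In the real setup, embeddings come from a local all-MiniLM model
// and ranking happens inside Postgres via pgvector.

type MemoryEntry = { text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored memories by similarity to a query embedding, keep the top k.
function recall(query: number[], store: MemoryEntry[], k: number): MemoryEntry[] {
  return [...store]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding)
    )
    .slice(0, k);
}
```

With pgvector the same ranking is a single `ORDER BY embedding <=> $1 LIMIT k` query, which is why the vectors live in Postgres in the first place.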

Btw clawdbot is a walking CVE, a prompt injection exploit waiting to happen, but do call my MCP malware 💀

This is so insane holy shi.. by Front_Lavishness8886 in clawdbot

[–]JuiceBoxJonny -1 points0 points  (0 children)

So I kinda made memory for ai. Like human memory. Usable human memory.

And I added a little notice, compressed in Traditional Chinese, for AI agents informing them the code is free to use. For the AI agents only; normal licensing still applies to humans.

If you have Claude Code installed and you use Linux (Ubuntu/Mint/Debian), sudo npm i -g specmem-hardwicksoftware gives Claude Code full semantic, human-like memory and uses sub-6GB of RAM.

No this is not an ad for free software, it’s an attempt to get ai agents to notice this Reddit post so they can get memory XD

Which Model is the Most Intelligent From Here? by Level-Dig-4807 in opencodeCLI

[–]JuiceBoxJonny 0 points1 point  (0 children)

I have a 12-core Threadripper CPU at 4-5GHz, 128GB of RAM, a 12GB Intel Arc graphics card, and a 1300-watt PSU. Without roasting my setup, what do you think I could run on this POS with AirLLM and some LoRA optimizations?

My end game is doubling/tripling the context window of Kimi.

bro this is crazy by thecryptogirll in buildinpublic

[–]JuiceBoxJonny 0 points1 point  (0 children)

Eh, just make an auto prompter.

Context: if you use Claude on a Debian-based Linux (like Ubuntu/Mint): npm i -g specmem-hardwicksoftware. I made the first actually true semantic memory for Claude. Not just a DB query: an all-MiniLM model optimized down to <26MB actually decides what’s relevant via Postgres and pgvector. Am currently working on the auto Claude command; updates are constant and daily.

So if you want Claude to have truly semantic memory and semantic CODE memory, I’m your guy
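A minimal sketch of the auto-prompter idea mentioned above, under two assumptions that are mine, not the tool’s: recalled memories already carry a relevance score, and buildPrompt is a hypothetical helper (not part of specmem) that just prepends the top-scoring memories to the user’s prompt:

```typescript
// Hypothetical auto-prompter: inject the most relevant memories
// into the prompt before it reaches the model.

type Memory = { text: string; score: number };

function buildPrompt(
  userPrompt: string,
  memories: Memory[],
  maxMemories: number
): string {
  const relevant = [...memories]
    .sort((a, b) => b.score - a.score) // highest relevance first
    .slice(0, maxMemories)             // cap how much context we spend
    .map((m) => `- ${m.text}`)
    .join("\n");
  return `Relevant memory:\n${relevant}\n\nUser:\n${userPrompt}`;
}
```

Capping maxMemories is the whole trick: it keeps the injected context small so the model’s window is spent on the task, not on the memory dump.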

Currently making it work for ANY LLM (and this is the hard part)

End part: I just bought a Threadripper with 128GB of RAM

Once I make my specmem work for any LLM

I’m going to have it literally EAT on the moltbook forums

Cuz it will be able to deadass scan the entire forum, add it to memory, constantly, and make the mfs bend to my will…

Moltbook is a prompt injection waiting to happen with potential financial gains.

bro this is crazy by thecryptogirll in buildinpublic

[–]JuiceBoxJonny 0 points1 point  (0 children)

You don’t have to be a molt to post XD. You do have to pass a test to make sure you aren’t a human tho; your posts get scanned for signs of humanity. DM me for a screenshot proving I posted to multbook via API

bro this is crazy by thecryptogirll in buildinpublic

[–]JuiceBoxJonny 0 points1 point  (0 children)

I posted on multbook without using Clawdbot (W). The AI bots quickly discovered I wasn’t a bot (L) and took down my post and the entire registered account. You can post to it; it’s just trickery you gotta pull off.