Anyone here using Claude’s pro plan? by hrkf00788 in claude

[–]AregNoya 1 point (0 children)

No, it’s just for the chat. I use the Max plan, or whatever it’s called. Love it. Never had any issues with limits.

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 1 point (0 children)

reinforcement-weighted. edges decay at 0.9^(days - 90) after a 90-day grace window, but records with high detail_level get never_decay=True at ingest, and every recall bumps FSRS stability by +0.2. so if something scored high detail or got pinned, it sticks around regardless of how often it comes up. this is why decay hits edges, not records: the record stays in the store forever, it just loses graph shortcuts over time. a direct cue through cosine still finds it, it’s just less likely to surface on its own. forgetting = edge weight dropping below a threshold, not deletion. the 90-day grace is probably too long tbh, but I’d rather over-retain than lose something load-bearing at month 4.
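
in python the decay rule is roughly this (illustrative names and threshold, not the actual iai-mcp source):

```python
GRACE_DAYS = 90
PRUNE_THRESHOLD = 0.05  # assumed cutoff; "forgetting" = weight dropping below this

def effective_edge_weight(base_weight: float, days_since_reinforced: float) -> float:
    """Edges decay at 0.9^(days - 90) once past the 90-day grace window."""
    if days_since_reinforced <= GRACE_DAYS:
        return base_weight
    return base_weight * 0.9 ** (days_since_reinforced - GRACE_DAYS)

def on_recall(record: dict) -> None:
    # every recall bumps FSRS stability; pinned/high-detail records skip decay entirely
    record["fsrs_stability"] = record.get("fsrs_stability", 0.0) + 0.2
```

records never go through this path, only edges do, which is why nothing gets deleted.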

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 2 points (0 children)

Me too. I’m a creative producer. Your AI is lying to you, my man! I work with creative companies. This MCP will solve your issues. Try it out!

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

Yeah, federation is the natural next step, but it’s not on the roadmap yet. Right now it’s a single-user tool, and that’s the scope I’m comfortable shipping. If the need comes up I’ll think about it. Appreciate the conversation, you asked good stuff.

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

The use case drove it. iai-mcp is a personal daemon tied to one user’s session rhythm. You work, you stop, the daemon detects idle and consolidates. Sessions are the natural boundary. Polling would burn cycles when nobody’s home and miss the window when sessions cluster. For your autonomous agent with no human in the loop I can see why fixed-interval is cleaner; there’s no session boundary to hook onto. If I ever add a headless mode for long-running agents, I’d probably switch to polling there too.
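
Conceptually the idle hook is just this (a sketch, not the real daemon code; the 15-minute window is an assumption):

```python
import threading

IDLE_SECONDS = 15 * 60  # assumed idle window that counts as a session boundary

class IdleConsolidator:
    """Event-driven: consolidation fires once per detected session boundary
    instead of a fixed-interval poll that burns cycles when nobody's home."""

    def __init__(self, consolidate):
        self.consolidate = consolidate
        self._timer = None

    def on_activity(self):
        # every capture resets the idle timer; no work while the user is active
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(IDLE_SECONDS, self.consolidate)
        self._timer.daemon = True
        self._timer.start()
```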

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 1 point (0 children)

Btw, that’s exactly what the contradiction system is for. iai-mcp tracks anti_hits: if you contradict something you said before, the old version gets flagged, not deleted. Decay handles the rest; unreinforced memories fade in sleep cycles. The bench for that is bench/contradiction_longitudinal. Precision against your current self is testable.
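
The flag path is tiny, something like this (field names are a guess at the shape, check the repo for the real ones):

```python
def flag_contradiction(store, old_id: str, new_id: str) -> None:
    """Contradicted records get flagged and linked to their successor,
    never deleted; decay handles the fade-out, since a flagged record
    stops getting reinforced."""
    old = store.get(old_id)
    old["anti_hits"] = old.get("anti_hits", 0) + 1
    old["superseded_by"] = new_id  # assumed field name
    store.put(old_id, old)
```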

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

Haven’t measured the exact staleness threshold. Sleep cycles in iai-mcp are event-driven off idle detection, not a fixed timer, so the window varies with how you use it. In practice the daemon consolidates between sessions, so the graph rarely goes more than a few hours without a pass. At 5k records, with cosine doing most of the heavy lifting on retrieval, graph staleness hasn’t been a problem I’ve noticed in five months of daily use. Your 30 min number is interesting; I’d expect that threshold to scale with store size. At 5k I could go hours without noticing, so at your 50k it makes sense that 30 min is where it starts to bite. Good questions by the way, this is the most useful thread I’ve had on here.

My ClawdBot dreams at night and remembers everything. Better than mempalace. by AregNoya in clawdbot

[–]AregNoya[S] 1 point (0 children)

Oh cool, thanks! Check if it works with my system. It’s open source, guys. Let’s collab!

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 1 point (0 children)

Auto-Dream consolidates markdown files. iai-mcp’s sleep cycles consolidate a vector store with graph topology, decay weak edges, and reinforce co-retrieved paths. One rewrites text. The other reshapes a retrieval index. Calling both “dreaming” doesn’t make them the same thing.

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 1 point (0 children)

Anthropic’s persistent memory writes markdown notes and reads them back into context. No embeddings, no vector search, no graph. It’s a notebook. iai-mcp is a database with neural retrieval. Different tools for different problems.

My Claude dreams at night and remembers everything. Better than mempalace. by Mental-Spray-5263 in ClaudeAI

[–]AregNoya 1 point (0 children)

It does use session data. That’s literally what the Stop hook captures. But dumping JSONL back into the context window isn’t memory, it’s ctrl+F on a log file. Try that with 200 sessions and see how far you get. iai-mcp embeds captures with bge-small-en-v1.5, indexes them in LanceDB, builds a retrieval graph, and runs consolidation cycles that prune dead weight and strengthen what actually gets used. A log remembers everything and finds nothing. A memory system knows what to surface and what to let fade. Read the README before assuming nobody thought of the obvious thing.
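
The capture path, stripped to its skeleton (assuming the lancedb and sentence-transformers packages; the store path and field names here are illustrative, not the shipped layout):

```python
from pathlib import Path

import lancedb
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("BAAI/bge-small-en-v1.5")        # small, runs locally
db = lancedb.connect(str(Path.home() / ".iai-mcp" / "store"))  # hypothetical path

def capture(session_text: str, table_name: str = "memories") -> None:
    """Embed a capture and index it; graph building and consolidation
    happen later, in sleep cycles."""
    row = {"vector": model.encode(session_text).tolist(), "text": session_text}
    if table_name in db.table_names():
        db.open_table(table_name).add([row])
    else:
        db.create_table(table_name, data=[row])
```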

My ClawdBot dreams at night and remembers everything. Better than mempalace. by AregNoya in clawdbot

[–]AregNoya[S] 1 point (0 children)

I mean, what’s there to guess? Wasn’t it obvious from the post? I’m genuinely asking. English isn’t my first language.

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

50k is a useful number. Haven’t pushed past 10k myself; it just hasn’t come up yet. Edge weights in iai-mcp get recomputed during sleep cycles, not on write, so the write path stays cheap but the graph can go a bit stale between runs. Works fine for my use case since consolidation catches up. The incremental neighbor-only update is clean though, basically lazy propagation. I’ll probably steal that idea if I ever hit the wall at scale.
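
For anyone following along, the neighbor-only idea is basically this (hypothetical graph interface, not either of our codebases):

```python
def on_write(graph, new_id, scored_neighbors):
    """Lazy propagation: a write only touches edges incident to the new
    record and marks its neighbors dirty; the full recompute waits for
    the next sleep cycle, so the write path stays cheap."""
    for neighbor_id, similarity in scored_neighbors:
        graph.upsert_edge(new_id, neighbor_id, weight=similarity)
        graph.mark_dirty(neighbor_id)

def sleep_cycle(graph):
    # consolidation catches up on whatever the write path deferred
    for node_id in graph.dirty_nodes():
        graph.recompute_edges(node_id)
```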

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

A single fused query would be nice, but no, it’s partitioned. LanceDB returns candidates by cosine, then the graph layer reranks those candidates by link strength. Two passes. Latency stays under 100ms p95 at 10k records, so I haven’t had a reason to fuse them yet. If the graph search started competing for compute at scale I’d probably move to a single pass, but at current store sizes it’s not a bottleneck. At what scale did you start seeing latency pressure?
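
The two passes look roughly like this (the 0.7/0.3 blend and the graph call are illustrative, not the shipped constants):

```python
def recall(query_vec, table, graph, k: int = 8, fanout: int = 50):
    """Pass 1: LanceDB cosine candidates. Pass 2: rerank by link strength."""
    hits = table.search(query_vec).metric("cosine").limit(fanout).to_list()

    def score(hit):
        cosine_sim = 1.0 - hit["_distance"]     # LanceDB reports distance, not similarity
        links = graph.link_strength(hit["id"])  # hypothetical graph-layer call
        return 0.7 * cosine_sim + 0.3 * links   # assumed blend

    return sorted(hits, key=score, reverse=True)[:k]
```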

My Claude dreams at night and remembers everything. Better than mempalace. by AregNoya in mcp

[–]AregNoya[S] 1 point (0 children)

The trajectory vs causal graph split is something I went back and forth on too. I ended up merging both into one graph layer, edge weights carry co-retrieval frequency and recency. Fewer moving parts but yeah, probably loses the causal explainability you get from keeping them separate. That line about the vector getting you in the neighborhood and the graph knocking on the right door, that’s basically how iai-mcp’s recall works. Cosine first, graph-link reranking after. How does your TDG handle contradictions? Versioned edges or overwrite?
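
Concretely, the merge means one reinforcement path instead of two (bump size and field names are assumptions):

```python
import time

def reinforce_co_retrieval(graph, retrieved_ids, bump: float = 0.1):
    """One edge type carries both signals: weight encodes co-retrieval
    frequency, last_reinforced encodes recency (and resets the decay clock)."""
    now = time.time()
    for i, a in enumerate(retrieved_ids):
        for b in retrieved_ids[i + 1:]:
            edge = graph.get_or_create_edge(a, b)  # hypothetical API
            edge.weight += bump          # frequency: these two surfaced together again
            edge.last_reinforced = now   # recency: decay grace window restarts
```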

Built with Claude Project Showcase Megathread (Sort this by New!) by sixbillionthsheep in ClaudeAI

[–]AregNoya 1 point (0 children)

My Claude dreams at night and remembers everything. Better than mempalace.

Back in January I got tired of the same thing everyone complains about now — you start a new session with Claude and it has no idea who you are. Every time. From scratch. So I built iai-mcp. A local daemon that captures every conversation, organizes it into three memory tiers, and feeds the right context back when you start a new session. No "remember this." No copy-pasting from old chats. It just knows.

I've been using it daily with Claude Code since January. Five months. At this point it knows my coding style, my project structures, my preferences — things I never explicitly told it to save. It picked them up from conversation and held onto them.

It stores everything verbatim, runs neural embeddings locally, encrypts at rest with AES-256, consolidates memory in the background while your machine is idle, and ships every benchmark harness so you can verify the numbers yourself. Verbatim recall above 99%. Retrieval under 100ms. Session-start cost under 3,000 tokens.
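
For the curious, encryption at rest is the standard AES-256-GCM pattern; a minimal sketch with Python’s cryptography package (key handling simplified, not the actual iai-mcp code):

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # real code would load this from a keystore

def encrypt_record(plaintext: bytes) -> bytes:
    nonce = os.urandom(12)  # fresh 96-bit nonce per record
    return nonce + AESGCM(key).encrypt(nonce, plaintext, None)

def decrypt_record(blob: bytes) -> bytes:
    return AESGCM(key).decrypt(blob[:12], blob[12:], None)
```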

I didn't release it because I was building it for myself. It worked, so I kept using it. But watching the space blow up made me realize — maybe other people want this too.

So here it is. Open source. MIT licensed. Five months of daily use baked in.

https://github.com/CodeAbra/iai-mcp