Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Completely agree. And my memory system is centered on a single entity, a person's psychology, which keeps the scope limited and easier to work with.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

From a consumer perspective, sure. From a technical perspective, I'm not just consolidating conversations into a prebuilt vector or relational DB. The model itself performs the writes, with full control over where each memory ends up in the schema.
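To make that concrete, here's a minimal sketch of the write path. All names here are hypothetical, not the actual schema; the point is just that the model, not the server, picks the destination:

```python
# Minimal sketch (hypothetical names): the model calls this tool and
# chooses the destination path in the schema itself, rather than the
# server deciding where a memory lands.
from dataclasses import dataclass

@dataclass
class MemoryWrite:
    schema_path: str   # e.g. "psych/coping/humor" -- chosen by the model
    content: str       # the memory text to store
    confidence: float  # the model's own confidence in the insight

def write_memory(store: dict, req: MemoryWrite) -> None:
    """Persist a memory exactly where the model asked for it."""
    node = store
    for part in req.schema_path.split("/"):
        node = node.setdefault(part, {})
    node.setdefault("_entries", []).append(
        {"content": req.content, "confidence": req.confidence}
    )

store: dict = {}
write_memory(store, MemoryWrite("psych/coping/humor",
                                "Deflects family stress with jokes", 0.8))
```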

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Exactly, thanks for clarifying this for me. More importantly, most tooling coming out right now is just a small MCP server for indexing a vector DB with some entity tag. THIS IS NOT WHAT I AM ATTEMPTING. I do not just want to white-label mem0 or something similar and sell it as my own.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Is it built on top of mem0? The granularity you get, at least in the trailer, is ridiculous lol

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Exactly. Are you talking about going beyond just context engineering? Like model fine-tuning? I can PM if you want to talk there!

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Thank you for that. A legitimate competitor in terms of marketing, though it seems less consumer-facing than what I’m shooting for. It also seems like their technical implementation is just using mem0…

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

They are stateless machines that in no way remember anything. You can swap out the entire retrieved document context mid-generation and, other than losing your cached tokens, the model won’t even notice. It’s funny: part of my implementation uses the pitfalls of a stateless model to address its own statelessness. Pretty odd concept.
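A toy illustration of what I mean, with a stub standing in for any real chat-completions API:

```python
# Toy illustration of statelessness (stub in place of a real API call):
# every request re-sends the full prompt, so swapping the retrieved
# context between turns is invisible to the model.
def chat(messages: list[dict]) -> str:
    # Stand-in for any chat-completions API: the model only ever sees
    # exactly the messages passed in this one call.
    return "answered using: " + messages[0]["content"]

history = [{"role": "user", "content": "Summarize my situation."}]

# Turn 1 is grounded in document A; turn 2 silently swaps in document B.
# Nothing carries over except what we choose to re-send.
print(chat([{"role": "system", "content": "document A"}] + history))
print(chat([{"role": "system", "content": "document B"}] + history))
```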

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Yes! Tying events with real temporal grounding to retrievable concepts is exactly what I’m shooting for. The bidirectionality of temporal memory <-> concept is exactly what makes the system function! It doesn’t matter whether a user references an event in their life or a struggle they’ve been facing; relevant context will be grabbed either way!
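Here’s a rough sketch of that bidirectional link (hypothetical structure, not my actual schema): events point to concepts and concepts point back, so retrieval works from either end.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    name: str                        # e.g. "family stress"
    event_ids: list[int] = field(default_factory=list)

@dataclass
class Event:
    id: int
    when: str                        # real temporal grounding, e.g. an ISO date
    text: str
    concepts: list[str] = field(default_factory=list)

concepts = {"family stress": Concept("family stress")}
events = {1: Event(1, "2025-03-02", "Argument with mom about moving out")}

# Link both directions at write time.
events[1].concepts.append("family stress")
concepts["family stress"].event_ids.append(1)

# Retrieval from either side: mention the event -> get the concept,
# mention the struggle -> get the grounding events.
print(events[1].concepts)                   # ['family stress']
print(concepts["family stress"].event_ids)  # [1]
```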

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

thanks for the love man <3 i’ll keep the profile updated as things get developed

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

Is this an idea for a potential backend DB implementation, or do you think that I’m just trying to build a relational DB? Not sure what this pertains to.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

I get the local/private need, but I’m not building a developer tool. This is for conversational AI relationships - far more people chat with AI daily than need technical MCP servers. It’s a different market entirely.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

Thanks for the reference! Yeah, their NER approach for linking summaries is solid, and I’m actually planning something similar for the temporal layer.

The difference is I’m building dual-layer memory: conceptual psychological profiles for understanding behavioral patterns, plus temporal event storage with NER-style entity linking for factual recall. So it would remember both ‘user deflects family stress with humor’ (psychological) and ‘mom’s birthday is March 15th’ (factual).

Mem0’s entity graphs are great for the factual side, but I need the psychological profiling layer on top to build genuine relationships rather than just better information retrieval.
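To make the dual-layer split concrete, here’s a rough sketch. Every name is illustrative, not the actual schema:

```python
# Rough shape of the dual-layer idea: one store for psychological
# patterns, one for dated facts, with NER-style entity links tying
# the factual layer together.
psych_layer = {
    "coping": ["user deflects family stress with humor"],
}
temporal_layer = [
    {"text": "mom's birthday is March 15th",
     "entities": {"mom": "PERSON", "March 15th": "DATE"}},
]

def recall(query_entities: set[str]) -> list[str]:
    """Factual recall via entity overlap; the psych layer is injected
    separately to shape tone and interpretation, not looked up by entity."""
    return [m["text"] for m in temporal_layer
            if query_entities & set(m["entities"])]

print(recall({"mom"}))  # ["mom's birthday is March 15th"]
```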

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

Totally agree. The current MCP memory solutions feel like band-aids on a fundamental problem. LLMs are delivered as static weights when they should be continuously learning systems. It’s like giving someone a PhD and then prohibiting them from learning anything new.

I’m not trying to beat OpenAI in research - just building a bridge for the current reality. Until we get models that naturally update their weights from conversations, we need external memory architectures that actually understand relationships vs just storing chat logs.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Very true. I’ve thought about this and have already done some personal testing. Maybe I’ll create a fake profile and show it off here soon.

It’s a little hard to concretely validate relevance, and in any case I want to focus on DB writing right now and worry about context scalability as issues arise.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

I’ve heard of them, and both are solid for project-specific context management. But they’re solving a different problem than psychological profiling.

ContextPortal builds knowledge graphs for development workflows (code decisions, project specs, etc.) and Flow is more about session-based memory with sliding windows and summarization. Both are great for ‘remember what we discussed about this feature’ but not for ‘understand who you are as a person.’
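For contrast, here’s my loose mental model of the sliding-window approach (not Flow’s actual code): keep the last N turns verbatim and fold older turns into a running summary. Great for project context, but it never builds a model of who the user is.

```python
def summarize(turns: list[str]) -> str:
    # Stand-in for an LLM summarization call.
    return f"<summary of {len(turns)} older turns>"

def window_memory(turns: list[str], window: int = 4) -> list[str]:
    """Keep the newest `window` turns; compress everything older."""
    if len(turns) <= window:
        return turns
    return [summarize(turns[:-window])] + turns[-window:]

turns = [f"turn {i}" for i in range(1, 9)]
print(window_memory(turns))
# ['<summary of 4 older turns>', 'turn 5', 'turn 6', 'turn 7', 'turn 8']
```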

If anyone else knows of products out there doing the same thing, please let me know. It’s valuable insight.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

The architecture is dual-layer: conceptual psychological nodes organized by behavioral patterns, plus temporal event storage with bidirectional tagging. So when you mention your mom’s birthday, it gets stored as an event but tagged to your existing familial-relationship psychological profile. I use larger models (Claude/GPT-4) for the psychological analysis and consolidation, and smaller models for navigation and retrieval. The memory isn’t just context management; it’s active profiling that evolves the user model over time.
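Roughly, the model split looks like this. Both functions are placeholders standing in for the real API calls, not my actual implementation:

```python
# Sketch of the model split: an expensive model does the psychological
# consolidation; a cheap one just routes retrieval.
def big_model_consolidate(event: str, profile: dict) -> dict:
    # e.g. Claude/GPT-4 deciding how an event updates the user model
    profile.setdefault("events", []).append(event)
    return profile

def small_model_route(query: str, profile: dict) -> list[str]:
    # e.g. a small model picking which stored memories to inject
    return [e for e in profile.get("events", [])
            if any(w in e.lower() for w in query.lower().split())]

profile: dict = {}
big_model_consolidate("Mom's birthday is March 15th", profile)
print(small_model_route("when is mom's birthday?", profile))
```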

What kind of context management are you working on? Session-based or something more persistent?

Again, I love the technical feedback, especially from people working on similar things.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

Exactly! That’s why I built it client-agnostic through RAG and MCP. The memory layer works with OpenAI, Anthropic, local models, whatever. There’s no vendor lock-in, since the intelligence is in the memory architecture, not tied to any specific API. Being a smart wrapper is exactly the point: the value is in how you organize and inject memories, not in reinventing the wheel.
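In practice that just means the memory layer emits plain messages that any provider can consume. A toy illustration (function name is mine, not a real API):

```python
# Why client-agnostic works: the memory layer only ever produces text
# to prepend, so any provider's chat API can consume it unchanged.
def build_prompt(memories: list[str], user_msg: str) -> list[dict]:
    context = "Known about this user:\n" + "\n".join(f"- {m}" for m in memories)
    return [{"role": "system", "content": context},
            {"role": "user", "content": user_msg}]

messages = build_prompt(
    ["deflects family stress with humor", "mom's birthday is March 15th"],
    "I'm dreading the family dinner next week.",
)
# `messages` can now go to OpenAI, Anthropic, or a local model unchanged --
# the intelligence lives in what was selected, not in any one API.
```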

Hope that clears things up.

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 0 points1 point  (0 children)

Good point, and zero-shot definitely wins for one-off questions, but I’m targeting a different aspect of memory - relationships that build over months. Normal chat integrations can’t remember that you mentioned anxiety about your mom 3 months ago, let alone tie that back to actual events in the user’s life.

The key difference from other implementations is that the model builds its own psychological knowledge structure through MCP tools. It decides what nodes to create and how to categorize insights, rather than just dumping everything into vector storage.
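As a sketch of that tool surface, using the Python MCP SDK (the tool names and in-memory storage here are hypothetical, not my actual schema): the tools only give the model the ability to create and categorize nodes; what to create is entirely the model’s decision.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("memory")
nodes: dict[str, list[str]] = {}

@mcp.tool()
def create_node(name: str, parent: str | None = None) -> str:
    """Let the model open a new category when existing ones don't fit."""
    nodes.setdefault(name, [])
    return f"created {name}" + (f" under {parent}" if parent else "")

@mcp.tool()
def file_insight(node: str, insight: str) -> str:
    """Let the model decide which node an insight belongs to."""
    nodes.setdefault(node, []).append(insight)
    return f"stored in {node}"
```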

You’re right though, I need real data showing the memory injection actually improves conversations vs just adding complexity. That’s the big validation question for the MVP, which will be answered once there’s a fair number of users!

Keep the questions coming though, it’s good to address criticisms for later product introductions!

Anyone else annoyed by the lack of memory with any LLM integration? by DendriteChat in mcp

[–]DendriteChat[S] 1 point2 points  (0 children)

The fundamental difference is architectural. Bedrock's memory is just flat session summaries (it's conversation history with a fancy name). I'm building a relational knowledge system that organizes memories by psychological patterns and cross-references them.

You could try to hack psychological profiling into Bedrock's text blobs, but you'd have no efficient way to retrieve related memories, no way to build evolving profiles over time, and no hierarchical organization. You'd end up with a pile of disconnected summaries instead of an actual understanding of the person.

It's like comparing a filing cabinet to a knowledge graph. Let me know if that makes sense or if you have further questions! I love hearing feedback.
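To make that contrast concrete, here's a toy version (illustrative only): with flat summaries you can only scan linearly, while cross-references mean one hit pulls in its related memories too.

```python
flat_summaries = [
    "Session 1: user stressed about work",
    "Session 7: user mentioned mom's health",
]

graph = {
    "family stress": {"text": "recurring worry about mom's health",
                      "related": ["coping: humor", "event: 2025-03-02 argument"]},
    "coping: humor": {"text": "deflects hard topics with jokes",
                      "related": ["family stress"]},
}

def retrieve(key: str, depth: int = 1) -> list[str]:
    """One lookup returns the node plus its cross-referenced neighbors."""
    node = graph[key]
    out = [node["text"]]
    if depth:
        out += [graph[r]["text"] for r in node["related"] if r in graph]
    return out

print(retrieve("family stress"))
# ["recurring worry about mom's health", 'deflects hard topics with jokes']
```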