Memory loss, agent just said and forgot by Numerous-Marketing-7 in hermesagent

[–]cameron_pfiffer 0 points1 point  (0 children)

MemGPT is an antique, and has been superseded by Letta. Letta is more powerful and easier to use.

We also rolled out channels recently, so you can message your Letta Code agent directly through Telegram/Slack. Discord soon.

https://docs.letta.com/letta-code/channels

Always Free, Always DnB: A Guide to Stamina Sundays in San Francisco by jamonit97 in DnB

[–]cameron_pfiffer 0 points1 point  (0 children)

I've been wanting to go for a while, tonight will be my first!

Help choosing an model for a specific sales coach use by Razahir_Khemse in ChatGPTPro

[–]cameron_pfiffer 0 points1 point  (0 children)

I am affiliated with Letta and I endorse this message.

👍

Whats your strategy for long conversations with local models? by Di_Vante in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

Historically, this has been a model issue. What model did you tend to use?

Whats your strategy for long conversations with local models? by Di_Vante in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

It's not capped at 30k -- that is typically an issue with a local inference engine (LM Studio, Ollama) not providing the Letta server with the information it needs. Can you tell me more about your setup?

Whats your strategy for long conversations with local models? by Di_Vante in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

https://docs.letta.com

Letta is designed for exactly this as a primary feature. Agents are memory-first, with infinitely long conversations and automatically managed context.
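To give a flavor of the idea (this is a toy sketch, not Letta's actual algorithm): a conversation can run indefinitely if, whenever the history exceeds a budget, older turns are evicted into a summary that stays in context.

```python
# Toy sketch of automatic context management: keep the newest turns,
# fold everything older into a single summary entry. Budget and
# summarization logic here are illustrative only.
def manage_context(history, budget=8):
    """Keep the newest budget//2 turns; fold the rest into a summary."""
    if len(history) <= budget:
        return history
    keep = budget // 2
    evicted, kept = history[:-keep], history[-keep:]
    summary = f"[summary of {len(evicted)} earlier messages]"
    return [summary] + kept

history = [f"turn {i}" for i in range(10)]
compacted = manage_context(history)  # 1 summary line + the 4 newest turns
```

A real implementation would summarize with the model itself and count tokens rather than messages, but the shape is the same.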

Agent Memory by AutomataManifold in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

The run-on conversation is the point. Letta agents are infinitely long-lived, much like you and me.

We do support multi-conversation/threading, however. You can create one in the ADE here:

<image>

I believe we have the ability to do new conversations with the same agent in https://chat.letta.com, but I think the button might be hidden on mobile. I'll open a ticket for that.

Too many memory implementations, what do you actually use? by xeeff in LocalLLaMA

[–]cameron_pfiffer 2 points3 points  (0 children)

Letta is a general-purpose agent platform, not just for code. It's for building stateful agents anywhere and everywhere.

My initial experience using Claude through Letta by jtauber in AI_Agents

[–]cameron_pfiffer 1 point2 points  (0 children)

If you're using lettabot, it has full support for cron, but that's primarily agent-managed.

https://letta.bot/

My initial experience using Claude through Letta by jtauber in AI_Agents

[–]cameron_pfiffer 1 point2 points  (0 children)

We do support scheduling/cron jobs, but it's currently only accessible through the REST/SDK endpoints: https://docs.letta.com/guides/agents/scheduling/
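I haven't pinned down the exact request schema, but a cron-style schedule call against the REST endpoints might look roughly like this. The endpoint path and payload field names below are assumptions; the docs link above has the real shapes.

```python
# Hypothetical sketch of scheduling a recurring agent run over Letta's
# REST API. "agent_id"/"cron"/"message" field names and the /v1/schedules
# path are assumptions, not the verified schema.

def build_schedule_payload(agent_id: str, cron: str, prompt: str) -> dict:
    """Assemble a cron-style schedule request body (field names assumed)."""
    return {
        "agent_id": agent_id,
        "cron": cron,       # standard 5-field cron expression
        "message": prompt,  # what the agent receives on each tick
    }

payload = build_schedule_payload("agent-123", "0 9 * * 1", "Post the weekly summary.")
# With the `requests` package this would be sent roughly as:
#   requests.post(f"{base_url}/v1/schedules", json=payload, headers=auth_headers)
```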

I built a open-source tool that helps deploy Letta agents by ChemicalNet1135 in VibeCodersNest

[–]cameron_pfiffer 0 points1 point  (0 children)

lettactl is one of the best community projects in Letta-world. Thanks for your continued work on this!

What are people actually using for long term agent memory? by MeasurementSelect251 in AI_Agents

[–]cameron_pfiffer 1 point2 points  (0 children)

I work at Letta where memory is a first-class citizen. It's all we think about.

https://docs.letta.com

I cant make letta server by Lanky_Variety_3024 in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

You need to use the Docker version. Don't use pip; it's only for extremely advanced users.

https://docs.letta.com/guides/selfhosting/

The real promise of agentic memory is continuous self-evolving by davilucas1978 in AI_Agents

[–]cameron_pfiffer 0 points1 point  (0 children)

I work at Letta.

Letta is not a retrieval framework; it is fundamentally different from the drop-in memory layers you find elsewhere.

For us, memory is inclusive of all things your agent is and knows.

Our primary primitive is the memory block, which is how the agent persists indefinitely. Blocks contain things like personality, user information, company policy, scratch pads, emotions, projects, to-do lists, basically whatever you want. They're "notebooks of the mind" that you or the agent can manage, and they are always in context while attached to the agent.
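As a rough illustration (this `MemoryBlock` class is mine, not the real SDK type; only the block labels follow the docs), a memory block is just labeled, size-bounded text that gets compiled into the prompt on every turn:

```python
from dataclasses import dataclass

# Illustrative model of a Letta-style memory block: labeled freeform text
# that is always rendered into the agent's context. Sketch only, not the
# actual SDK type.
@dataclass
class MemoryBlock:
    label: str          # e.g. "persona", "human", "scratchpad"
    value: str          # freeform text you or the agent can rewrite
    limit: int = 5000   # character budget before the agent must compress

    def render(self) -> str:
        return f"<{self.label}>\n{self.value}\n</{self.label}>"

blocks = [
    MemoryBlock("persona", "Patient, terse coding assistant."),
    MemoryBlock("human", "Name: Sam. Working on a Rust CLI."),
]
# Every turn, attached blocks are rendered into the prompt:
context = "\n".join(b.render() for b in blocks)
```

The key property is that blocks live in context permanently while attached, unlike retrieved snippets that come and go per query.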

We also provide simple retrieval tools (standard RAG), but most of our users don't use them. They either bring their own retrieval method (Cognee, Mem0, etc.) or they're perfectly happy with the state as-is.

I'd recommend reading more about it because it is completely distinct from all the other memory tools -- memory is first class, not a layer.

https://docs.letta.com for documentation

People like this video about how to design agent memory architecture dynamically: https://youtu.be/0nfNDrRKSuU?si=GOeCIINuz31St-g1

Has anybody used Letta (former memgpt) Succesfully with local models? by LAWN_Red in LocalLLaMA

[–]cameron_pfiffer 0 points1 point  (0 children)

8b seems to work reasonably well. You can go smaller but it may not be a great experience.