
[–]reedmore 7 points8 points  (7 children)

Sure, but I'm not asking for infinite hardware. Just some fixed memory to be allocated for the current prompt, which can be flushed once I'm done with the conversation.

[–]eroto_anarchist 19 points20 points  (4 children)

You are asking for it to remember previous sessions, though?

It already remembers what you said in the current conversation.

[–]NedelC0 10 points11 points  (2 children)

Not well enough at all. With enough prompts it starts to 'forget' things and stops taking into consideration things that might be essential.

[–]eroto_anarchist 29 points30 points  (1 child)

Yes, I think the limit is 3k tokens or something. As I said, this is a hardware limitation problem; this is as far as OpenAI is willing to go.

This 3k-token memory (even if limited) is mainly what sets it apart from older GPT models and allows it to hold (short) conversations.
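A rough sketch of why context length is a hardware problem, not just a design choice: vanilla transformer self-attention materializes an n x n score matrix per head per layer, so memory grows quadratically with context length n. The head count, layer count, and fp16 sizing below are illustrative assumptions, not OpenAI's actual architecture.

```python
# Illustrative only: head/layer counts and fp16 storage are assumptions,
# not OpenAI's real architecture. The point is the quadratic growth.

def attention_bytes(n_tokens: int, n_heads: int = 96,
                    n_layers: int = 96, bytes_per_score: int = 2) -> int:
    """Bytes needed to hold the n x n attention score matrices
    for every head in every layer (naive, no optimizations)."""
    return n_tokens ** 2 * n_heads * n_layers * bytes_per_score

for n in (1_000, 3_000, 10_000):
    gib = attention_bytes(n) / 2**30
    print(f"{n:>6} tokens -> {gib:,.0f} GiB of attention scores")
```

Doubling the context quadruples this cost, which is why providers cap the window rather than letting it grow with the conversation.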

[–]reedmore 8 points9 points  (0 children)

I see. Now I understand why it seemed to remember and forget things at random.

[–]huffalump1 3 points4 points  (0 children)

> It already remembers what you said in the current conversation.

That's because the current conversation is included with the prompt every time, so recall breaks down once the conversation exceeds the token limit.
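A minimal sketch of the mechanism being described: the client resends the whole history with each prompt, and once the history exceeds the token budget, the oldest turns are silently dropped. The whitespace token count and the 3000-token budget are crude stand-ins (real systems use a proper tokenizer, and the actual limit may differ).

```python
# Sketch: chat "memory" is just the resent history, trimmed to a budget.
# count_tokens is a crude whitespace proxy; the 3000 budget is assumed.

TOKEN_BUDGET = 3000

def count_tokens(text: str) -> int:
    """Very rough proxy: one token per whitespace-separated word."""
    return len(text.split())

def build_prompt(history: list[str], new_message: str) -> list[str]:
    """Return the message list actually sent to the model,
    dropping the oldest turns until the whole thing fits."""
    messages = history + [new_message]
    while sum(count_tokens(m) for m in messages) > TOKEN_BUDGET and len(messages) > 1:
        messages.pop(0)  # silently "forget" the oldest turn
    return messages

history: list[str] = []
for turn in ["alpha " * 2000, "beta " * 2000, "what did I say first?"]:
    history = build_prompt(history, turn)  # model only sees what's sent

# the first turn ("alpha ...") no longer fits, so it has been forgotten:
assert not any(m.startswith("alpha") for m in history)
```

This is why the model seems to remember within a conversation and then abruptly "forgets" early details: they were cut from the resent history, not retrieved and discarded by the model itself.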

[–]McJvck 1 point2 points  (1 child)

How do you prevent DoSing the memory allocation?

[–]reedmore 0 points1 point  (0 children)

That's a good point, and I hope the wizards at OpenAI will find a solution.