Basically, the more VRAM you have, the larger the context window you can afford, and the better the model's working "memory" of the conversation. Otherwise you'd have a bot that can only contextualize the last couple of messages.
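To make the point concrete, here's a minimal sketch of that budget tradeoff: a chat backend with a small token budget can only keep the newest messages that fit, and everything older falls out of context. The helper name and the word-count "tokenizer" are made up for illustration; a real system would use an actual tokenizer.

```python
def trim_history(messages, budget):
    """Keep only the newest messages whose combined 'token' count fits the budget."""
    kept = []
    used = 0
    for msg in reversed(messages):  # walk newest-first
        cost = len(msg.split())     # crude stand-in for a real token count
        if used + cost > budget:
            break                   # older messages no longer fit: forgotten
        kept.append(msg)
        used += cost
    return list(reversed(kept))     # restore chronological order

history = [
    "hi there",
    "tell me about VRAM",
    "what limits context",
]
# With a budget of 8 "tokens", the oldest message gets dropped:
print(trim_history(history, 8))  # ['tell me about VRAM', 'what limits context']
```

A bigger budget (more memory) means the loop runs further back before breaking, i.e. the bot "remembers" more of the thread.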
Hmm, if only there was some hardware analogue for long-term memory.
What are you trying to say? Do you understand what the problem is?