Recursive Summarization Enables Long-Term Dialogue Memory in Large Language Models

In this article, we propose an approach to maintaining long-term conversation history in dialogue systems, addressing the memory overflow that results from storing every turn of past dialogue verbatim. Our method uses a large language model (LLM) to recursively summarize past conversations: after each session, the most recent interactions are folded into the existing summary, producing a single, continually updated historical record that stays compact no matter how long the conversation grows.

Background

Conversational dialogue systems are now common in applications such as virtual assistants, language translation, and customer service. These systems face a significant challenge in managing conversation history: storing every detail of past interactions eventually overflows the memory available to the model, and the problem grows worse as the number of conversation turns increases.
Existing approaches address this issue either by discarding earlier dialogue or by offloading summaries of past interactions to an external database. Neither is a complete solution: discarding history loses information the system may later need, and both strategies can compromise the quality of the dialogue.
Our method instead uses the LLM itself to generate summaries of past conversations, merging recent interactions and earlier summaries into a single cohesive record that never overflows memory. This allows efficient management of conversation history while preserving the accuracy and relevance of earlier interactions.

Methodology

Our proposed method maintains two stores: (1) the raw conversation history for the current prompt, and (2) summarized histories of conversations under previous prompts. This dual-storage design lets the system manage conversation history efficiently without compromising dialogue quality.
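To make the dual storage concrete, here is a minimal sketch in Python. The structure and names (DialogueMemory, recent_turns, summary) are our illustration, not code from the paper:

```python
from dataclasses import dataclass, field

@dataclass
class DialogueMemory:
    """Sketch of the dual storage: raw turns for the current prompt,
    plus a running summary of everything older. Names are illustrative,
    not taken from the paper."""
    recent_turns: list[str] = field(default_factory=list)  # store (1): current session
    summary: str = ""                                      # store (2): summarized past

    def context_window(self) -> str:
        # What the model actually sees: the compact summary followed
        # by the verbatim recent turns.
        return (
            "Summary of earlier conversation:\n" + self.summary
            + "\n\nRecent turns:\n" + "\n".join(self.recent_turns)
        )
```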
When the prompt is updated, the content of the current conversation is combined with the previously summarized material to produce a new summary, so recent interactions and past summaries are folded into a single cohesive historical record.
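Viewed abstractly, this is a simple recurrence. Writing M_t for the summary after the t-th update and C_t for the conversation content collected since then, the update can be expressed (in our notation; the paper's may differ) as:

$$M_{t+1} = f_{\mathrm{LLM}}(M_t,\ C_t), \qquad M_0 = \varnothing$$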
Because each update consumes only the previous summary and the newest turns, the memory required stays roughly constant however long the conversation runs, which is what lets the LLM iterate this summarization indefinitely without overflow.
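A minimal sketch of this update step, assuming the DialogueMemory structure above and a generic llm callable that maps a prompt string to a completion string (both are illustrative stand-ins, not the paper's code):

```python
def update_memory(memory: DialogueMemory, llm) -> DialogueMemory:
    """One recursive-summarization step. `llm` is assumed to be any
    callable mapping a prompt string to a completion string, e.g. a
    thin wrapper around a chat-completion API."""
    prompt = (
        "Previous summary:\n" + memory.summary + "\n\n"
        "New dialogue turns:\n" + "\n".join(memory.recent_turns) + "\n\n"
        "Rewrite the summary so it also covers the new turns, keeping "
        "any facts a future reply might need."
    )
    new_summary = llm(prompt)
    # The new turns are now folded into the summary, so the raw log can
    # be cleared: memory use is bounded by the summary length alone.
    return DialogueMemory(recent_turns=[], summary=new_summary)
```

Calling update_memory whenever the prompt changes keeps the context passed to the model roughly constant in size, regardless of total conversation length.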

Results

Our proposed method was evaluated through extensive experiments, which demonstrated its effectiveness in maintaining conversation history without memory overflow. The system handled a large number of conversation turns while preserving the accuracy and relevance of earlier interactions.

Conclusion

In this article, we proposed an approach to maintaining conversation history in dialogue systems that addresses the memory overflow caused by continuously storing historical content. By using an LLM to recursively summarize past conversations, the method folds recent interactions and earlier summaries into a single cohesive record whose size stays bounded. We believe this approach has significant implications for improving the performance and efficiency of conversational dialogue systems across a range of applications.