Chat Memory Node
Retrieve previous conversation history
The Chat Memory node retrieves the past conversation history and passes it to connected nodes. Chat Memory nodes are most often connected to LLM nodes, as LLMs do not inherently have memory; this allows the LLM to reference previous turns of the conversation.
Node Inputs
The chat memory node does not have any inputs.
Node Parameters
- Memory Type: select the memory type
- Type: Dropdown
Overview of Memory Types
- Full - Formatted: Returns the entire previous chat history, with user messages prefixed with “Human” and output messages prefixed with “AI”. Useful for handling short conversations, but may run into token limits as conversations grow longer.
- Full - Raw: Returns a Python list with elements in the following format: `{"type": type, "message": message}`, where `type` is “input” (i.e., Human) or “output” (i.e., AI) and `message` contains the contents of the message. Useful for developers looking to execute custom logic on their chat history using Transformations.
- Message Buffer: Returns a set number of previous consecutive messages. This number defaults to 10 and can be changed by right-clicking the Chat Memory node and clicking “Details”. Both Human messages and AI messages count toward the limit.
- Token Buffer: Returns previous consecutive messages until adding an additional message would cause the total history size to be larger than the Max Tokens. The default number of Max Tokens is 2048 and can be changed by right-clicking the Chat Memory node and clicking “Details”.
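The behaviors described above can be illustrated with a short sketch. This is not the platform's implementation: the function names are hypothetical, and the word-count tokenizer stands in for whatever tokenizer the platform actually uses. It operates on a history list in the `{"type": ..., "message": ...}` shape returned by Full - Raw.

```python
# Illustrative sketch of the memory types, assuming a raw history list
# in the {"type": ..., "message": ...} format described above.

def full_formatted(history):
    """Full - Formatted: label each message with "Human" or "AI"."""
    labels = {"input": "Human", "output": "AI"}
    return "\n".join(f"{labels[m['type']]}: {m['message']}" for m in history)

def message_buffer(history, limit=10):
    """Message Buffer: keep the last `limit` messages (Human and AI both count)."""
    return history[-limit:]

def token_buffer(history, max_tokens=2048, count_tokens=lambda s: len(s.split())):
    """Token Buffer: keep the most recent messages whose combined size
    stays within max_tokens. Word count is a stand-in tokenizer."""
    kept, total = [], 0
    for m in reversed(history):  # walk newest-to-oldest
        cost = count_tokens(m["message"])
        if total + cost > max_tokens:
            break  # adding this message would exceed the budget
        kept.append(m)
        total += cost
    return list(reversed(kept))  # restore chronological order

history = [
    {"type": "input", "message": "Hello!"},
    {"type": "output", "message": "Hi, how can I help?"},
    {"type": "input", "message": "What is chat memory?"},
]
print(full_formatted(history))
```

Note that both buffers trim from the oldest end, so the most recent exchanges are always preserved.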
Node Outputs
- Memory: The conversation history in the selected memory type
- Type: Text
- Example usage: {{chat_memory_0.memory}}
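To illustrate how such a template variable could be expanded into an LLM prompt: the double-brace syntax is the platform's, but the substitution code below is a hypothetical sketch, not the platform's own expansion logic.

```python
import re

# Hypothetical sketch of double-brace template expansion. Variable names
# such as chat_memory_0.memory follow the node_name.output pattern above.
def expand(template, values):
    return re.sub(r"\{\{(.*?)\}\}",
                  lambda m: str(values[m.group(1).strip()]),
                  template)

prompt = expand(
    "Conversation so far:\n{{chat_memory_0.memory}}\n\nAnswer the latest question.",
    {"chat_memory_0.memory": "Human: Hello!\nAI: Hi, how can I help?"},
)
print(prompt)
```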
Example
The example below shows a simple chatbot pipeline. The Chat Memory node is connected to the LLM node to pass the conversation history to the LLM on every run. The LLM's system prompt instructs it to “Use conversation history when relevant”.