Data Collector Node
Allows a chatbot to collect information
This node allows a chatbot to collect information by asking the user to provide specific pieces of information (e.g., name, email, etc.).
Node Inputs
- Query: The query given by the user
  - Type: Text
- Field: The field to be collected
  - Type: Text
- Description: A description of the field to be collected
  - Type: Text
- Example: An example of the field to be collected
  - Type: Text

If Auto Generate Questions is selected:
- Prompt: Specific instructions for how the LLM should collect the information
  - Type: Text
Node Parameters
In the gear:
- Auto Generate Questions: If checked, the node outputs questions in successive order until all fields are successfully collected. If unchecked, the node outputs the collected data (often passed to an LLM that is prompted to ask the user successive questions, along with specific instructions for after all fields are collected), e.g.:
{'Field1': 'Collected_Data', 'Field2': 'Collected_Data'}
  - Type: Checkbox

If Auto Generate Questions is selected:
- LLM: The model provider.
  - Type: Dropdown
- Model: The specific model for question generation.
  - Type: Dropdown
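When Auto Generate Questions is unchecked, the node emits its collected data as a dict-style string like the example above. As a sketch of how downstream logic might consume that output (the field names and the empty-string convention for an uncollected field are assumptions for illustration, not documented behavior):

```python
import ast

# Hypothetical output of the node with Auto Generate Questions unchecked:
# a dict of field -> collected value, where Field2 has not been collected yet.
collected = "{'Field1': 'Collected_Data', 'Field2': ''}"

# The string uses Python-style single quotes, so ast.literal_eval
# (rather than json.loads) parses it safely without executing code.
data = ast.literal_eval(collected)

# A downstream step might check which fields still need to be asked for.
missing = [field for field, value in data.items() if not value]
print(missing)  # ['Field2']
```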
Node Outputs
- Collected Data: The data collected by the node
  - Type: Text
  - Example usage: {{data_collector_0.collected_data}}

If Auto Generate Questions is selected:
- Question: The generated question to ask the user
  - Type: Text
  - Example usage: {{data_collector_0.question}}
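The {{node_name.output}} expressions above are template placeholders that the pipeline fills in at run time with the named node's output. A minimal sketch of that substitution (the `outputs` values here are made-up examples, and the real pipeline's resolution logic is not shown in this document):

```python
import re

# Hypothetical resolved outputs for a running pipeline.
outputs = {
    "data_collector_0.collected_data": "{'Name': 'John Doe'}",
    "data_collector_0.question": "What is your name?",
}

def render(template: str) -> str:
    # Replace each {{node.output}} placeholder with its resolved value.
    return re.sub(r"\{\{(.+?)\}\}", lambda m: outputs[m.group(1).strip()], template)

print(render("Collected so far: {{data_collector_0.collected_data}}"))
# Collected so far: {'Name': 'John Doe'}
```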
Example
The example below is a pipeline that uses the Data Collector node to collect information from a user within a chat conversation. During the chat flow, the user is first asked for their name. After the user provides their name, the chat flow answers questions based on context from a knowledge base.
- Input Node: The user query
  - Type: Text
  - Node Name: Question
- Knowledge Base Node: The context to be provided to the LLM
  - Search Query: {{Question.text}}
- Data Collector Node: Collects the information from the user
  - In the gear of the Data Collector node, turn off "Auto Generate Questions". When it is off, the node passes a JSON of collected data to connected nodes. See the System prompt on the LLM node below for how to prompt the LLM correctly.
  - Query: {{Question.text}}
  - Field: Name
  - Description: The name of the user
  - Example: John Doe
- Chat Memory Node: Provides chat history to the LLM
  - Memory Type: Token Buffer
- LLM Node: Uses the collected information to generate a response
  - System (instructions): If you receive a Name (e.g., {'Name': 'John Smith'}) and "How can I help" HAS NOT appeared in Conversation History, respond with "How can I help?". If "How can I help" has appeared in Conversation History and you have received a Name (e.g., {'Name': 'John Smith'}), answer the Question based on Context (if you are unable to answer the Question based on Context, respond with "I am unable to answer the Question."). If you do not receive a Name (e.g., Name: {'Name': ''}), ask the user: "What is your name?" Please check again that you have followed all the instructions.
  - Prompt: Question: {{Question.text}} Context: {{knowledge_base_0.chunks}} Name: {{data_collector_0.collected_data}} History: {{chat_memory_0.memory}}
- Output Node: Outputs the response
  - Output: {{openai_0.response}}
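The system prompt on the LLM node asks the model to branch on three conditions: whether a Name has been collected, whether "How can I help" already appears in the history, and whether the Question is answerable from Context. As an illustrative sketch only (the real decision is made by the LLM at run time, and `question_answerable` stands in for whether the Context actually covers the Question), that branching looks like:

```python
import ast

def respond(name_field: str, history: str, question_answerable: bool) -> str:
    """Mirror the branching the system prompt asks the LLM to follow."""
    # The Data Collector passes a dict-style string, e.g. "{'Name': 'John Smith'}".
    data = ast.literal_eval(name_field) if name_field.strip() else {}
    if not data.get("Name"):
        return "What is your name?"
    if "How can I help" not in history:
        return "How can I help?"
    if question_answerable:
        return "<answer based on Context>"
    return "I am unable to answer the Question."

print(respond("{'Name': 'John Smith'}", "", True))  # How can I help?
```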