Personalized Email Generator

Scale personal outbound email

Scenario: We are a strategy consulting firm doing cold outreach as part of our sales process. We use email templates, but our response rates are low. We decide to build a pipeline that generates a personalized email when we submit the URL of the target client’s website: the pipeline reads the website and tailors the email to the company we are reaching out to.

At a high level, we need to create the following pipeline components:

  1. A way for the user to input a website URL into the pipeline. The pipeline should be able to embed the contents of the website into a semantic database (a database that allows for semantic searches / queries).

  2. A way for the LLM to:

    • receive query results (context) from the semantic database and

    • generate a sentence to be used in an email template, personalized to information found at the URL

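Conceptually, the finished pipeline boils down to these two pieces of logic. The Python skeleton below is only an illustration of that split (the function names are ours, not node names from the builder); the sketches under Step 2 and Step 3 fill in the bodies.

```python
# Illustrative skeleton of the two pipeline components described above.
def embed_website(url: str):
    """Component 1: scrape the website and embed its contents into a semantic database."""
    ...  # fleshed out in the Step 2 sketch

def generate_sentence(context: str, question: str) -> str:
    """Component 2: have the LLM write one personalized sentence from the retrieved context."""
    ...  # fleshed out in the Step 3 sketch
```
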
Step 1 - Open the Pipeline Builder

Click "New" >> "Create Pipeline" within the "Pipeline" tab

Step 2 - Allow for input of a URL and for the contents of the website to be embedded into a semantic database

  1. We use an Input node. The "type" of the Input node defaults to text, which is what we want (a URL is text). We connect the Input node to a

  2. URL loader node, which converts the contents of the website into a format that a semantic database can understand. We connect the URL loader node to a

  3. Semantic search node (through the "documents" edge), which feeds the scraped content into a semantic search database (a database that allows for semantic-based queries).
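
A rough Python analogue of what this Input -> URL loader -> Semantic search chain does is sketched below. It assumes the requests, beautifulsoup4, and openai packages; the builder's own loader and semantic database may work differently under the hood.

```python
import math

import requests
from bs4 import BeautifulSoup
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def load_url(url: str) -> list[str]:
    """URL loader node: fetch the page and split its visible text into chunks."""
    html = requests.get(url, timeout=30).text
    text = BeautifulSoup(html, "html.parser").get_text(separator=" ", strip=True)
    return [text[i:i + 1000] for i in range(0, len(text), 1000)]  # naive fixed-size chunking

def embed(texts: list[str]) -> list[list[float]]:
    """Turn text into vectors so it can be compared semantically."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [item.embedding for item in resp.data]

def semantic_search(index: list[tuple[str, list[float]]], question: str, top_k: int = 3) -> str:
    """Semantic search node: return the stored chunks most similar to the question."""
    q = embed([question])[0]

    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
        return dot / norm

    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return "\n\n".join(chunk for chunk, _ in ranked[:top_k])

url = "https://example.com"               # Input node: the user-supplied URL (placeholder)
chunks = load_url(url)                    # URL loader node ("documents" edge)
index = list(zip(chunks, embed(chunks)))  # in-memory stand-in for the semantic database
```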

Step 3 - Connect with LLM and insert personalized sentence into email template

Connect with LLM

  1. We use the OpenAI LLM (found in the LLM tab).

  2. Within the prompt field of the LLM, we craft the prompt (the sketch after this list shows an equivalent call in code):

    • Company Context: We create a variable for context that we receive from the semantic search node ("{{Context}}"). We label the variable as "Company Context".

    • User Question: We create a variable for the user question that we receive from the input node ("{{Question}}"). We label the variable as "Question".

  3. Within the system field of the LLM, we craft the instructions (the system field within the OpenAI node is used to specify how the LLM should behave). We mention the following details:

    • The LLM is a personalized sentence generator for a consulting firm; it should generate a personalized sentence that explains what the consulting firm can do to help the company.

    • Instruct the LLM to limit its response to one sentence.

    • Present the response in the first person so it fits in the email flow.

    • We give an example, which gives the model more specificity about what it needs to do.
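
The sketch below shows an equivalent LLM call in Python, assuming the openai package. The system text, the example sentence, and the model name are our own illustrative assumptions, not the exact values used in the builder.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# System field: how the LLM should behave (persona, one-sentence limit, first person, example).
SYSTEM = (
    "You are a personalized sentence generator for a strategy consulting firm. "
    "Given context about a company, respond with exactly one sentence, written in the "
    "first person, explaining what the consulting firm can do to help that company. "
    "Example: 'I noticed you recently expanded into new markets, and we have helped "
    "several companies build go-to-market strategies for exactly that kind of growth.'"
)

def personalized_sentence(context: str, question: str) -> str:
    """Prompt field: the {{Context}} and {{Question}} variables become plain text here."""
    prompt = f"Company Context: {context}\n\nQuestion: {question}"
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; use whichever model the OpenAI node exposes
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": prompt},
        ],
    )
    return resp.choices[0].message.content.strip()
```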

Insert personalized sentence into email template

As we are only personalizing one sentence of the email, we place the entire email template in a Text node (within the "General" tab). Where we want the personalized sentence to appear, we include a variable "{{Personalized_Message}}". We then connect the output of the LLM node (the "response" edge) to the "Personalized_Message" handle on the Text node.

Finally, we attach an output node (from the "General" tab) to the output edge of the Text node.
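
In plain Python, the Text node and Output node amount to substituting the LLM's response into a fixed template, as sketched below. The email wording itself is a placeholder, not text from the guide.

```python
# Text node: the LLM's "response" edge fills the {{Personalized_Message}} slot of an
# otherwise fixed email template; the rendered email is what the Output node returns.
EMAIL_TEMPLATE = """\
Hi there,

{{Personalized_Message}}

Would you be open to a short call next week to explore this further?

Best regards,
The Team
"""

def render_email(personalized_message: str) -> str:
    """Substitute the LLM response into the template."""
    return EMAIL_TEMPLATE.replace("{{Personalized_Message}}", personalized_message)

print(render_email("I noticed the new product line on your website, and we have helped "
                   "similar firms scale exactly that kind of launch."))  # Output node
```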
