Utilize LLMs in your pipelines
Type `{{` in any text field to open the variable builder.
The LLM node exposes the following output variables:

| Type | Variable |
| --- | --- |
| Text (or Stream<Text> if streaming is enabled) | `{{openai_0.response}}` |
| Integer | `{{openai_0.tokens_used}}` |
| Integer | `{{openai_0.input_tokens}}` |
| Integer | `{{openai_0.output_tokens}}` |
| Decimal | `{{openai_0.credits_used}}` |
Example: answer a user question using chunks retrieved from a knowledge base.

- Pipeline input: `{{input_0.text}}`
- System prompt: Answer the question based on the context.
- Prompt: `Question: {{input_0.text}} Context: {{knowledge_base_1.chunks}}`
- Output: `{{openai_0.response}}`
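To illustrate what the variable builder does under the hood, here is a minimal sketch of `{{node.field}}` substitution. The `render_template` helper and the sample `outputs` dictionary are hypothetical, not part of the platform's API; they only show how placeholders like `{{input_0.text}}` resolve to upstream node outputs.

```python
import re

def render_template(template: str, outputs: dict) -> str:
    """Replace {{node.field}} placeholders with values from node outputs.

    `outputs` maps node names (e.g. "input_0") to dicts of their fields.
    This is an illustrative sketch, not the platform's actual resolver.
    """
    def substitute(match: re.Match) -> str:
        node, field = match.group(1), match.group(2)
        return str(outputs[node][field])

    # \w+ covers names like input_0 and fields like text or chunks
    return re.sub(r"\{\{(\w+)\.(\w+)\}\}", substitute, template)

# Hypothetical upstream outputs for the example pipeline above
outputs = {
    "input_0": {"text": "What is a vector store?"},
    "knowledge_base_1": {"chunks": "A vector store indexes embeddings."},
}
prompt = "Question: {{input_0.text}} Context: {{knowledge_base_1.chunks}}"
print(render_template(prompt, outputs))
```

Running this prints the fully resolved prompt, which is what the LLM node would receive as input.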