OpenAI Workflows
OpenAI LLM Response
This workflow demonstrates how to use the OpenAI LLM SDK to generate a response to a user's message.
Note
- Make sure that you have the OPENAI_API_KEY environment variable set in preferences.
- Make sure that the output mode in the Text Node is set to LLM Content.
- The exact index to access the response text from a non-reasoning OpenAI model is ['output'][0]['content'][0]['text']. This is the value that you need to pass to the query with index node, as shown in the sketch below.
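
For reference, the same lookup can be reproduced directly against the OpenAI Python SDK. This is a minimal sketch rather than the workflow itself: the model name and prompt are placeholders, and the response object is converted to a plain dict so the same ['output'][0]['content'][0]['text'] index applies.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Plain (non-reasoning) text request via the Responses API.
response = client.responses.create(
    model="gpt-4o-mini",  # placeholder model
    input="Write a one-sentence greeting.",
)

# Convert the SDK object to a dict so the index path matches the workflow.
data = response.model_dump()

# The index the query with index node expects for a non-reasoning model:
text = data["output"][0]["content"][0]["text"]
print(text)
```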

OpenAI Vision Response
This workflow demonstrates how to use the OpenAI SDK and the vision capabilities of an LLM to generate a response to a user's message that includes an image.
Note
- Make sure that you have the OPENAI_API_KEY environment variable set in preferences.
- Make sure that the output mode in the Text and Image Nodes is set to LLM Content.
- The exact index to access the response text from a non-reasoning OpenAI model is ['output'][0]['content'][0]['text']. This is the value that you need to pass to the query with index node; see the sketch after this note.
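
As a rough equivalent outside the workflow, the sketch below shows how an image is attached to a Responses API request with the OpenAI Python SDK; the model name and image URL are placeholders, and the same index path applies to the result.

```python
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# A vision request: text plus an image in the same user turn.
response = client.responses.create(
    model="gpt-4o-mini",  # placeholder model
    input=[
        {
            "role": "user",
            "content": [
                {"type": "input_text", "text": "Describe this image in one sentence."},
                {"type": "input_image", "image_url": "https://example.com/photo.jpg"},  # placeholder URL
            ],
        }
    ],
)

data = response.model_dump()
# Same index the query with index node uses for a non-reasoning model:
print(data["output"][0]["content"][0]["text"])
```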

OpenAI Response with Schema
This workflow demonstrates how to use the OpenAI SDK to generate a structured response to a user's message using a schema.
Note
- Make sure that you have the OPENAI_API_KEY environment variable set in preferences.
- Make sure that the output mode in the Text Node is set to LLM Content.
- The exact index to access the response from a non-reasoning OpenAI model is ['parsed_output']. This is the value that you need to pass to the query with index node, as illustrated in the sketch below.
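
Outside the workflow, the structured output can be reproduced with the SDK's parse helper. The schema below is a hypothetical example; in the workflow the parsed value is exposed under the ['parsed_output'] key, while the SDK surfaces it as output_parsed.

```python
from openai import OpenAI
from pydantic import BaseModel

# Hypothetical schema; substitute the schema configured in the workflow.
class CityFacts(BaseModel):
    city: str
    country: str
    population: int

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# The parse helper validates the model output against the schema.
response = client.responses.parse(
    model="gpt-4o-mini",  # placeholder model
    input="Give me basic facts about Tokyo.",
    text_format=CityFacts,
)

# The workflow's query with index node reads the equivalent value
# from the ['parsed_output'] key of the node output.
print(response.output_parsed)
```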

OpenAI Vision Response with Schema
This workflow demonstrates how to use the OpenAI SDK and the vision capabilities of an LLM to generate a structured response, constrained by a schema, to a user's message that includes an image.
Note
- Make sure that you have the OPENAI_API_KEY environment variable set in preferences.
- Make sure that the output mode in the Text and Image Nodes is set to LLM Content.
- The exact index to access the response from a non-reasoning OpenAI model is ['parsed_output']. This is the value that you need to pass to the query with index node; a sketch follows this note.
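
A similar sketch combines the image input with the schema. Again, the schema and image URL are hypothetical placeholders, and the workflow exposes the parsed result under the ['parsed_output'] key.

```python
from openai import OpenAI
from pydantic import BaseModel

# Hypothetical schema for the structured description of the image.
class ImageDescription(BaseModel):
    subject: str
    colors: list[str]

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

response = client.responses.parse(
    model="gpt-4o-mini",  # placeholder model
    input=[
        {
            "role": "user",
            "content": [
                {"type": "input_text", "text": "Describe the main subject and its colors."},
                {"type": "input_image", "image_url": "https://example.com/photo.jpg"},  # placeholder URL
            ],
        }
    ],
    text_format=ImageDescription,
)

# The workflow's query with index node reads this value from ['parsed_output'].
print(response.output_parsed)
```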
