LiteLLM Workflows
LiteLLM LLM Response
This workflow demonstrates how to use LiteLLM to generate a response to a user's message.
Note
- Make sure that you have the respective environment variable set in preferences.
- Make sure that the output mode in the Text Node is set to LiteLLM Content.
- The exact index to access the response from a non-reasoning OpenAI response is ['choices'][0]['message']['content']. This is the value that you need to pass to the query with index node, as illustrated in the sketch below.
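A minimal Python sketch of the equivalent LiteLLM call and of the index used above, assuming the litellm package is installed, the respective API key environment variable (e.g. OPENAI_API_KEY) is set, and a non-reasoning model such as gpt-4o-mini; the model name and prompt are only examples:

```python
import litellm

# Request a plain text completion from the model.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a one-line greeting."}],
)

# ['choices'][0]['message']['content'] holds the generated text;
# this is the value to pass to the query with index node.
content = response["choices"][0]["message"]["content"]
print(content)
```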

LiteLLM Vision Response
This workflow demonstrates how to use LiteLLM to generate a response to a user's message using the vision capabilities of an LLM.
Note
- Make sure that you have the respective environment variable set in preferences.
- Make sure that the output mode in the Text and Image Nodes is set to LiteLLM Content.
- The exact index to access the response from a non-reasoning OpenAI response is ['choices'][0]['message']['content']. This is the value that you need to pass to the query with index node, as illustrated in the sketch after this list.
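A minimal Python sketch of the equivalent vision call, assuming the litellm package is installed, the respective API key environment variable is set, and a vision-capable, non-reasoning model such as gpt-4o-mini; the model name and image URL are placeholders:

```python
import litellm

# Send a text prompt together with an image, using the OpenAI-style
# multimodal message format that LiteLLM accepts.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe what is in this image."},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/image.png"},
                },
            ],
        }
    ],
)

# The same index as in the text workflow returns the model's answer.
content = response["choices"][0]["message"]["content"]
print(content)
```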
