[BUG] Using Azure OpenAI LLM produces error in result parsing
Describe the bug
Using Azure OpenAI in simple chat scenarios produces a code-level error, which also surfaces in the chat as TypeError: Cannot read properties of undefined (reading '0').
At least while the chat box is still visible, the actual result is displayed afterwards as a (separate?) message. Note that the latter often seems pretty odd as well, but this may be unrelated to this error (see screenshots).
After closing and reopening the chat window, only the error message is visible.
```
TypeError: Cannot read properties of undefined (reading '0')
    at NoOpOutputParser.parseResult (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:24:38)
    at NoOpOutputParser.parseResultWithPrompt (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:7:21)
    at LLMChain._getFinalOutput (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:93:55)
    at LLMChain._call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:123:42)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async LLMChain.call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/base.cjs:98:28)
    at async runPrediction (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:128:29)
    at async LLMChain_Chains.run (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:77:21)
    at async App.processPrediction (/usr/local/lib/node_modules/flowise/dist/index.js:783:19)
    at async /usr/local/lib/node_modules/flowise/dist/index.js:558:13
```
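For context on the trace: the failing frame is the output parser indexing into the generations array returned by the model. Below is a minimal sketch of that pattern (an approximation, not the actual langchain source); if the Azure call comes back with no generations, indexing throws exactly this TypeError:

```ts
// Simplified sketch of the failing step; names follow the stack trace,
// but the body is an approximation, not the actual langchain source.
interface Generation {
  text: string;
}

function parseResult(generations: Generation[]): string {
  // If the Azure call returns no generations (e.g. because the deployment
  // name or API version is wrong), `generations` arrives undefined here,
  // and indexing it throws the reported TypeError.
  return generations[0].text;
}

// Reproduces the reported error:
parseResult(undefined as unknown as Generation[]);
// TypeError: Cannot read properties of undefined (reading '0')
```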
To Reproduce
Steps to reproduce the behavior:
- Create a new chat flow from template "Simple LLM Chain"
- Replace OpenAI LLM with Azure OpenAI
- Configure access credentials for the LLM as per the docs
- Save and deploy
- See error
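For anyone trying to isolate this outside Flowise, here's a minimal sketch of the equivalent Azure configuration directly in langchain JS (0.0.x-era API, roughly matching Flowise 1.3.4; all values are placeholders, and exact import paths and class names vary by langchain version). A common cause of empty responses is passing the model id where Azure expects the deployment name:

```ts
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

// All values are placeholders -- substitute your own.
const model = new ChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  // "my-instance" as in https://my-instance.openai.azure.com
  azureOpenAIApiInstanceName: "my-instance",
  // The *deployment* name you chose in Azure, not the model id.
  azureOpenAIApiDeploymentName: "gpt-35-turbo",
  azureOpenAIApiVersion: "2023-05-15",
});

const res = await model.call([new HumanMessage("Say hello")]);
console.log(res.content);
```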
Expected behavior
A single, clear reply from the LLM in the chat box.
Screenshots
The Azure model is configured as follows (the API instance name is a dummy here):
The simple chat scenario with the error in the result looks as follows:
Flow
If applicable, add the exported flow to help replicate the problem.
Setup
- Installation: Docker container with /bin/sh -c flowise start
- Flowise Version: 1.3.4 (docker image flowiseai/flowise:1.3.4)
- OS: Linux
- Browser: Chrome
Which model did you deploy on Azure? Older, deprecated models might not work. Here are the docs: https://docs.flowiseai.com/chat-models/azure-chatopenai
I tried deployments of both 'gpt-35-turbo' and 'gpt-4'.
Still doesn't work.
Does anyone have a working setup for using Azure OpenAI in Flowise?