[BUG] Using Azure OpenAI LLM produces error in result parsing

Open sebbae opened this issue 2 years ago • 5 comments

Describe the bug Using the Azure OpenAI LLM in simple chat scenarios produces a code-level error, which surfaces in the chat as TypeError: Cannot read properties of undefined (reading '0').

At least while the chat box is still open, the actual result is displayed afterwards as a (separate?) message. Note that this response often looks rather odd as well, but that may be unrelated to this error (see screenshots).

After closing and reopening the chat window, only the error message is visible.

TypeError: Cannot read properties of undefined (reading '0')
    at NoOpOutputParser.parseResult (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:24:38)
    at NoOpOutputParser.parseResultWithPrompt (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:7:21)
    at LLMChain._getFinalOutput (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:93:55)
    at LLMChain._call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:123:42)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async LLMChain.call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/base.cjs:98:28)
    at async runPrediction (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:128:29)
    at async LLMChain_Chains.run (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:77:21)
    at async App.processPrediction (/usr/local/lib/node_modules/flowise/dist/index.js:783:19)
    at async /usr/local/lib/node_modules/flowise/dist/index.js:558:13
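
For context, here is a minimal sketch of the failing access pattern, simplified for illustration (the Generation type and the guard-free indexing below are assumptions, not the exact LangChain source):

    // Illustrative sketch: the output parser indexes the first generation
    // without checking that the model call actually produced one.
    type Generation = { text: string };

    function parseResult(generations?: Generation[]): string {
      // If the Azure call yielded no generations, `generations` is
      // undefined here and indexing it throws the TypeError above.
      return generations![0].text;
    }

    parseResult(undefined); // TypeError: Cannot read properties of undefined (reading '0')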

To Reproduce Steps to reproduce the behavior:

  1. Create a new chat flow from template "Simple LLM Chain"
  2. Replace OpenAI LLM with Azure OpenAI
  3. Configure access credentials for the LLM as per the docs (a standalone credential check is sketched after this list)
  4. Save and deploy
  5. See error
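
For step 3, it can help to first confirm the Azure credentials outside Flowise by calling the same LangChain wrapper directly. A minimal sketch, assuming langchain for Node.js is installed, the script runs as an ES module (top-level await), and the instance/deployment placeholders are replaced with real values:

    import { ChatOpenAI } from "langchain/chat_models/openai";
    import { HumanMessage } from "langchain/schema";

    // Placeholder values; all four Azure fields must match the deployment.
    const model = new ChatOpenAI({
      azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
      azureOpenAIApiInstanceName: "my-instance",    // https://my-instance.openai.azure.com
      azureOpenAIApiDeploymentName: "gpt-35-turbo", // deployment name, not model name
      azureOpenAIApiVersion: "2023-05-15",
    });

    const res = await model.call([new HumanMessage("Say hello")]);
    console.log(res.content); // a plain reply here means the credentials are fine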

Expected behavior

A single, clear reply from the LLM in the chat box.

Screenshots

The Azure model is configured as follows (the API instance name is a dummy here):

Screenshot of API access credentials

The simple chat scenario with the error in the result looks as follows:

Screenshot of weird response

Flow If applicable, add the exported flow to help replicate the problem.

Setup

  • Installation: Docker container with /bin/sh -c flowise start
  • Flowise Version 1.3.4 (docker image flowiseai/flowise:1.3.4)
  • OS: Linux
  • Browser: Chrome

sebbae · Sep 18 '23 08:09

Which model did you deploy on Azure? Older, deprecated models might not work. Here are the docs: https://docs.flowiseai.com/chat-models/azure-chatopenai
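
A quick way to rule out a misnamed or deprecated deployment is to call the Azure endpoint directly, bypassing Flowise and LangChain entirely. A sketch using Node 18+ fetch in an ES module; instance, deployment, and API version below are placeholders:

    // Placeholders: replace instance, deployment, and apiVersion with real values.
    const instance = "my-instance";
    const deployment = "gpt-35-turbo";
    const apiVersion = "2023-05-15";

    const res = await fetch(
      `https://${instance}.openai.azure.com/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`,
      {
        method: "POST",
        headers: {
          "api-key": process.env.AZURE_OPENAI_API_KEY ?? "",
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ messages: [{ role: "user", content: "hello" }] }),
      }
    );
    // 200 with a choices array means the deployment works; 404 usually means a
    // wrong deployment name or API version.
    console.log(res.status, JSON.stringify(await res.json()));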

HenryHengZJ · Sep 22 '23 15:09

I tried deployments of both 'gpt-35-turbo' and 'gpt-4'.

sebbae · Sep 25 '23 12:09

Still doesn't work.

emil2intra · Nov 01 '23 02:11

Screenshot attached

emil2intra · Nov 01 '23 03:11

Screenshot 2024-05-12 172600

Does anyone have a working setup to use Azure OpenAI in Flowise?

i-LUDUS · May 12 '24 15:05