The variable cannot be selected in the configuration box.
Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.4
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
When configuring the workflow, I want to use variables in the LLM module, but I am unable to select variables in the configuration box.
https://github.com/langgenius/dify/assets/146314582/791adbdb-3b89-4eef-9821-67616ed8e57f
✔️ Expected Behavior
No response
❌ Actual Behavior
No response
It appears you're encountering difficulties with selecting variables in the configuration box for the LLM module in Dify version 0.6.4, specifically within a self-hosted Docker environment. This issue might stem from the process of manually configuring the LLM node, rather than a limitation or bug within the system itself.
To address this, ensure you've properly set up the model configuration under "System Settings—Model Provider" as outlined in the Dify documentation. When composing prompts within an LLM node, you should manually insert variables by using the variable insertion menu, which can be accessed by typing "/" or "{". This step is crucial for inserting variables from preceding flow nodes into the prompts as context content, enabling the desired workflow configuration.
For detailed guidance on setting up an LLM node, including the insertion of variables into prompts, refer to the Dify documentation and the example configurations found in the Dify repository.
This approach should help you overcome the issue with variable selection in the configuration box. If you continue to face difficulties or have further questions, please don't hesitate to reach out for more assistance.
Sources
- en/features/workflow/node/llm.md
- en/features/workflow/nodes/llm.md
- api/tests/integration_tests/workflow/nodes/test_llm.py
- api/core/model_runtime/model_providers/openai_api_compatible/llm/llm.py
- api/core/model_runtime/model_providers/openai/llm/llm.py
- web/app/components/app/configuration/config-var/select-var-type.tsx
@wurenxiaozu I can't reproduce this problem. Can you provide more details on how to reproduce it, and share the app DSL?
app:
  description: ''
  icon: "\U0001F916"
  icon_background: '#FFEAD5'
  mode: workflow
  name: "\u5546\u4E1A\u7B56\u7565AI\u63A8\u8350"
workflow:
  features:
    file_upload:
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - remote_url
        - local_file
    opening_statement: ''
    retriever_resource:
      enabled: false
    sensitive_word_avoidance:
      configs: []
      enabled: false
      type: ''
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        sourceType: start
        targetType: llm
      id: start-1713368021906
      source: start
      sourceHandle: source
      target: '1713368021906'
      targetHandle: target
      type: custom
    - data:
        sourceType: llm
        targetType: end
      id: 1713368021906-end
      source: '1713368021906'
      sourceHandle: source
      target: end
      targetHandle: target
      type: custom
    nodes:
    - data:
        selected: false
        title: START
        type: start
        variables:
        - default: ''
          description: null
          hint: null
          label: "\u76EE\u6807\u540D\u79F0"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: mbmc
        - default: ''
          description: null
          hint: null
          label: "\u5F00\u59CB\u65F6\u95F4"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: kssj
        - default: ''
          description: null
          hint: null
          label: "\u5B8C\u6210\u65E5\u671F"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: wcsj
        - default: ''
          description: null
          hint: null
          label: "\u8861\u91CF\u6307\u6807"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: hlzb
        - default: ''
          description: null
          hint: null
          label: "\u6307\u6807\u76EE\u6807\u503C"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: zbmbz
        - default: ''
          description: null
          hint: null
          label: "\u6307\u6807\u4E0E\u76EE\u6807\u503C\u5173\u7CFB"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: zbgx
        - default: ''
          description: null
          hint: null
          label: "\u76EE\u6807\u503C\u5355\u4F4D"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: mbzdw
        - default: ''
          description: null
          hint: null
          label: "\u8D23\u4EFB\u90E8\u95E8"
          max_length: 48
          options: null
          required: true
          type: text-input
          variable: zrbm
        - default: ''
          description: null
          hint: null
          label: "\u4E1A\u6001"
          max_length: null
          options: null
          required: true
          type: text-input
          variable: yt
      height: 298
      id: start
      position:
        x: 80
        y: 392
      positionAbsolute:
        x: 80
        y: 392
      selected: false
      type: custom
      width: 244
    - data:
        outputs:
        - value_selector:
          - llm
          - text
          variable: result
        selected: false
        title: END
        type: end
      height: 90
      id: end
      position:
        x: 794.0000000000002
        y: 451.42857142857144
      positionAbsolute:
        x: 794.0000000000002
        y: 451.42857142857144
      selected: false
      type: custom
      width: 244
    - data:
        context:
          enabled: false
          variable_selector: []
        desc: ''
        model:
          completion_params:
            frequency_penalty: 0
            max_tokens: 1024
            presence_penalty: 0
            temperature: 0.7
            top_p: 1
          mode: chat
          name: ernie-3.5-4k-0205
          provider: wenxin
        prompt_template:
        - role: system
          text: "\u65E0\u6CD5\u9009\u62E9\u53D8\u91CF"
        selected: false
        title: LLM 2
        type: llm
        variables: []
        vision:
          enabled: false
      height: 98
      id: '1713368021906'
      position:
        x: 369.71428571428567
        y: 456.2857142857143
      positionAbsolute:
        x: 369.71428571428567
        y: 456.2857142857143
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: -1.1368683772161603e-13
      y: -5.684341886080802e-14
      zoom: 0.7
Chrome?
@wurenxiaozu You should attach the exported file. Since you didn't use a code block to format the content, it loses its original indentation.
I sometimes face the same problem.
Okay, I looked into this issue. The problem is that touchpad clicks on a laptop are sometimes not recognized, while mouse clicks always are, which is why the issue does not occur when using a mouse. It could potentially be resolved by handling the onTouchStart, onTouchEnd, or onChange events in React. The code that can be improved is shown below, followed by a rough sketch of the change I would propose.
plugins/component-picker-block/index.tsx
<PromptMenu
  startIndex={0}
  selectedIndex={selectedIndex}
  options={promptOptions}
  onClick={(index, option) => {
    if (option.disabled)
      return
    setHighlightedIndex(index)
    selectOptionAndCleanUp(option)
  }}
  onMouseEnter={(index, option) => {
    if (option.disabled)
      return
    setHighlightedIndex(index)
  }}
/>
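To make the idea concrete, here is a rough sketch of the kind of change I have in mind. This is not the actual PromptMenu implementation; the MenuItem component, the Option type, and the onSelect/onHighlight props below are made up for illustration. It only shows how a menu item could select an option on touchend in addition to click:

```tsx
// Illustrative sketch only, not Dify's actual component code.
import React, { useCallback } from 'react'

type Option = {
  key: string
  label: string
  disabled?: boolean
}

type MenuItemProps = {
  index: number
  option: Option
  onSelect: (index: number, option: Option) => void
  onHighlight: (index: number) => void
}

const MenuItem: React.FC<MenuItemProps> = ({ index, option, onSelect, onHighlight }) => {
  // Shared selection logic for both the mouse and the touch path.
  const handleSelect = useCallback(() => {
    if (option.disabled)
      return
    onHighlight(index)
    onSelect(index, option)
  }, [index, option, onSelect, onHighlight])

  return (
    <div
      role='option'
      aria-disabled={option.disabled}
      // Mouse path: same behaviour as today.
      onClick={handleSelect}
      onMouseEnter={() => {
        if (!option.disabled)
          onHighlight(index)
      }}
      // Touch path: select on touchend so taps that never produce a
      // click event still work.
      onTouchEnd={(e) => {
        e.preventDefault()
        handleSelect()
      }}
    >
      {option.label}
    </div>
  )
}

export default MenuItem
```

Calling preventDefault() in the touchend handler keeps the browser from also firing a synthetic click afterwards, so the selection logic does not run twice.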
If this needs to be solved, let me know @crazywoola and I will create a PR with the changes.
@wurenxiaozu I encountered the same issue myself. I worked around it by quickly double-clicking the mouse, which worked reliably. I'm not sure what causes this, but if you try my method, it might let you select the variable.
> The problem is that touchpad clicks on a laptop are sometimes not recognized, while mouse clicks always are [...]
Yes, that is exactly the phenomenon: using a mouse is fine, but using the laptop's touchpad is a problem. I'm currently using a mouse and everything is normal. Previously, while on a business trip, I had to use the laptop's touchpad.
@chiragksharma Thanks for the feedback and the potential solution. We would welcome a PR as well :)