Chatflow: Cannot save LLM module prompt with Completion model (Phi3 / Docker v0.6.8)
Self Checks
- [X] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.8
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Create a chatbot with Knowledge Retrieval and use the Phi3 [completion] model for the LLM node. Even after typing in the Prompt field, it keeps showing 0 characters, and the prompt is not saved when that module is closed and reopened.
✔️ Expected Behavior
I expected the Prompt field to show the character count and the prompt to be saved when closing the module.
❌ Actual Behavior
The prompt is not saved, and I believe the chatflow fails because of that.
To resolve the issue where the Prompt field does not save and shows 0 characters when using the Phi3 [completion] model in Dify version 0.6.8, self-hosted on Docker, you need to ensure that `completionPromptConfig` is correctly set and updated. Here are the steps to troubleshoot and fix the issue:
- **Ensure `completionPromptConfig` is updated correctly:** The `completionPromptConfig` must be updated with the new prompt text. Verify that the `setCurrentAdvancedPrompt` function sets the `prompt` field correctly:

  ```typescript
  const setCurrentAdvancedPrompt = (prompt: PromptItem | PromptItem[], isUserChanged?: boolean) => {
    if (!isAdvancedPrompt)
      return;

    if (modelModeType === ModelModeType.chat) {
      setChatPromptConfig({
        ...chatPromptConfig,
        prompt: prompt as PromptItem[],
      });
    }
    else {
      setCompletionPromptConfig({
        ...completionPromptConfig,
        prompt: prompt as PromptItem,
      });
    }

    if (isUserChanged)
      onUserChangedPrompt();
  };
  ```
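  To confirm which branch actually runs when you type into the Prompt field, it can help to trace the logic with a representative payload. The sketch below is a minimal, self-contained mirror of the branching above — the `PromptItem` shape and `ModelModeType` values are taken from the snippet, while `applyPrompt` and the sample text are illustrative stand-ins, not Dify code:

  ```typescript
  // Standalone mirror of the branching in setCurrentAdvancedPrompt, for reasoning
  // about the bug; not Dify's actual module.
  enum ModelModeType { chat = 'chat', completion = 'completion' }
  interface PromptItem { role?: string; text: string }

  let chatPromptConfig: { prompt?: PromptItem[] } = {};
  let completionPromptConfig: { prompt?: PromptItem } = {};

  function applyPrompt(modelModeType: ModelModeType, prompt: PromptItem | PromptItem[]) {
    if (modelModeType === ModelModeType.chat)
      chatPromptConfig = { ...chatPromptConfig, prompt: prompt as PromptItem[] };
    else
      completionPromptConfig = { ...completionPromptConfig, prompt: prompt as PromptItem };
  }

  // A completion model must route through the else-branch; if the UI still shows
  // 0 characters, the corresponding state update is likely never reached.
  applyPrompt(ModelModeType.completion, { text: 'Answer using the retrieved context.' });
  console.log(completionPromptConfig.prompt?.text.length); // expect > 0
  ```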
- **Check the `migrateToDefaultPrompt` function:** Ensure that the `migrateToDefaultPrompt` function correctly replaces the prompt text. This function is responsible for migrating and setting the default prompt configuration:

  ```typescript
  const migrateToDefaultPrompt = async (isMigrateToCompetition?: boolean, toModelModeType?: ModelModeType) => {
    const mode = modelModeType;
    const toReplacePrePrompt = prePrompt || '';

    if (!isAdvancedPrompt) {
      const { chat_prompt_config, completion_prompt_config, stop } = await fetchPromptTemplate({
        appMode,
        mode,
        modelName,
        hasSetDataSet,
      });
      if (modelModeType === ModelModeType.chat) {
        const newPromptConfig = produce(chat_prompt_config, (draft) => {
          draft.prompt = draft.prompt.map((p) => {
            return {
              ...p,
              text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
            };
          });
        });
        setChatPromptConfig(newPromptConfig);
      }
      else {
        const newPromptConfig = produce(completion_prompt_config, (draft) => {
          draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
        });
        setCompletionPromptConfig(newPromptConfig);
        setCompletionParams({
          ...completionParams,
          stop,
        });
      }
      return;
    }

    if (isMigrateToCompetition) {
      const { completion_prompt_config, chat_prompt_config, stop } = await fetchPromptTemplate({
        appMode,
        mode: toModelModeType as ModelModeType,
        modelName,
        hasSetDataSet,
      });
      if (toModelModeType === ModelModeType.completion) {
        const newPromptConfig = produce(completion_prompt_config, (draft) => {
          if (!completionPromptConfig.prompt?.text)
            draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
          else
            draft.prompt.text = completionPromptConfig.prompt?.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);

          if (['advanced-chat', 'agent-chat', 'chat'].includes(appMode)
            && completionPromptConfig.conversation_histories_role.assistant_prefix
            && completionPromptConfig.conversation_histories_role.user_prefix)
            draft.conversation_histories_role = completionPromptConfig.conversation_histories_role;
        });
        setCompletionPromptConfig(newPromptConfig);

        if (!completionParams.stop || completionParams.stop.length === 0) {
          setCompletionParams({
            ...completionParams,
            stop,
          });
        }
        setStop(stop); // switch mode's params is async. It may override the stop value.
      }
      else {
        const newPromptConfig = produce(chat_prompt_config, (draft) => {
          draft.prompt = draft.prompt.map((p) => {
            return {
              ...p,
              text: p.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt),
            };
          });
        });
        setChatPromptConfig(newPromptConfig);
      }
    }
  };
  ```
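  For reference, the placeholder substitution above relies on immer's `produce` to rewrite the fetched template immutably. Here is a minimal sketch of that pattern in isolation — the placeholder string and config shape are illustrative assumptions (the real constant lives in `web/app/components/base/prompt-editor/constants.tsx`), not Dify's exact values:

  ```typescript
  import { produce } from 'immer';

  // Illustrative placeholder and template shape, not Dify's actual values.
  const PRE_PROMPT_PLACEHOLDER_TEXT = '{{#pre_prompt#}}';

  interface CompletionPromptConfig {
    prompt: { text: string };
  }

  const template: CompletionPromptConfig = {
    prompt: { text: `${PRE_PROMPT_PLACEHOLDER_TEXT}\n\nHuman: {{#query#}}\nAssistant:` },
  };

  // Same pattern as the completion branch above: swap the placeholder for the
  // user's pre-prompt without mutating the fetched template object.
  const toReplacePrePrompt = 'You are a helpful assistant.';
  const newPromptConfig = produce(template, (draft) => {
    draft.prompt.text = draft.prompt.text.replace(PRE_PROMPT_PLACEHOLDER_TEXT, toReplacePrePrompt);
  });

  console.log(newPromptConfig.prompt.text); // placeholder replaced
  console.log(template.prompt.text);        // original template untouched
  ```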
- **Check for error handling mechanisms:** Ensure that no validation errors are preventing the prompt from being saved. The Dify codebase includes several error handling mechanisms, such as key validation, duplicate key checks, and prompt content validation. Make sure these checks are not causing the issue:

  ```typescript
  if (promptEmpty) {
    notify({ type: 'error', message: t('appDebug.otherError.promptNoBeEmpty'), duration: 3000 });
    return;
  }
  ```
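  As a debugging aid, it helps to see why this guard would always fire here. The sketch below is a hypothetical reconstruction — `isPromptEmpty` and its parameters are my own names, not Dify's actual code — showing that if saving never populates `completionPromptConfig.prompt`, a completion-mode prompt always reads as empty:

  ```typescript
  // Hypothetical mode-aware emptiness check; the field shapes mirror the snippets above.
  function isPromptEmpty(
    modelModeType: 'chat' | 'completion',
    chatPrompt: { text: string }[],
    completionPrompt: { text?: string } | undefined,
  ): boolean {
    if (modelModeType === 'chat')
      return chatPrompt.every(p => !p.text.trim());
    // If the save path never updated completionPromptConfig.prompt, this is
    // always true and the "prompt must not be empty" error blocks saving.
    return !completionPrompt?.text?.trim();
  }

  console.log(isPromptEmpty('completion', [], undefined)); // true — save would be rejected
  ```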
By following these steps and ensuring that `completionPromptConfig` is correctly set and updated, you should be able to resolve the issue where the Prompt field does not save and shows 0 characters when using the Phi3 [completion] model in Dify version 0.6.8.
Sources
- web/app/components/app/configuration/hooks/use-advanced-prompt-config.ts
- web/types/app.ts
- web/models/debug.ts
- web/app/components/app/configuration/index.tsx
- api/services/advanced_prompt_template_service.py
- en/features/prompt-engineering/prompt-template.md
- en/features/prompt-engineering/README.md
- web/app/components/base/prompt-editor/constants.tsx
- api/core/prompt/utils/prompt_template_parser.py
- web/app/components/app/configuration/config-var/index.tsx
- web/app/components/app/configuration/config-prompt/simple-prompt-input.tsx
- web/app/components/app/configuration/config/agent/prompt-editor.tsx
- docker/docker-compose.middleware.yaml
- api/config.py
- docker/docker-compose.yaml
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
To continue the conversation, mention @dosu.
I was able to solve the problem by switching the model to a Chat model and switching it back to a Completion model, but I am not sure that is the ideal solution.