Variable Parsing in LLM Prompts
Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.14.2
Cloud or Self Hosted
Cloud, Self Hosted (Docker), Self Hosted (Source)
Steps to reproduce
When assigning a conversation variable {{#conversation.var1#}} into another variable,
using that variable in any LLM node prompt treats the inner variable reference as a plain string instead of parsing it into its value.
✔️ Expected Behavior
When a variable is used inside a string that is stored in another variable or passed within the workflow, it must be parsed as a variable, not as a string.
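To make the expected behavior concrete, here is a minimal sketch (plain Python, not Dify's actual implementation; the function name and the simplified `{{...}}` syntax are assumptions) of recursive resolution, where references nested inside another variable's value are also expanded:

```python
import re

# Hypothetical sketch of the behavior the reporter expects (NOT Dify code):
# keep substituting until no {{...}} references remain, so a reference
# nested inside another variable's value is also resolved.
def render_recursive(template: str, variables: dict, max_depth: int = 10) -> str:
    pattern = re.compile(r"\{\{(.*?)\}\}")
    for _ in range(max_depth):  # depth cap guards against circular references
        expanded = pattern.sub(
            lambda m: variables.get(m.group(1), m.group(0)), template
        )
        if expanded == template:  # nothing left to substitute
            return expanded
        template = expanded
    return template

variables = {
    "conversation.key": "42",
    # var_ONE's value itself contains a variable reference
    "var_ONE": "here is the value {{conversation.key}}",
}
print(render_recursive("{{var_ONE}}", variables))
# here is the value 42
```

Under this expectation, the nested {{conversation.key}} inside var_ONE is resolved on a second pass rather than emitted literally.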
❌ Actual Behavior
Variables in text such as {{#conversation.testVar#}} or /Context are treated as plain strings in LLM node prompts.
To ensure that variables used inside a string in LLM prompts are parsed as variables and not as strings when using Dify version 0.14.2, you should use the '{{}}' format for variable interpolation. For example, a variable like '/name' should be written as '{{name}}' to be properly recognized and processed as a variable reference [1].
Additionally, Dify supports the use of the Jinja-2 template language for more complex prompts, which allows for lightweight data transformation and logical processing. You can also call out the variable insertion menu by typing / or { to insert special variable blocks or upstream node variables into the prompt as context content [2].
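The interpolation format described above can be illustrated with a small sketch (plain Python, not Dify internals; the function and the simplified single-pass `{{name}}` matching are assumptions): '{{name}}' is replaced with the variable's value, while plain text like '/name' is left untouched.

```python
import re

# Hypothetical illustration (NOT Dify's real code) of '{{}}' interpolation:
# only '{{name}}'-style references are substituted; other text passes through.
def interpolate(prompt: str, variables: dict) -> str:
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(variables.get(m.group(1), m.group(0))),
        prompt,
    )

print(interpolate("Hello {{name}}, your id is /name", {"name": "Alice"}))
# Hello Alice, your id is /name
```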
I mean I have a variable named var_ONE: "here is the value {{conversation.key}}".
Then, in an LLM node, I use the variable var_ONE as a prompt.
This does not parse the variable {{conversation.key}} and get its value; instead, it is just written out as a string.
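A single-pass substitution would explain this behavior. The sketch below (plain Python, not Dify's actual code; names and syntax are assumptions) substitutes variable values once and never re-scans the replacement text, so a reference nested inside a value is emitted literally, matching the report:

```python
import re

# Hypothetical sketch (NOT Dify's real code): single-pass substitution.
# Values are substituted once, and references nested inside a value
# are NOT re-parsed -- they survive as literal text.
def render_once(template: str, variables: dict) -> str:
    pattern = re.compile(r"\{\{(.*?)\}\}")
    return pattern.sub(lambda m: variables.get(m.group(1), m.group(0)), template)

variables = {
    "conversation.key": "ACTUAL VALUE",
    # var_ONE's value itself contains a variable reference
    "var_ONE": "here is the value {{conversation.key}}",
}

prompt = render_once("{{var_ONE}}", variables)
print(prompt)
# here is the value {{conversation.key}}  <- nested reference left as a string
```

This reproduces the reported symptom: the LLM node receives the literal text "{{conversation.key}}" instead of its value.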
Hi, @Namec999. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.
Issue Summary:
- You reported a bug where variables in LLM prompts are not parsed correctly and are treated as strings.
- I suggested using '{{}}' for variable interpolation and mentioned Jinja-2 template support.
- You clarified with an example that a variable containing another variable reference is not parsed as expected.
Next Steps:
- Please confirm if this issue is still relevant to the latest version of the Dify repository by commenting here.
- If there is no further activity, this issue will be automatically closed in 15 days.
Thank you for your understanding and contribution!
I encountered the same problem in version 1.0.0-beta.1.
Hi, @Namec999. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.
Issue Summary
- Reported bug in Dify version 0.14.2: variables in LLM prompts are treated as strings.
- Suggested workaround: use '{{}}' for variable interpolation and Jinja-2 templates for complex prompts.
- You clarified that nested variable references are not parsed correctly.
- Ena-0606 confirmed the issue persists in version 1.0.0-beta.1.
Next Steps
- Please confirm if this issue is still relevant to the latest version of Dify. Comment to keep the discussion open.
- If no further activity occurs, this issue will be automatically closed in 15 days.
Thank you for your understanding and contribution!