`structured outputs` not working for `4o-mini`
Self Checks
- [x] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please submit issues in English, otherwise they will be closed. Thanks! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
1.0.0
Cloud or Self Hosted
Cloud
Steps to reproduce
- Create an LLM block
- Set model to `4o-mini`
- Set Response Format to `json_object`
- Fill out the JSON Schema field
- Toggle on both Response Format and JSON Schema
- Run
Note: It doesn't matter if you select json_object or json_schema, the error will be the same.
✔️ Expected Behavior
A JSON structured output.
❌ Actual Behavior
Run failed:

```
[openai] Error: PluginInvokeError: {"args":{"description":"[models] Error: Completions.create() got an unexpected keyword argument 'json_schema'"},"error_type":"InvokeError","message":"[models] Error: Completions.create() got an unexpected keyword argument 'json_schema'"}
```
The issue you're encountering with the 4o-mini model is due to its lack of support for JSON Schema functionality. According to the documentation, only gpt-4o-mini-2024-07-18 and later versions, as well as gpt-4o-2024-08-06 and later versions, support JSON Schema. The 4o-mini model likely predates these versions and does not recognize the json_schema argument, resulting in the PluginInvokeError you're seeing [1].
A similar closed issue suggested setting the response_format to json_schema instead of json_object as a solution, but this may not apply to your case if the model itself does not support JSON Schema [2].
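The error itself is a plain Python `TypeError`: the underlying `Completions.create()` call is being handed a top-level `json_schema` keyword it does not accept, whereas the OpenAI Chat Completions API expects the schema nested inside the `response_format` parameter. A minimal sketch of the difference, using a stub with a `create()`-like keyword-only signature instead of the real SDK (no network call):

```python
# Stub that mimics the keyword-only shape of Completions.create();
# stands in for the real SDK method so this runs offline.
def create(*, model, messages, response_format=None):
    return {"model": model, "response_format": response_format}

# Passing `json_schema` at the top level raises the reported error:
try:
    create(model="gpt-4o-mini", messages=[], json_schema={"name": "demo"})
except TypeError as e:
    print(e)  # create() got an unexpected keyword argument 'json_schema'

# The supported shape nests the schema inside `response_format`:
ok = create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "hi"}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "demo",
            "schema": {"type": "object", "properties": {}},
        },
    },
)
print(ok["response_format"]["type"])  # json_schema
```

This matches the traceback wording: it is the plugin building the call wrongly (or an SDK regression), not the model rejecting the request.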
Same here. The old version doesn't show the error, but it also doesn't produce the json_schema-style output, just streaming text. I thought something might have happened on OpenAI's side. Now half of my projects can't work; I'm trying to switch to Claude. ;;
--update--
I tried the request via curl with the json_schema method and got a response in the normal format:
"choices": [
{
"index": 0,
"message": {
"role": "assistant",
"content": ""
},
"finish_reason": "stop"
}
],
--update--
I tried the Python demo from the official Structured Outputs docs and got the right format. I can't locate the error right now.
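For anyone wanting to rule out their own request body when testing the API directly as above, the payload can be checked locally before sending. A minimal sketch (the model name and schema here are illustrative, not the reporter's actual payload):

```python
import json

# Illustrative request body for POST /v1/chat/completions with
# structured outputs; the schema itself is a made-up example.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Return a greeting."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "greeting",
            "strict": True,
            "schema": {
                "type": "object",
                "properties": {"text": {"type": "string"}},
                "required": ["text"],
                "additionalProperties": False,
            },
        },
    },
}

# Round-trips cleanly, so it can be pasted into `curl -d @-`:
body = json.dumps(payload, indent=2)
assert json.loads(body) == payload
```

If a body of this shape succeeds via curl but the same model fails through Dify, the problem is in the plugin layer rather than the API.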
Same issue here.
experiencing the same issue
Same here. It worked for the past months until today.
I got a similar issue using GPT-4o-2024-11-20.
For anyone who is interested in working on this, please submit a pull request to https://github.com/langgenius/dify-official-plugins.
I got an answer from the Dify support team via mail. In my case, upgrading the OpenAI plugin and reconfiguring the settings solved the issue.
Just tested this after upgrading the OpenAI plugin; it works again.
I tried upgrading the OpenAI plugin at 13:00 and it didn't work. I tried again after these comments with the same settings and got the correct result. IDK, it's OK now.
--- update ---
it's broken again ;;
Same with Azure OpenAI.
I have no idea why the `response_format` isn't present in the `model_parameters`.
Try changing the model id to `gpt-4o-2024-08-06`; it works then.
> try to change the model id gpt-4o-2024-08-06, it's ok.

We can't; that's not a solution for Azure OpenAI.
I installed the latest (=v0.0.10?) OpenAI plugin but still have the same issue... Using GPT-4o-2024-11-20.
@crazywoola This issue is related to the plugin sdk. Could you follow up on my PR? Thanks.
@medaka0213 I made a patch that overrides the superclass method. Feel free to test it if you like. https://github.com/ymshenyu/dify-official-plugins/actions/runs/13646714841
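As a hedged sketch of the override approach used in the patch above (the class and method names here are hypothetical stand-ins, not the actual plugin or SDK code), a subclass can intercept the unsupported kwarg and translate it into the supported nested form before delegating:

```python
# Hypothetical sketch: a subclass intercepts a `json_schema` kwarg
# that the base method does not accept and folds it into
# `response_format` before delegating. Names are illustrative only.

class BaseCompletions:
    def create(self, *, model, messages, response_format=None):
        # Stands in for the upstream method that rejects `json_schema`.
        return {"model": model, "response_format": response_format}

class PatchedCompletions(BaseCompletions):
    def create(self, *, json_schema=None, **kwargs):
        if json_schema is not None:
            # Translate the stray kwarg into the supported nested form.
            kwargs["response_format"] = {
                "type": "json_schema",
                "json_schema": json_schema,
            }
        return super().create(**kwargs)

out = PatchedCompletions().create(
    model="gpt-4o-mini",
    messages=[],
    json_schema={"name": "demo", "schema": {"type": "object"}},
)
print(out["response_format"]["type"])  # json_schema
```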
> @crazywoola This issue is related to the plugin sdk. Could you follow up on my PR? Thanks.

I have forwarded it already; I don't have review access for that repo. :(
This is a regression of #8451, and #9148 by @hjlarry should be re-applied.
I've opened a PR for this: https://github.com/langgenius/dify-plugin-sdks/pull/57. For Azure OpenAI users, I've also opened a PR which includes a workaround: https://github.com/langgenius/dify-official-plugins/pull/521
I got the same issue on Dify Cloud, and I can't use gpt-4o-2024-08-06. Does any solution work for me?