
How to integrate a model interface compatible with OpenAI specifications into a system model

Open · peterliang5678 opened this issue 10 months ago · 1 comment

Self Checks

  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Provide a description of requested docs changes

How to integrate a model interface compatible with OpenAI specifications into a system model

peterliang5678 · Mar 19 '25 13:03

To integrate a model interface compatible with OpenAI specifications into a system model, you can follow these steps:

  1. Create Module Structures by Model Type: Organize your code by creating sub-modules under the provider module based on model types, such as llm or text_embedding. This helps in maintaining and extending the codebase.

  2. Write Model Request Code: Create a Python file named after the model type (e.g., llm.py) inside the corresponding module, and implement a class that follows the system's model interface specifications. For instance, extend the OAIAPICompatLargeLanguageModel class to build a custom model such as StepfunLargeLanguageModel, overriding model invocation and credential validation (see the first sketch after this list).

  3. Add Predefined Model Configuration: If you are using predefined models, create a YAML file for each model (e.g., claude-3.5.yaml). These files follow the AIModelEntity specifications, describing the model's parameters and capabilities. For OpenAI, predefined models such as gpt-3.5-turbo-0125 and gpt-4o-2024-05-13 can share unified provider credentials (a sample YAML follows this list).

  4. Test Plugin: Ensure the new provider functionality works as expected by writing unit and integration tests.

  5. Integrate with LocalAI for Local Deployment: For local deployment, consider LocalAI, which is compatible with the OpenAI API specification. Set up LocalAI, download compatible models, start it via Docker or another method, and point your system at the resulting endpoint.

  6. Add a New Model Provider: To add a new provider, create a YAML file in the provider directory describing the provider's information and parameter configuration, then implement a Python class that connects to the provider's API and implements the core functionality.

  7. Use Dify for OpenAI SDK Compatibility: Dify's API will be compatible with the OpenAI SDK in version 1.0.0 through plugins such as oaicompat_dify_model and oaicompat_dify_app, allowing integration anywhere the OpenAI SDK is used, without separate development for each integration point (see the SDK example after this list) [1][2][3][4].
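
For step 2, here is a minimal sketch of what llm.py could look like, assuming the pre-1.0 in-tree layout where the base class lives at core.model_runtime.model_providers.openai_api_compatible.llm.llm (the import path differs under the 1.0 plugin SDK). The provider name and endpoint URL are placeholders:

```python
from collections.abc import Generator
from typing import Optional, Union

# Assumption: pre-1.0 in-tree import paths; adjust for your Dify version.
from core.model_runtime.entities.llm_entities import LLMResult
from core.model_runtime.entities.message_entities import PromptMessage, PromptMessageTool
from core.model_runtime.model_providers.openai_api_compatible.llm.llm import (
    OAIAPICompatLargeLanguageModel,
)


class MyProviderLargeLanguageModel(OAIAPICompatLargeLanguageModel):
    """Reuses the OpenAI-compatible base class; only the endpoint differs."""

    def _invoke(
        self,
        model: str,
        credentials: dict,
        prompt_messages: list[PromptMessage],
        model_parameters: dict,
        tools: Optional[list[PromptMessageTool]] = None,
        stop: Optional[list[str]] = None,
        stream: bool = True,
        user: Optional[str] = None,
    ) -> Union[LLMResult, Generator]:
        # Inject the provider-specific endpoint before delegating upstream.
        self._add_custom_parameters(credentials)
        return super()._invoke(
            model, credentials, prompt_messages, model_parameters,
            tools, stop, stream, user,
        )

    def validate_credentials(self, model: str, credentials: dict) -> None:
        self._add_custom_parameters(credentials)
        super().validate_credentials(model, credentials)

    @staticmethod
    def _add_custom_parameters(credentials: dict) -> None:
        credentials["mode"] = "chat"
        # Placeholder: point this at your provider's OpenAI-compatible /v1 base URL.
        credentials["endpoint_url"] = "https://api.example.com/v1"
```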
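
For step 3, a hedged sample of a predefined-model YAML. The field names mirror the AIModelEntity schema as used in existing Dify model files; exact fields vary between versions, so validate against the schema shipped with your release. The model name, context size, and pricing here are invented placeholders:

```yaml
# Hypothetical predefined model file: my-model-v1.yaml
model: my-model-v1
label:
  en_US: my-model-v1
model_type: llm
features:
  - agent-thought
model_properties:
  mode: chat
  context_size: 16384
parameter_rules:
  - name: temperature
    use_template: temperature
  - name: max_tokens
    use_template: max_tokens
    default: 512
    min: 1
    max: 4096
pricing:
  input: '0.001'
  output: '0.002'
  unit: '0.001'
  currency: USD
```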
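
Finally, for step 7: once an OpenAI-compatible route is exposed (e.g., by the oaicompat_dify_app plugin), any OpenAI SDK client can talk to it by overriding base_url. The URL, API key, and model name below are placeholders; the real values come from your Dify deployment and the plugin's configuration:

```python
from openai import OpenAI

# Placeholders: the real base_url and API key come from your Dify
# deployment and the oaicompat plugin's configuration.
client = OpenAI(
    base_url="https://your-dify-host/v1",
    api_key="your-dify-api-key",
)

response = client.chat.completions.create(
    model="your-app-or-model-name",  # name mapping is defined by the plugin
    messages=[{"role": "user", "content": "Hello via the OpenAI SDK"}],
)
print(response.choices[0].message.content)
```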

These steps should help you integrate a model interface compatible with OpenAI specifications into your system model effectively.

To continue talking to Dosu, mention @dosu.



dosubot[bot] · Mar 19 '25 13:03

  • https://marketplace.dify.ai/plugins/langgenius/oaicompat_dify_model
  • https://marketplace.dify.ai/plugins/langgenius/openai_api_compatible

Try the above.

crazywoola · Mar 19 '25 16:03