Could not integrate LiteLLM into Dify via OpenAI-API-Compatible
Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.8
Cloud or Self Hosted
Cloud
Steps to reproduce
I use OpenAI-API-Compatible to add my local LiteLLM instance with Base_URL = http://localhost:4000, the LiteLLM API key, and the model name (masked, proxied by LiteLLM). LiteLLM itself works fine when tested with curl and Python, but I cannot add it to Dify.
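For reference, a direct test of the proxy from the host typically looks like the following (the API key and model name are placeholders for the masked values, and the `/chat/completions` path matches the one in the error below):

```shell
# Placeholder credentials; substitute your real LiteLLM key and model alias
export LITELLM_API_KEY="sk-..."

curl http://localhost:4000/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LITELLM_API_KEY" \
  -d '{
        "model": "my-proxy-model",
        "messages": [{"role": "user", "content": "ping"}]
      }'
```

A successful response here only proves the proxy is reachable from the host, not from inside the Dify containers.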
✔️ Expected Behavior
I expect LiteLLM to work with Dify.
❌ Actual Behavior
The error: An error occurred during credentials validation: HTTPConnectionPool(host='localhost', port=4000): Max retries exceeded with url: /chat/completions (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f966023a4d0>: Failed to establish a new connection: [Errno 111] Connection refused'))
To resolve the issue where Dify cannot connect to LiteLLM running on localhost, follow these steps:
- **Change LiteLLM binding:** Ensure LiteLLM is bound to `0.0.0.0` instead of `localhost` to make it accessible over the network.
- **Update the Base URL in Dify:** If Dify is running in Docker and LiteLLM is on the host, use `http://host.docker.internal:4000` as the Base URL instead of `localhost`.
- **Test network accessibility:** From within the Dify Docker container, use `curl` to test connectivity to the LiteLLM service.
- **Restart both services:** After making these adjustments, restart LiteLLM and Dify.
This approach addresses the common issue where Docker containers cannot directly access services on the host using localhost. Adjusting the service binding and using host.docker.internal for Docker setups should help Dify connect to LiteLLM.
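The steps above can be sketched as shell commands. The container name and config file are assumptions; adjust them to your own compose setup:

```shell
# 1. Start the LiteLLM proxy bound to all interfaces instead of localhost
#    (assumes the `litellm` CLI from the litellm[proxy] package)
litellm --host 0.0.0.0 --port 4000 --config config.yaml

# 2. From inside the Dify API container, verify the proxy is reachable
#    ("docker-api-1" is a typical compose container name; yours may differ)
docker exec -it docker-api-1 \
  curl -i http://host.docker.internal:4000/health

# Note: on Linux, host.docker.internal is not defined by default; either
# add `--add-host=host.docker.internal:host-gateway` to the container
# (or the equivalent extra_hosts entry in docker-compose.yaml), or use
# the host's LAN IP in the Base URL instead.
```

If the `curl` from inside the container succeeds, set the Base URL in Dify to `http://host.docker.internal:4000` and retry the credentials validation.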
Details
Sources
- ollama baseURL error (connect timeout=10)
- An error occurred during credentials validation
- OpenAI-API-compatible: adding a third-party API failed!
- Cannot connect to xinference
- HTTPConnectionPool Max retries exceeded, Failed to establish a new connection: [Errno 111] Connection refused
- Cannot Add openai-api-compatible model
- en/tutorials/model-configuration/ollama.md
- api/tests/integration_tests/model_runtime/openai_api_compatible/test_llm.py
- api/core/model_runtime/model_providers/openai_api_compatible/openai_api_compatible.yaml
I can confirm that your suggestion works.