[Bug]: Crawl4AI always falls back to the OpenAI API despite using a local Ollama provider
crawl4ai version
0.6.3
Expected Behavior
```python
llm_config = LLMConfig(
    provider="ollama/llama3",
    base_url="http://localhost:11434",
)
llm_strat = LLMExtractionStrategy(
    llm_config=llm_config,
    ...
)
```
When Crawl4AI is configured with a local LLM provider (e.g., ollama/llama3 with base_url="http://localhost:11434"), requests should go to the local Ollama server and no OpenAI API key should be required. Instead, the system still attempts to use the OpenAI API key and throws authentication errors (401 invalid API key).
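For reference, here is a minimal end-to-end sketch of the intended setup, assuming the documented 0.6.x API (`AsyncWebCrawler`, `CrawlerRunConfig`, and `LLMExtractionStrategy(llm_config=...)`); the URL and instruction are placeholders:

```python
import asyncio

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig, LLMConfig
from crawl4ai.extraction_strategy import LLMExtractionStrategy

# Point the provider at the local Ollama server; no API key should be needed.
llm_config = LLMConfig(
    provider="ollama/llama3",
    base_url="http://localhost:11434",
    api_token=None,
)

llm_strat = LLMExtractionStrategy(
    llm_config=llm_config,
    instruction="Summarize the page in one paragraph.",  # placeholder
)

async def main():
    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://example.com",  # placeholder URL
            config=CrawlerRunConfig(extraction_strategy=llm_strat),
        )
        print(result.extracted_content)

asyncio.run(main())
```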
Current Behavior
```
21:32:03 - LiteLLM:DEBUG: main.py:5264 - openai.py: Received openai error - The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
21:32:03 - LiteLLM:DEBUG: utils.py:337 - RAW RESPONSE: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
```
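The traceback shows LiteLLM handing the request to its OpenAI client rather than the Ollama backend. One way to check whether the provider string itself resolves correctly is to call LiteLLM directly, bypassing Crawl4AI entirely (a sketch; assumes a local Ollama server with `llama3` pulled):

```python
from litellm import completion

# If this call succeeds, "ollama/llama3" routes correctly in LiteLLM, and the
# OpenAI fallback is happening in how Crawl4AI builds the request.
resp = completion(
    model="ollama/llama3",
    api_base="http://localhost:11434",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```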
Is this reproducible?
Yes
Inputs Causing the Bug
Steps to Reproduce
Code snippets
OS
Windows 11
Python version
3.11.11
Browser
No response
Browser version
No response
Error logs & Screenshots (if applicable)
No response
I get the same error when trying an Ollama model. crawl4ai is not working with any LLM provider other than OpenAI.
Exactly. We need an option to skip authentication for Ollama with a custom endpoint.
https://github.com/unclecode/crawl4ai/blob/b4bb0ccea075e58d732380cfbcf07c22703c7153/deploy/docker/api.py#L94
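Until a skip-auth option exists, one possible workaround (untested here) is to supply a dummy token so the key check passes; Ollama does not validate it. A sketch, not an official fix; the value "no-token" is arbitrary:

```python
from crawl4ai import LLMConfig

llm_config = LLMConfig(
    provider="ollama/llama3",
    base_url="http://localhost:11434",
    api_token="no-token",  # arbitrary placeholder; Ollama does not validate it
)
```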
I have the same issue: it defaults to checking for OpenAI keys, and it even happens with Groq.
> I have the same issue: it defaults to checking for OpenAI keys, and it even happens with Groq.

Try setting `ollama` as the API key, with http://localhost:11434/v1 as the endpoint:
https://ollama.com/blog/openai-compatibility
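Concretely, that suggestion routes Crawl4AI through Ollama's OpenAI-compatible /v1 endpoint. A sketch of what the config could look like; the provider string `openai/llama3` and the dummy key are assumptions based on the linked post, not a documented Crawl4AI recipe:

```python
from crawl4ai import LLMConfig

# Treat Ollama as an OpenAI-compatible server, per the linked blog post.
llm_config = LLMConfig(
    provider="openai/llama3",              # OpenAI-style route to an Ollama model
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_token="ollama",                    # dummy key; Ollama ignores its value
)
```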
Have you found a solution to this issue?
Hello @viphoangdep @imad07mos @simonnxren, please upgrade to our latest release, v0.7.7, test it, and let us know.
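For anyone following along, the upgrade itself is one command (assuming a pip-based install):

```bash
pip install -U "crawl4ai>=0.7.7"
```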