Documentation unclear for external LiteLLM proxy with Master Key – API key not sent by Bytebot Agent
Description:
When using Bytebot with an externally hosted LiteLLM proxy, the documentation is unclear about how authentication is handled.
I followed the docs and set `LITELLM_MASTER_KEY` in the Bytebot agent container. However, when connecting to my external LiteLLM instance, the proxy logs show:
```
LiteLLM Proxy:ERROR: auth_exception_handler.py:79 - litellm.proxy.proxy_server.user_api_key_auth(): Exception occured - No api key passed in.
```
Even though I passed `LITELLM_MASTER_KEY` into the Bytebot agent, the key is not actually sent in the `Authorization` header to the proxy. This results in the proxy rejecting all requests with `No api key passed in`.
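To confirm the failure is on the sending side, you can call the proxy directly with the master key; if this succeeds while the agent's requests fail, the key simply isn't being forwarded (URL and key below are placeholders):

```bash
curl -H "Authorization: Bearer sk-xxxx" https://my-litellm-domain.com/model/info
```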
Steps to Reproduce:
- Host LiteLLM externally with a `LITELLM_MASTER_KEY` defined.
- Configure the Bytebot agent with:
  ```yaml
  environment:
    - BYTEBOT_LLM_PROXY_URL=https://my-litellm-domain.com
    - LITELLM_MASTER_KEY=sk-xxxx
  ```
- Start the agent and observe the LiteLLM logs.
Expected behavior:
- Documentation should clearly state that Bytebot does not forward `LITELLM_MASTER_KEY`.
- Instead, users should configure Bytebot with an `OPENAI_API_KEY` (and optionally `OPENAI_BASE_URL`) so the agent actually sends the key in the request header (see the sketch below).
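For example, the agent-side docs could show something like this (a sketch; the domain and key are placeholders):

```yaml
services:
  bytebot-agent:
    environment:
      - BYTEBOT_LLM_PROXY_URL=https://my-litellm-domain.com
      # proxy master key, or a virtual key generated from the proxy
      - OPENAI_API_KEY=sk-xxxx
```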
Actual behavior:
- Bytebot silently ignores `LITELLM_MASTER_KEY` and sends no key, leading to: `No api key passed in.`
Suggestion:
- Update the docs for the LiteLLM deployment integration to clarify:
  - `LITELLM_MASTER_KEY` belongs only on the proxy side.
  - On the Bytebot agent side, you must use `OPENAI_API_KEY` (set to the proxy master key or a virtual key generated from the proxy).
- Examples in the docs should reflect the correct environment variable names for both agent and proxy.
After a few tests: in the `proxy.service.ts` file there is an OpenAI client with a hardcoded `apiKey`. Maybe you should implement a `BYTEBOT_LLM_PROXY_KEY` environment variable, as sketched below.
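A minimal sketch of what that could look like, assuming a new `BYTEBOT_LLM_PROXY_KEY` variable (the name is just my suggestion, not an existing setting):

```ts
// Sketch: read the proxy key from the environment instead of hardcoding it,
// keeping the current placeholder as a fallback when no key is configured
this.openai = new OpenAI({
  apiKey: process.env.BYTEBOT_LLM_PROXY_KEY ?? 'dummy-key-for-proxy',
  baseURL: proxyUrl,
});
```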
Sure thing, we can add that, or if you've already made the change, can you throw up a PR?
I have exactly the same issue with a locally hosted LiteLLM instance. The bytebot-llm-proxy reaches LiteLLM fine, but the agent doesn't. I tried setting `OPENAI_API_KEY` to the LiteLLM master key as mentioned here, but the same error persists. I presume there is a different solution when hosting locally; I also initially tried without any LiteLLM master key set, as in the documentation, and that did not seem to work either. What am I missing, and is there a solution?
Hello, I can confirm this is still an issue and I've managed to implement a local fix.
As others mentioned, the bytebot-agent container ignores all environment variables for the proxy API key (like `OPENAI_API_KEY` or `BYTEBOT_LLM_PROXY_KEY`). This leads to a `401 Unauthorized` error, because the litellm-proxy (when configured with a database and `master_key`) correctly requires authentication.
The problem is that the bytebot-agent source code has at least two separate client implementations, and neither one reads the key from the environment. Both must be fixed manually.
Here is the step-by-step fix for anyone else experiencing this:
The Fix: Hardcode the API Key in the Source
You must modify the bytebot-agent source code locally and rebuild the Docker image.
1. Modify the Main OpenAI Client (`proxy.service.ts`)

This client handles the main `/chat/completions` requests. It's hardcoded with `'dummy-key-for-proxy'`.

- File: `packages/bytebot-agent/src/proxy/proxy.service.ts`
- Find: the `constructor` and the `new OpenAI(...)` initialization.
Change this:
```ts
// Initialize OpenAI client with proxy configuration
this.openai = new OpenAI({
  apiKey: 'dummy-key-for-proxy',
  baseURL: proxyUrl,
});
```
To this (hardcoding your litellm master_key):
```ts
// Initialize OpenAI client with proxy configuration
this.openai = new OpenAI({
  apiKey: 'sk-1234', // <-- your litellm master_key
  baseURL: proxyUrl,
});
```
2. Modify the Secondary Client (`tasks.controller.ts`)

This client (using `fetch`) seems to handle the `GET /model/info` request, which is what tcpdump caught. My grep output confirms the corresponding code is in the compiled `tasks.controller.js`.

- File: `packages/bytebot-agent/src/tasks/tasks.controller.ts` (or similar)
- Find: the `fetch` call to `/model/info`.
Change this:
```ts
const response = await fetch(`${proxyUrl}/model/info`, {
  method: 'GET',
  headers: {
    'Content-Type': 'application/json',
  },
});
```
To this (adding the Authorization header):
```ts
const response = await fetch(`${proxyUrl}/model/info`, {
  method: 'GET',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'Bearer sk-1234', // <-- your litellm master_key
  },
});
```
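If you would rather not hardcode the key here either, the same `fetch` call can read it from the environment (a sketch, again assuming the `BYTEBOT_LLM_PROXY_KEY` variable proposed earlier in this thread):

```ts
// Sketch: attach the Authorization header only when a proxy key is configured
const proxyKey = process.env.BYTEBOT_LLM_PROXY_KEY;
const response = await fetch(`${proxyUrl}/model/info`, {
  method: 'GET',
  headers: {
    'Content-Type': 'application/json',
    ...(proxyKey ? { Authorization: `Bearer ${proxyKey}` } : {}),
  },
});
```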
3. Rebuild the Agent
After saving both files, you must force docker-compose to use your local build, as it defaults to the `image:` tag.

- In your `docker-compose.yml`, comment out the `image:` line for the `bytebot-agent` service:
  ```yaml
  bytebot-agent:
    build:
      context: ../packages/
      dockerfile: bytebot-agent/Dockerfile
    # image: ghcr.io/bytebot-ai/bytebot-agent:edge   <-- COMMENT THIS OUT
  ```
- Run a full, no-cache rebuild:
  ```bash
  docker compose build --no-cache && docker compose up -d
  ```
After doing this, all 401 errors disappeared and the connection is stable. This confirms the agent code needs to be updated to properly read an API key from `process.env`.