qwen setting: InternalError.Algo.InvalidParameter: Tool names are not allowed to be [search]?
What happened?
Great open-source project, thank you! Though I have not been able to get it running yet.
.env settings:

USE_CUSTOM_LLM=true
CUSTOM_LLM_PROVIDER="openai"                                              # LLM provider
CUSTOM_LLM_API_KEY="xxxxxxxxxxx"                                          # Your LLM provider API key
CUSTOM_LLM_ENDPOINT="https://dashscope.aliyuncs.com/compatible-mode/v1"   # API endpoint
CUSTOM_LLM_MODEL_NAME="qwen3-235b-a22b"                                   # Model name

# Optional parameters
CUSTOM_LLM_TEMPERATURE=0.7   # Temperature (default: 0)
CUSTOM_LLM_MAX_TOKENS=90000  # Max tokens (default: 8192)
CUSTOM_LLM_TOP_P=1           # Top P (default: 1)
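For reference, a minimal standalone sketch (not easy-llm-cli's code; the prompt is illustrative) that exercises the same DashScope compatible-mode endpoint, key, and model with the plain openai Node SDK. It streams the response because, as the second error below suggests, qwen3 on DashScope defaults to thinking mode, which only non-streaming calls reject:

```ts
// Standalone sanity check of the DashScope OpenAI-compatible endpoint,
// using the same values as the .env above (illustrative, not CLI code).
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.CUSTOM_LLM_API_KEY, // same key as CUSTOM_LLM_API_KEY
  baseURL: "https://dashscope.aliyuncs.com/compatible-mode/v1",
});

async function main() {
  // Stream the response so the default "thinking" behaviour of qwen3 on
  // DashScope does not cause a 400 on this simple check.
  const stream = await client.chat.completions.create({
    model: "qwen3-235b-a22b",
    messages: [{ role: "user", content: "Say hello." }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
  process.stdout.write("\n");
}

main().catch(console.error);
```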
It loads normally and displays the following:
Authenticated via "custom-llm-api".
Using 1 GEMINI.md file and 18 MCP servers (ctrl+t to view)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ > Type your message or @path/to/file │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
/opt/ddd qwen3-235b-a22b(openai) (100% context left)
Then I entered a request and got the following error:
╭────────────────────────────────────────────────╮
│ > @.prompt.md, implement the complete project according to the requirements in the file. │
╰────────────────────────────────────────────────╯
Error when talking to Gemini API
Full report available at: /tmp/gemini-client-error-Turn.run-sendMessageStream-2025-07-23T08-45-53-407Z.json
✕ [API Error: 400 <400> InternalError.Algo.InvalidParameter: Tool names are not allowed to be [search]]
Error generating JSON content via API.
Full report available at: /tmp/gemini-client-error-generateJson-api-2025-07-23T08-45-54-081Z.json
Failed to talk to Gemini endpoint when seeing if conversation should continue. Error: Failed to generate JSON content: 400 parameter.enable_thinking must be set to false for non-streaming calls
    at GeminiClient.generateJson (file:///root/.nvm/versions/node/v20.19.3/lib/node_modules/easy-llm-cli/bundle/gemini.js:251360:13)
    at async checkNextSpeaker (file:///root/.nvm/versions/node/v20.19.3/lib/node_modules/easy-llm-cli/bundle/gemini.js:249578:28)
    at async GeminiClient.sendMessageStream (file:///root/.nvm/versions/node/v20.19.3/lib/node_modules/easy-llm-cli/bundle/gemini.js:251306:32)
    at async file:///root/.nvm/versions/node/v20.19.3/lib/node_modules/easy-llm-cli/bundle/gemini.js:258998:24
    at async file:///root/.nvm/versions/node/v20.19.3/lib/node_modules/easy-llm-cli/bundle/gemini.js:259065:34
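This second failure looks DashScope-specific rather than a key/model problem: checkNextSpeaker calls generateJson without streaming, and the backend rejects non-streaming qwen3 requests unless thinking is disabled via the enable_thinking parameter named in the error. A rough sketch of the kind of request that call would need to send (a direct HTTP illustration under that assumption, not the CLI's actual code path; the prompt text is made up):

```ts
// Illustration of a non-streaming call that DashScope accepts for qwen3:
// the DashScope-specific enable_thinking flag must be false (assumption:
// this is what the CLI's generateJson request would have to include).
async function generateJsonExample(): Promise<void> {
  const response = await fetch(
    "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions",
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${process.env.CUSTOM_LLM_API_KEY}`,
      },
      body: JSON.stringify({
        model: "qwen3-235b-a22b",
        messages: [
          { role: "user", content: 'Answer in JSON: {"next_speaker": "user" | "model"}' },
        ],
        // DashScope extension, not part of the OpenAI schema; without it a
        // non-streaming qwen3 call fails with the 400 shown above.
        enable_thinking: false,
      }),
    },
  );
  const data = await response.json();
  console.log(data.choices?.[0]?.message?.content);
}

generateJsonExample().catch(console.error);
```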
If I instead log in via /auth and then run the request, it also fails, as follows:
Waiting for authentication... Authenticated via "oauth-personal".
╭────────────────────────────────────────────────╮
│ > @.prompt.md, implement the complete project according to the requirements in the file. │
╰────────────────────────────────────────────────╯
Error when talking to Gemini API
Full report available at: /tmp/gemini-client-error-Turn.run-sendMessageStream-2025-07-23T08-51-53-697Z.json
✕ [API Error: [{
    "error": {
      "code": 404,
      "message": "Requested entity was not found.",
      "errors": [
        {
          "message": "Requested entity was not found.",
          "domain": "global",
          "reason": "notFound"
        }
      ],
      "status": "NOT_FOUND"
    }
  }]]
What did you expect to happen?
I expected the request to be executed normally.
Client information
$ gemini /about
Using: 1 GEMINI.md File | 18 MCP Servers (ctrl+t to view)
╭──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ > Type your message or @path/to/file │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
/opt/ddd no sandbox (see /docs) gemini-2.5-pro (100% context left)
Login information
No response
Anything else we need to know?
No response
After removing every MCP server that includes a search tool, it can run. But then it hits a token-limit error: qwen's max_tokens is 90,000, yet it gets capped at 10,000. I suspect both of these places need changes to the default configuration or the default code.
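For the tool-name error, one possible direction (purely a sketch of a workaround idea with hypothetical names, not a patch against the actual code) is to alias MCP tool names that the DashScope backend reserves, such as search, before the tool declarations are sent, and map the model's tool calls back afterwards. The token cap presumably only needs the default limit raised or CUSTOM_LLM_MAX_TOKENS honored, so no code is shown for that part:

```ts
// Sketch of a tool-name aliasing workaround (hypothetical helper, not
// easy-llm-cli code): rename reserved tool names such as "search" before
// sending tool declarations to DashScope, and keep a map so the model's
// tool calls can be translated back to the original MCP tool names.
type ToolDecl = { name: string; description?: string; parameters?: object };

// Assumption based on the 400 error above; the real reserved list is unknown.
const RESERVED_TOOL_NAMES = new Set(["search"]);

export function sanitizeToolNames(tools: ToolDecl[]): {
  tools: ToolDecl[];
  aliasToOriginal: Map<string, string>;
} {
  const aliasToOriginal = new Map<string, string>();
  const sanitized = tools.map((tool) => {
    if (!RESERVED_TOOL_NAMES.has(tool.name)) return tool;
    const alias = `mcp_${tool.name}`; // e.g. "search" -> "mcp_search"
    aliasToOriginal.set(alias, tool.name);
    return { ...tool, name: alias };
  });
  return { tools: sanitized, aliasToOriginal };
}

// When the model later calls "mcp_search", look up the original name:
// const original = aliasToOriginal.get(call.name) ?? call.name;
```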