GitLab Duo /models selection still responds as Claude 3.5 Sonnet
Description
When using GitLab Duo (OAuth login) and selecting a Duo model via /models (e.g., gitlab/duo-chat-opus-4-5), the chat responses still identify as Claude 3.5 Sonnet. It looks like the GitLab provider doesn’t map the selected duo-chat-* model to the underlying Anthropic model, and also doesn’t pass through provider.gitlab.options.
Plugins
No response
OpenCode version
1.1.19
Steps to reproduce
- Authenticate with GitLab via OAuth in OpenCode
- Open the TUI and run `/models`
- Select `gitlab/duo-chat-opus-4-5` (or `duo-chat-sonnet-4-5`)
- Send a prompt
- The assistant responds identifying as Claude 3.5 Sonnet instead of the selected Duo 4.5 model
Expected behavior
The response should match the selected Duo model (e.g., Claude Opus 4.5 or Sonnet 4.5).
Possible root cause
The GitLab loader in `packages/opencode/src/provider/provider.ts` calls:

`sdk.agenticChat(modelID, { anthropicModel, featureFlags })`

But:
- `provider.gitlab.options` is not merged into the options (only `featureFlags` are)
- there is no mapping from `duo-chat-opus-4-5` to `claude-opus-4-5` (and similar)

So `anthropicModel` stays undefined and the GitLab SDK likely defaults to Claude 3.5 Sonnet.
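A minimal sketch of the kind of fix described above. The function and map names here are assumptions for illustration, not the actual `provider.ts` API, and `claude-opus-4-5` is a placeholder for whatever the real underlying Anthropic model ID is:

```typescript
// Hypothetical mapping from Duo chat model IDs to underlying Anthropic models.
// Only the Sonnet entry is confirmed in this thread; the Opus entry is a guess.
const DUO_TO_ANTHROPIC: Record<string, string> = {
  "duo-chat-sonnet-4-5": "claude-sonnet-4-5-20250929",
  "duo-chat-opus-4-5": "claude-opus-4-5",
}

// Build the options for sdk.agenticChat: apply the model mapping first, then
// spread provider.gitlab.options so an explicit config value still wins.
function resolveAnthropicModel(
  modelID: string,
  providerOptions: Record<string, unknown> = {},
): Record<string, unknown> {
  return {
    anthropicModel: DUO_TO_ANTHROPIC[modelID],
    ...providerOptions,
  }
}

const opts = resolveAnthropicModel("duo-chat-opus-4-5")
// opts.anthropicModel is now defined instead of undefined,
// so the SDK would no longer fall back to its default model.
```

The key design point is the merge order: mapping defaults first, user-supplied `provider.gitlab.options` last, so configuration can always override the built-in table.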
Screenshot and/or share link
No response
Operating System
Ubuntu 24.04
Terminal
Windows Terminal
This issue might be a duplicate of existing issues. Please check:
- #5674: Custom OpenAI-compatible provider options not being passed to API calls (same root cause pattern - provider options not being merged)
- #7455: Add GitLab Duo Agentic Chat Provider Support (related GitLab provider feature)
The core issue is that provider.gitlab.options are not being merged into the options passed to sdk.agenticChat(), similar to the problem documented in #5674 where provider options fail to propagate to API calls.
Feel free to ignore if none of these address your specific case.
You cannot rely on models self-identifying. Check the HTTP request by setting a proxy; I'm sure the selected model is being sent correctly.
The only reason you can do this in Anthropic tools is that they write it in the system prompt: "You are
We will do this because people keep getting confused about it
@harkaranbrar7 You are absolutely right. Since we cannot rely on a model's self-identification, for me it says:
I don't have access to information about which specific model version is being used for our current conversation. I'm Claude, made by Anthropic, but I cannot determine the exact model ID (like claude-sonnet-4, claude-opus, etc.) that's running this session.
It seems we lost the model options while migrating from the plain JSON models list to the models.dev listing. We had this in our POC:
```json
"duo-chat-sonnet-4-5": {
  "id": "duo-chat-sonnet-4-5",
  "name": "Agentic Chat (Claude Sonnet 4.5)",
  "release_date": "2024-01-01",
  "attachment": false,
  "reasoning": false,
  "temperature": true,
  "tool_call": true,
  "cost": {
    "input": 0,
    "output": 0,
    "cache_read": 0,
    "cache_write": 0
  },
  "options": {
    "anthropicModel": "claude-sonnet-4-5-20250929"
  },
  "limit": {
    "context": 200000,
    "output": 4096
  },
  "modalities": {
    "input": ["text"],
    "output": ["text"]
  }
},
```
Versus what we get from models.dev:
```json
"duo-chat-sonnet-4-5": {
  "id": "duo-chat-sonnet-4-5",
  "name": "Agentic Chat (Claude Sonnet 4.5)",
  "family": "claude-sonnet",
  "attachment": false,
  "reasoning": false,
  "tool_call": true,
  "temperature": true,
  "release_date": "2026-01-08",
  "last_updated": "2026-01-08",
  "modalities": {
    "input": ["text"],
    "output": ["text"]
  },
  "open_weights": false,
  "cost": {
    "input": 0,
    "output": 0,
    "cache_read": 0,
    "cache_write": 0
  },
  "limit": {
    "context": 200000,
    "output": 4096
  }
},
```
As you can see, the `options` field is missing now :(
So it seems everything is routed to the default model of gitlab-ai-provider, which is `claude-sonnet-4-5-20250929`.
Let me figure out how to pass a proper model ID to the provider 🤔
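One way this could work, given the model entry shapes shown above: prefer `options.anthropicModel` when the entry still has it (as in the POC config), and fall back to a hard-coded table when the models.dev entry lacks the field. This is a sketch with hypothetical names; the actual fix lives in PR #8424 and may look different:

```typescript
// Shape of a model entry, reduced to the fields relevant here.
interface ModelEntry {
  id: string
  options?: { anthropicModel?: string }
}

// Fallback for entries whose options were stripped by the models.dev listing.
const FALLBACK_ANTHROPIC_MODEL: Record<string, string> = {
  "duo-chat-sonnet-4-5": "claude-sonnet-4-5-20250929",
}

// Resolve the underlying Anthropic model: per-entry options win, then the
// fallback table; undefined means the caller has no mapping at all.
function anthropicModelFor(model: ModelEntry): string | undefined {
  return model.options?.anthropicModel ?? FALLBACK_ANTHROPIC_MODEL[model.id]
}
```

With this, the POC-style entry and the stripped models.dev entry resolve to the same Anthropic model, instead of silently falling through to the provider's default.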
@rekram1-node PR to fix this: https://github.com/anomalyco/opencode/pull/8424
@vglafirov sorry, this is a bit off-topic, but can I DM you? I work at a company and I got a 403 trying to use opencode with GitLab Duo. Does the company need to enable another option to be able to use opencode with Duo?
@gaetan-puleo Unfortunately, I'm not authorized to speak directly with customers due to company policy — you'll need to contact support for official assistance. However, I can help debug here.
A 403 error means "Forbidden" (not authorized to access the resource). A few questions:
- Are you using SaaS or a self-hosted instance?
- Is Duo enabled for your project/group/organization?
- Are you authenticating with a PAT or OAuth?
- Does Duo work for you in the VS Code extension?
If Duo works in VS Code, it should work in OpenCode as well, since they use the same API endpoints.
- Self hosted
- Yes
- PAT
- Yes it works in VS Code Extension :)
I got an error "GitlabError: Failed to get direct access token: 403 Forbidden - {"Message":"403 Forbidden"}" to be more precise @vglafirov
Thank you anyway
@gaetan-puleo Interesting 🤔 Since this integration is brand new (literally merged yesterday), I haven't had a chance to test it on self-hosted instances yet. Would you mind opening a support ticket and mentioning me ([email protected])? I'll try to get directly involved—I'm keen to get this working for self-hosted as well.
I'll check if I can do that. I am not an official employee (I am an external contributor for a big company). I'll tell you more about it, and I'll ping you in a support ticket if I can :).
@vglafirov Thank you for your work and your contributions to OpenCode <3
@gaetan-puleo I checked the provider code — it uses the https://cloud.gitlab.com/ai/v1/proxy/anthropic/ endpoint to access the models. This likely won't work for self-hosted instances. I need to investigate further and compare how the VS Code extension handles this. I'll keep you posted.
I can confirm OAuth and PAT tokens are working with the SaaS GitLab-hosted instance.
@harkaranbrar7 Yeah I have a personal account too and it works well.
@harkaranbrar7 BTW, my fix has been merged. Next version v1.1.21 should properly handle model selection now. Thanks for reporting this. That said, models don't have self-awareness anyway; they always return a random model version :) But you can differentiate Haiku from Opus by TPS. Haiku is far more performant, but dumber.
@vglafirov Thanks for the fix. Does GitLab only support 3 models in the public API? I can see the OpenAI models they are offering in the UI and VS Code extension.
@harkaranbrar7 The plan is to support all models available in VS Code, but we decided to start with Anthropic first. I'll work on the other models in the coming weeks. Stay tuned.
Since v1.1.21 has been published and contains the fix, this issue can be closed now.
I'll create a new issue for the Self Hosted gitlab instances.