URGENT: Copilot miscalculates premium request quotas
Describe the bug
Since GitHub Copilot doesn't support Grok Code Fast 1, I added it via OpenRouter. After some use, Copilot reported that my premium request allowance was exceeded.
In fact, only 46.3% of my allowance has been used. I suspect Copilot for Xcode is counting my OpenRouter Grok Code Fast 1 usage against its own quota, even though that model isn't supported in Xcode.
Versions
- Copilot for Xcode: 0.43.140
- Xcode: 26.0
- macOS: 26.0
Steps to reproduce
- In Settings > Models, add OpenRouter.
- Choose Grok 4 Fast and Grok Code Fast 1.
- Use Grok Code Fast 1.
Screenshots
Logs
Additional context
Please fix this ASAP. I can't use any premium models now.
Thanks for flagging this. We’ve confirmed the bug in quota calculation for premium models.
Next steps:
- A patch will be created to correct quota miscounts for premium requests.
- The fix will be applied upstream and included in the next release.
- In the meantime, users can continue with standard (non-premium) models without interruption.
We’ll update here again as soon as the patch is merged.
Hi @owenzhao, thank you for raising this. It appears that your OpenRouter quota may have been exhausted or exceeded under this condition; could you kindly help confirm? The error message currently displayed is misleading, as it incorrectly references premium model requests when a built-in model is being used. We will fix this on our end ASAP so the messaging is more accurate and user-friendly. Please update to the new pre-release version when it is available.
Hi @owenzhao, sorry for the misleading error message. We forward your requests directly to the model provider's endpoint, and these requests are NOT counted toward your premium request quota in any case. The actual premium request usage is shown in the quota section of the menu.
Thank you for your replies. The message is not just misleading; it actually prevents me from using premium models like GPT-5, telling me it will force me onto GPT-4.1 if I choose any non-free model.
I did find a workaround: quitting and reopening Copilot clears the forced model.
As a programmer, I think the fix should compare not only the model name but also the provider.
I can understand how this happened, since custom model providers are a recently introduced feature, so the earlier code may not have accounted for them.
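The provider-aware comparison suggested above can be sketched as follows. This is a hypothetical illustration in Python, not the actual Copilot for Xcode source; `ModelID`, `PREMIUM_MODELS`, and the provider/model names are all made up for the example.

```python
# Sketch of the suggested fix: identify a model by its (provider, name)
# pair instead of by name alone, so a BYOK model added via OpenRouter is
# never mistaken for a built-in premium model. All identifiers here are
# illustrative assumptions, not real Copilot for Xcode code.
from dataclasses import dataclass


@dataclass(frozen=True)
class ModelID:
    provider: str  # e.g. "github" or "openrouter"
    name: str      # e.g. "grok-code-fast-1"


# Hypothetical set of models that count against the premium allowance.
PREMIUM_MODELS = {ModelID("github", "gpt-5")}


def counts_against_premium_quota(model: ModelID) -> bool:
    # Matching on the full (provider, name) pair avoids the bug where a
    # user-added model with a premium-sounding name was counted.
    return model in PREMIUM_MODELS


# An OpenRouter-hosted model never touches the premium quota,
# even if a built-in model shares its name:
assert not counts_against_premium_quota(ModelID("openrouter", "grok-code-fast-1"))
assert counts_against_premium_quota(ModelID("github", "gpt-5"))
```

Keying the check on the pair rather than the bare name means adding new providers later cannot reintroduce the collision.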
Hi @owenzhao, do you mean that you cannot use the premium models provided by GitHub Copilot? Can you still reproduce the error after reopening Copilot?
When this CLS error shows in the menu, I can select a premium model, but I can't use it. Copilot blocks me, shows this error, and forces me to use GPT-4.1.
The workaround is to quit and reopen Copilot. The CLS error then no longer appears in the menu, and I can choose premium models again.
Thanks for the clear explanation, and we will fix it soon.