Invalid Model: The model does not work with your current plan or api key.
macOS Monterey 12.6.3
I recently closed Cursor, and after opening it again (I guess there was a background upgrade), I get this error whenever I use the chat. I'm currently using my own API key.
Is this expected?
Now it has stopped doing that... so I guess it was some kind of connection error or something. Closing for now.
The error is coming back, so maybe it's an intermittent issue.
I'm getting it too, on both my own keys and Cursor Pro. The OpenAI API looks good, though at https://status.openai.com/
I bought Cursor Pro because none of my keys were working, and I thought it would fix this API key error! (But I was meaning to anyway for CP++!)
I also uninstalled and reinstalled Cursor fresh and that didn't resolve the issue.
Also, despite the "invalid model" error on the editor, I can see my API calls still being logged at my account on cursor.sh.
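For anyone trying to rule out the key itself, here is a quick way to check it outside Cursor. This is a minimal sketch using the official openai Python package; the model name is just an example, and the placeholder key is whatever you have configured in Cursor.

```python
# Minimal sanity check of an OpenAI API key outside Cursor.
# Assumes the official `openai` package (pip install openai);
# the model name below is just an example.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # the same key you put into Cursor

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

If this succeeds, the key and model are fine on OpenAI's side and the problem is somewhere between Cursor and its backend.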
Are you still seeing issues? I believe these errors were a result of an incident from OpenAI that's now resolved.
No issues on my end right now
I get the error when switching from an Anthropic model to an OpenAI model. Even if Anthropic isn't selected, or both are selected but an OpenAI model is chosen, the error happens. The OpenAI API key won't verify in settings, and the balance-check text doesn't match the API call and model; it still shows Anthropic. I randomly turned models off and toggled the Anthropic API, and now it works, though I haven't tried switching back yet. It seems like the app hadn't caught up with the new Anthropic integration. It works now.
Happening to me too now.
Happening to me now in Composer, not chat.
Happening to me now as well after the recent update
Happening to me now in Composer, not chat, too.
After briefly scanning the forum, I didn’t see a well-detailed report on this issue, so I’ve created one myself! If you’re experiencing the same problem, please take a look 👇🏻
https://forum.cursor.com/t/invalid-model-composer-doesnt-work-properly-with-azure-openai-key/33762
Happening to me now: "The model deepseek does not work with your current plan or api key." I'm using the DeepSeek model.
Happening to me too. I'm using DeepSeek now. Composer is unavailable, but I can still use chat.
Seeing this with the latest claude-3.6-sonnet/thinking models.
Happened to me today. I'm using a Google AI Studio API key.
Me too. It was solved by starting a new chat. Easy to do, but I was in the middle of a whole lot of context and had to set it all up again.
Thank you!!!!
Happening to me too now.
Happening to me now, aka me too! Out of the blue; it was working a few minutes ago.
Ah, it's working. Guess the dirty dishes will have to wait.
Getting the same error with the new update, for all Anthropic models O.o
Same error, but only for Deepseek-V3/gpt-4.1-nano. The gemini-2.0-flash model works fine. Is there a bug that specifically restricts free plans from using certain models, even when using your own API key?
I ran into the same issue, even when using Ask mode. The model used was deepseek-chat.
Same for me: Windows, 0.50, Bedrock API. I guess it's because the Cursor backend routes all requests to its official model handler when using a non-Pro account? See details: https://forum.cursor.com/t/getting-error-model-does-not-work-with-your-current-plan-or-api-key-when-using-bedrock/94007?u=hztbuaa
Same error with the local model in llama.cpp. It was working yesterday.
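In case it helps anyone on a local model: a minimal sketch (again with the openai Python package) to confirm the llama.cpp server responds outside Cursor. It assumes llama-server is running with its OpenAI-compatible API on the default port 8080; the base URL and model name are placeholders.

```python
# Quick check that a local llama.cpp server answers outside Cursor.
# Assumes llama-server is running its OpenAI-compatible API on the
# default port 8080; base URL and model name are placeholders.
from openai import OpenAI

client = OpenAI(base_url="http://127.0.0.1:8080/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="local-model",  # llama.cpp generally accepts any model name here
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

If this works but Cursor still shows "invalid model", the server side isn't the problem.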
I have been seeing the same message since the 25th. Starting a new chat does not solve the problem. The model is Azure OpenAI gpt-4o. What is wrong?
Added: I checked the developer tools and found the following log: "[composer] Error in AI response: ConnectError: [not_found] Error".
Next, I checked the Network tab and found nothing when searching for "completions", so the request may not have been sent in the first place.
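If it helps with debugging, here is a quick way to confirm the Azure deployment and key work outside Cursor. This is a minimal sketch with the openai Python package's Azure client; the endpoint, deployment name, and API version are placeholders for your own values.

```python
# Quick check that the Azure OpenAI deployment and key work outside Cursor.
# Endpoint, deployment name, and API version are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<azure-api-key>",
    api_version="2024-02-01",
    azure_endpoint="https://<your-resource>.openai.azure.com",
)

resp = client.chat.completions.create(
    model="<gpt-4o-deployment-name>",  # Azure expects the deployment name here
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

If this succeeds, the key and deployment are valid, which would point at Cursor never sending the request, consistent with the empty Network tab above.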
Still happening on version 0.50.7