Flemming Bakkensen
3 comments
I have the same issue after CC has executed some Python code.
I have the exact same issue, always after using just short of 270K tokens, which matches the available token count when using the Pro / Plus subscription in Codex.
> I use a GitHub Copilot subscription and my context window is always 100K, no matter which model I use (even Opus / Gemini Pro).

Yes, all models have an artificially low context...