403 response for /chat/completions
Describe the bug
openai.error.APIError: Invalid response object from API: '{"object":"error","message":"Only /v1/chat/completions && /v1/embeddings allowed now , your path {/chat/completions}","code":40301}' (HTTP response code was 403)
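The error body suggests the backend (likely a LiteLLM-style proxy in front of OpenAI) only serves the `/v1`-prefixed routes, so a client configured with a base URL that omits `/v1` ends up requesting `/chat/completions` and gets a 403. A minimal sketch of the path mismatch, assuming a hypothetical proxy URL:

```python
# Hypothetical proxy base URL; the real value depends on your setup.
api_base = "https://api.example.com"
path = "/chat/completions"

# Without the /v1 prefix the request hits /chat/completions -> 403 (code 40301).
bad_url = api_base + path

# Appending /v1 to the base yields the allowed route /v1/chat/completions.
good_url = api_base.rstrip("/") + "/v1" + path

print(bad_url)   # https://api.example.com/chat/completions
print(good_url)  # https://api.example.com/v1/chat/completions
```

If a custom API base was passed to interpreter, checking that it ends in `/v1` would be the first thing to try.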
Reproduce
Typing 你好 ("Hello") at the interpreter prompt triggers the error.
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.
Python Version: 3.11.6
Pip Version: 23.3.1
Open-interpreter Version: cmd:Interpreter, pkg: 0.1.15
OS Version and Architecture: Windows-10-10.0.19045-SP0
CPU Info: Intel64 Family 6 Model 142 Stepping 9, GenuineIntel
RAM Info: 3.92 GB, used: 3.58, free: 0.33
Interpreter Info
Vision: False
Model: gpt-3.5-turbo
Function calling: True
Context window: None
Max tokens: None
Auto run: False
API base: None
Local: False
Curl output: Not local
Expected behavior
123
Screenshots
No response
Open Interpreter version
0.1.15 (from `pip show open-interpreter`)
Python version
3.11.6
Operating System name and version
Windows 10
Additional context
No response
Hi @lejieqwe, could you share the command you launched interpreter with?
你好 ("Hello")
https://github.com/KillianLucas/open-interpreter/pull/799 Possibly related to this?
@lejieqwe Is it working now? Can I close this issue?
Closing this stale issue. Please create a new issue if the problem is not resolved or explained in the documentation. Thanks!