Cannot set up .env properly
Hello, I am new to coding and decided to try out auto agent. After I run auto main and select user mode, I type in my command, and this is returned:
Traceback (most recent call last):
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\llms\openai\openai.py", line 602, in completion
    raise e
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\llms\openai\openai.py", line 538, in completion
    self.make_sync_openai_chat_completion_request(
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\llms\openai\openai.py", line 404, in make_sync_openai_chat_completion_request
    raise e
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\llms\openai\openai.py", line 386, in make_sync_openai_chat_completion_request
    raw_response = openai_client.chat.completions.with_raw_response.create(
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\_legacy_response.py", line 364, in wrapped
    return cast(LegacyAPIResponse[R], func(*args, **kwargs))
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\_utils\_utils.py", line 279, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\resources\chat\completions\completions.py", line 929, in create
    return self._post(
           ^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\_base_client.py", line 1276, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\_base_client.py", line 949, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\openai\_base_client.py", line 1057, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Model Not Exist', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\main.py", line 1595, in completion
    raise e
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\main.py", line 1568, in completion
    response = openai_chat_completions.completion(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Cheng\anaconda3\Lib\site-packages\litellm\llms\openai\openai.py", line 612, in completion
    raise OpenAIError(
litellm.llms.openai.common_utils.OpenAIError: Error code: 400 - {'error': {'message': 'Model Not Exist', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_request_error'}}
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "
Does this mean I set up my .env file incorrectly? My .env file is below.
DEEPSEEK_API_KEY=sk-XXXXXXXXXXXXXXXXXXXXXXXXXXX
COMPLETION_MODEL=deepseek/deepseek-chat

Then I run auto main.
What should I do? How should I fix it? Thank you for your time.
In your .env file, did you put anything after sk-? If not, you may need to apply for a DeepSeek API key here: https://api-docs.deepseek.com/ and paste it into the placeholder.
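Once a real key is pasted in, one way to check the .env values outside of auto agent is a direct litellm call. The sketch below is only an illustration, not part of auto agent itself: it assumes litellm and python-dotenv are installed, that the script sits in the same folder as the .env file, and that litellm's deepseek/ model prefix picks up DEEPSEEK_API_KEY from the environment (as described in the litellm docs).

# check_deepseek.py -- standalone sanity check, separate from auto agent
import os

from dotenv import load_dotenv  # pip install python-dotenv
import litellm                  # pip install litellm

# Pull DEEPSEEK_API_KEY and COMPLETION_MODEL out of the .env file in this folder.
load_dotenv()

model = os.getenv("COMPLETION_MODEL", "deepseek/deepseek-chat")
print("model:", model)
print("key present:", bool(os.getenv("DEEPSEEK_API_KEY")))

# litellm routes deepseek/... model names to the DeepSeek API using DEEPSEEK_API_KEY.
response = litellm.completion(
    model=model,
    messages=[{"role": "user", "content": "Say hello in one word."}],
)
print(response.choices[0].message.content)

If this prints a short reply, the key and the deepseek/deepseek-chat model name are being accepted and the problem lies elsewhere; if it raises the same 400 "Model Not Exist" error, then the key or the COMPLETION_MODEL value in the .env file is what the DeepSeek API is rejecting.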