
Litellm/01 is unable to connect to non-OpenAI providers.

Open mitstastic opened this issue 1 year ago • 13 comments

What causes the issue: Run 01 specifying any non-OpenAI server host and API key.

Expected: Be able to connect to other services like Groq, Anthropic, OpenRouter, etc., as they seem to work with the base Open Interpreter.

Screenshots: [Screenshot 2024-05-13 at 3:58:25 AM]

Using:

  • macOS Ventura 13.6.5
  • macOS Sonoma 14.4.1
  • Windows 10
  • Python 3.9-3.11.8

Feedback: After many attempts with different settings, it seems either 01 is not passing the right arguments to litellm, or litellm isn't yet correctly configured for other providers in 01.

mitstastic avatar May 13 '24 20:05 mitstastic

Update: The error above was probably caused by erroneously passing a URL argument to --server-host instead of --server-url; however, the connection still doesn't open with the latter. See the screenshot below.

[Screenshot 2024-05-13 at 6:09:33 PM]

mitstastic avatar May 13 '24 22:05 mitstastic

I want to ask the same question. It turns out the command line is written like this. Do I need to install and start the litellm service first to obtain the local connection interface?

rwmjhb avatar May 14 '24 01:05 rwmjhb

Since OpenInterpreter uses litellm, I think you need to specify this differently. Here is what I think would work: `poetry run 01 --model "groq/gemma-7b-it" --tts-service piper --stt-service local-whisper`.

Litellm already pulls all the data automatically if you specify the provider in the model. Or at least it should do that.

Here are some instructions on how to get it to work with open router: https://discordapp.com/channels/1146610656779440188/1194880263122075688/1240334434352365569

Merlinvt avatar May 14 '24 11:05 Merlinvt

Well, as you know, in the Discord community some people suggested that 01 is automatically appending "openai/" to the model names specified in the arguments. So, for instance, you might end up with "openai/groq/gemma-7b-it". Is that what's causing the issue?

Litellm already pulls all the data automatically if you specify the provider in the model. Or at least it should do that.

If it does, why the need to specify all the details when people use Open Interpreter directly? And in my experience, when I leave out the server arguments it seems to default to OpenAI and complain that no OpenAI key is set. So I think something in the litellm code for 01 is probably interfering, or is not yet fully configured to support other providers, as it's only been confirmed working with GPT.
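The "openai/" prepending theory is easy to illustrate. Here is a hypothetical sketch of provider-prefix routing (not litellm's actual code) showing how an erroneously prepended prefix would mask the real provider:

```python
def infer_provider(model: str) -> str:
    """Hypothetical sketch of provider-prefix routing; not litellm's real code."""
    known = {"openai", "groq", "anthropic", "openrouter"}
    prefix = model.split("/", 1)[0]
    # Unknown prefixes fall back to the default provider (OpenAI)
    return prefix if prefix in known else "openai"

print(infer_provider("groq/gemma-7b-it"))         # -> groq
print(infer_provider("openai/groq/gemma-7b-it"))  # -> openai (Groq is masked)
```

If 01 prepends "openai/" before handing the model name to litellm, the provider lookup would resolve to OpenAI every time, which matches the "no OpenAI key set" symptom above.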

mitstastic avatar May 15 '24 03:05 mitstastic

Does the project team care about this? No developer has responded to these questions for so many days.

rwmjhb avatar May 16 '24 13:05 rwmjhb

If you want to get 01 to work with OpenRouter (and maybe others?), you can try this:

[Screenshot from 2024-05-15 18:05:36]

[Screenshot from 2024-05-15 18:05:49]

It's still super unintuitive and should probably be made more intuitive, but you can make it work.

The OpenAI key is for Whisper and TTS. If you use a local model, you can leave it out.

I also forgot to run `poetry install` before `poetry run`.

A different model name would be `openrouter/meta-llama/llama-3-70b`.
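For reference, here is a minimal config sketch of the OpenRouter settings this approach needs, mirroring the `interpreter.llm` fields used in this thread. The `api_base` value and the placeholder key format are assumptions; check OpenRouter's own docs:

```python
# Hypothetical sketch of OpenRouter settings, mirroring the interpreter.llm
# fields used elsewhere in this thread. The api_base is an assumption.
openrouter_llm = {
    "model": "openrouter/meta-llama/llama-3-70b",  # provider prefix comes first
    "api_base": "https://openrouter.ai/api/v1",
    "api_key": "sk-or-YOUR-KEY-HERE",              # your OpenRouter key
}
```

The `openrouter/` prefix is what tells litellm which provider to route the request to; the model path after it is OpenRouter's own naming.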

Merlinvt avatar May 16 '24 14:05 Merlinvt

What if I have neither an OpenAI key nor a local model? How can I use Whisper and TTS? Can I use just an OpenRouter API key for everything?

rwmjhb avatar May 18 '24 04:05 rwmjhb

OpenRouter does not have Whisper. There is a rewrite on the way that implements more options for TTS and STT: https://github.com/KillianLucas/01-rewrite. I don't think they will implement more options in this repo, so without OpenAI or the local models you might need to wait until the rewrite is done. But I could be wrong. You can use Open Interpreter until then.

Merlinvt avatar May 18 '24 04:05 Merlinvt

This is how I ran 01 with Groq and local TTS/STT: change `i.py` as per the following diff, and run with `poetry run 01 --stt-service local-whisper --tts-service piper`.

diff --git a/software/source/server/i.py b/software/source/server/i.py
index bc792fd..f7a7454 100644
--- a/software/source/server/i.py
+++ b/software/source/server/i.py
@@ -185,10 +185,14 @@ def configure_interpreter(interpreter: OpenInterpreter):
     ### SYSTEM MESSAGE
     interpreter.system_message = system_message
 
-    interpreter.llm.supports_vision = True
+    interpreter.llm.supports_vision = False
     interpreter.shrink_images = True  # Faster but less accurate
 
-    interpreter.llm.model = "gpt-4"
+    # RUN WITH THIS COMMAND FOR LOCAL TTS AND STT 
+    # `poetry run 01 --stt-service local-whisper --tts-service piper`
+    interpreter.llm.model = "llama3-70b-8192"
+    interpreter.llm.api_base = "https://api.groq.com/openai/v1/"
+    interpreter.llm.api_key = "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" # YOUR API HERE
 
     interpreter.llm.supports_functions = False
     interpreter.llm.context_window = 110000
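The Groq settings in the diff above can also be sanity-checked outside of 01. This sketch uses a hypothetical helper, `groq_kwargs`, to build the keyword arguments you would hand to `litellm.completion`; the values mirror the diff:

```python
import os

def groq_kwargs(prompt: str) -> dict:
    """Hypothetical helper: kwargs 01 would pass to litellm.completion for Groq."""
    return {
        "model": "groq/llama3-70b-8192",                # provider prefix routes to Groq
        "api_base": "https://api.groq.com/openai/v1/",  # same base URL as the diff
        "api_key": os.environ.get("GROQ_API_KEY", ""),  # never hard-code keys
        "messages": [{"role": "user", "content": prompt}],
    }

# Usage, assuming litellm is installed and GROQ_API_KEY is set:
#   litellm.completion(**groq_kwargs("hello"))
```

Reading the key from the environment avoids hard-coding it in `i.py` (and having to revoke it afterward).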

aj47 avatar May 19 '24 09:05 aj47

Can you tell which of the lines we need to change?

achoozachooz avatar May 19 '24 12:05 achoozachooz

@aj47 just making sure, that the api key is fake or revoked ;)

Merlinvt avatar May 20 '24 05:05 Merlinvt

@aj47 just making sure, that the api key is fake or revoked ;)

yep all g, revoked before posting

aj47 avatar May 20 '24 07:05 aj47

> (quoting aj47's comment and diff above)

Thank you sir, it works for me. Really, really appreciated.

guoper59 avatar Jun 28 '24 12:06 guoper59