
Add support for Ollama, Palm, Claude-2, Cohere, Replicate Llama2, CodeLlama, Hugging Face (100+LLMs) - using LiteLLM

Open ishaan-jaff opened this issue 2 years ago • 7 comments

import os
from litellm import completion

# set ENV variables
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)

ishaan-jaff avatar Oct 05 '23 23:10 ishaan-jaff
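For the other providers named in the title, LiteLLM keeps the same `completion()` interface and routes by model-name prefix. A minimal sketch of how the call arguments differ per provider; the exact model strings and the Ollama `api_base` below are assumptions drawn from LiteLLM's conventions, not from this PR:

```python
# Sketch only: LiteLLM routes providers by model-name prefix.
# The model strings below are assumptions, not verified against this PR.
def litellm_call_args(provider: str, prompt: str) -> dict:
    """Build keyword arguments for litellm.completion() for a few providers."""
    models = {
        "openai": "gpt-3.5-turbo",
        "anthropic": "claude-instant-1",
        "ollama": "ollama/llama2",                    # local Ollama server
        "replicate": "replicate/llama-2-70b-chat",    # assumed model id
    }
    kwargs = {
        "model": models[provider],
        "messages": [{"role": "user", "content": prompt}],
    }
    if provider == "ollama":
        # Ollama's default local port; override if your server differs.
        kwargs["api_base"] = "http://localhost:11434"
    return kwargs
```

The point is that swapping providers only changes the `model` string (and possibly `api_base`), which is what makes a single integration cover 100+ LLMs.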

Addressing: https://github.com/vocodedev/vocode-python/issues/375 https://github.com/vocodedev/vocode-python/issues/8

ishaan-jaff avatar Oct 05 '23 23:10 ishaan-jaff

Can I get a review on this PR @ajar98 @Kian1354 ?

ishaan-jaff avatar Oct 05 '23 23:10 ishaan-jaff

Do you want one file called llm.py where all the LiteLLM calls happen?

ishaan-jaff avatar Oct 05 '23 23:10 ishaan-jaff

Hello. Maybe it's better to create a separate LiteLLM agent with its own config rather than overwriting the native OpenAI agent?

skripnik avatar Dec 21 '23 04:12 skripnik
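A separate agent along these lines could be as small as a config object that translates into `litellm.completion()` keyword arguments. The class and field names below are hypothetical, not part of vocode's actual API; this is only a sketch of the shape such an agent config might take:

```python
# Hypothetical sketch: LiteLLMAgentConfig is NOT a real vocode class.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class LiteLLMAgentConfig:
    """Config for a standalone LiteLLM agent (names are assumptions)."""
    model: str                                  # e.g. "ollama/llama2"
    api_base: Optional[str] = None              # for local/self-hosted servers
    generation_params: dict = field(default_factory=dict)  # temperature, etc.

    def completion_kwargs(self, messages: list) -> dict:
        """Translate the config plus a message list into litellm.completion() kwargs."""
        kwargs = {"model": self.model, "messages": messages, **self.generation_params}
        if self.api_base:
            kwargs["api_base"] = self.api_base
        return kwargs
```

Keeping this separate from the OpenAI agent means the native path is untouched and provider-specific knobs live in one place.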

This PR has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

github-actions[bot] avatar Mar 09 '24 01:03 github-actions[bot]

hi @ishaan-jaff

I think creating a separate LiteLLM agent would be ideal to maintain modularity and ease of integration with different models. If needed, I'm willing to work on this based on the current PR, ensuring a smooth and efficient addition. Thanks for considering this approach, and I appreciate all the efforts in enhancing our project's capabilities.

arpagon avatar Mar 09 '24 01:03 arpagon

This PR has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

github-actions[bot] avatar May 09 '24 01:05 github-actions[bot]

This PR has been automatically closed due to inactivity. Thank you for your contributions.

github-actions[bot] avatar May 16 '24 01:05 github-actions[bot]

Has anyone implemented this with LlamaCpp? If so, can you share it? Any progress or code anyone can share would be greatly appreciated.

shriharshan avatar Aug 08 '24 09:08 shriharshan
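One possible route, not verified against this repo: llama.cpp's built-in server exposes an OpenAI-compatible API, so LiteLLM can talk to it through its `openai/` model prefix with a custom `api_base`. The URL and model name below are assumptions for a default local setup:

```python
# Sketch: point LiteLLM at a local llama.cpp server via its
# OpenAI-compatible endpoint. URL and model name are assumptions.
def llamacpp_kwargs(prompt: str, api_base: str = "http://localhost:8080/v1") -> dict:
    """Build litellm.completion() kwargs targeting a local llama.cpp server."""
    return {
        # The "openai/" prefix tells LiteLLM to speak the OpenAI protocol
        # to whatever server api_base points at.
        "model": "openai/local-model",
        "api_base": api_base,
        "messages": [{"role": "user", "content": prompt}],
    }
```

You would then pass these kwargs to `litellm.completion(**llamacpp_kwargs("hello"))` with the llama.cpp server running.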