
Add LiteLLM - support for Vertex AI, Gemini, Anthropic, Bedrock (100+ LLMs)

Open · ishaan-jaff opened this issue 1 year ago · 3 comments


What's changing

This PR adds support for the above-mentioned LLMs using LiteLLM (https://github.com/BerriAI/litellm/). LiteLLM is a lightweight package that simplifies LLM API calls: use any LLM as a drop-in replacement for gpt-4o.

Example

from litellm import completion
import os

# set ENV variables
os.environ["OPENAI_API_KEY"] = "your-openai-key"
os.environ["ANTHROPIC_API_KEY"] = "your-anthropic-key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="openai/gpt-4o", messages=messages)

# anthropic call
response = completion(model="anthropic/claude-3-sonnet-20240229", messages=messages)
print(response)

Response (OpenAI Format)

{
    "id": "chatcmpl-565d891b-a42e-4c39-8d14-82a1f5208885",
    "created": 1734366691,
    "model": "claude-3-sonnet-20240229",
    "object": "chat.completion",
    "system_fingerprint": null,
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {
                "content": "Hello! As an AI language model, I don't have feelings, but I'm operating properly and ready to assist you with any questions or tasks you may have. How can I help you today?",
                "role": "assistant",
                "tool_calls": null,
                "function_call": null
            }
        }
    ],
    "usage": {
        "completion_tokens": 43,
        "prompt_tokens": 13,
        "total_tokens": 56,
        "completion_tokens_details": null,
        "prompt_tokens_details": {
            "audio_tokens": null,
            "cached_tokens": 0
        },
        "cache_creation_input_tokens": 0,
        "cache_read_input_tokens": 0
    }
}
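Since every provider's response is normalized to the OpenAI format above, downstream code can read it uniformly. A minimal sketch of consuming that payload as plain JSON; the field names match the response shown, but the values here are illustrative:

```python
import json

# Illustrative payload in the OpenAI response format shown above.
raw = """
{
    "model": "claude-3-sonnet-20240229",
    "choices": [
        {
            "finish_reason": "stop",
            "index": 0,
            "message": {"content": "Hello! How can I help you today?",
                        "role": "assistant"}
        }
    ],
    "usage": {"completion_tokens": 43, "prompt_tokens": 13, "total_tokens": 56}
}
"""

data = json.loads(raw)

# The assistant's reply and token accounting live at the same paths
# regardless of which provider served the request.
content = data["choices"][0]["message"]["content"]
total = data["usage"]["total_tokens"]
print(content)
print(total)
```

The same paths (`choices[0].message.content`, `usage.total_tokens`) apply whether the underlying model was OpenAI, Anthropic, or any other provider routed through LiteLLM.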

ishaan-jaff · Jan 01 '25 18:01

@aldro61 @AndrewRWilliams can I get a review on this PR?

ishaan-jaff · Jan 01 '25 18:01

@ishaan-jaff Thank you for this PR!!

A few things:

  1. Can you also add litellm to the requirements?
  2. If some models require new API keys (such as an Anthropic API key), can you add them to the environment variables list in README.md? Please prepend LiteLLM-specific keys with CIK_LITELLM_ so it's easier to distinguish the new keys.
  3. Have you tested whether this works with some LLMs? I'm very curious to know if any specific LLMs' performance stood out.
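The CIK_LITELLM_ prefix requested in point 2 could be handled with a small shim that copies prefixed variables into the un-prefixed names LiteLLM reads. This is only a sketch of the reviewer's naming convention; the helper name and variables are hypothetical, not part of the repo:

```python
import os

PREFIX = "CIK_LITELLM_"

def export_litellm_keys(env=os.environ):
    """Copy e.g. CIK_LITELLM_ANTHROPIC_API_KEY -> ANTHROPIC_API_KEY.

    Returns a dict of the un-prefixed names that were set.
    """
    exported = {}
    for name, value in list(env.items()):
        if name.startswith(PREFIX):
            target = name[len(PREFIX):]
            env[target] = value
            exported[target] = value
    return exported

# Example: simulate the environment with a plain dict.
fake_env = {"CIK_LITELLM_ANTHROPIC_API_KEY": "sk-ant-...", "PATH": "/usr/bin"}
print(export_litellm_keys(fake_env))
```

Running the shim once at startup keeps the project's documented keys namespaced while still letting LiteLLM find the provider keys it expects.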

Thanks again!

ashok-arjun · Jan 06 '25 21:01

Hi @ishaan-jaff, are you still working on this PR?

ashok-arjun · May 03 '25 20:05

Closing this PR as there has been no activity.

AndrewRWilliams · Jul 11 '25 20:07