
Amazon Bedrock Client for AutoGen

Open · Hk669 opened this issue 1 year ago • 11 comments

Why are these changes needed?

Amazon Bedrock offers a broad set of capabilities for building generative AI applications with security, privacy, and responsible AI. Bedrock provides access to the latest open-source and closed-source models for inference, making it easier for users to manage the complete infrastructure from AWS. This addition will extend AutoGen to support models served through Bedrock.

API Documentation: https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_Converse.html
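As a rough illustration of the Converse API this client targets, here is a hedged sketch of building a request payload for boto3's `bedrock-runtime` client. The model ID and inference settings are just examples taken from the config below; the actual client code may shape requests differently.

```python
# Sketch of a request payload for the Bedrock Converse API (see the
# API reference linked above). Payload construction only; the network
# call is left commented out since it needs AWS credentials.

def build_converse_request(model_id: str, user_text: str) -> dict:
    """Assemble the keyword arguments for bedrock-runtime's converse() call."""
    return {
        "modelId": model_id,
        "messages": [
            {"role": "user", "content": [{"text": user_text}]},
        ],
        "inferenceConfig": {"maxTokens": 512, "temperature": 0.0},
    }

request = build_converse_request("meta.llama3-1-8b-instruct-v1:0", "Hello!")

# To actually send the request (requires valid AWS credentials):
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-west-2")
# response = client.converse(**request)
# print(response["output"]["message"]["content"][0]["text"])
```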

For testing:

# Authentication parameters:
# aws_region (mandatory)
# aws_access_key (or environment variable: AWS_ACCESS_KEY)
# aws_secret_key (or environment variable: AWS_SECRET_KEY)
# aws_session_token (or environment variable: AWS_SESSION_TOKEN)
# aws_profile_name

config_list = [
    {
        "api_type": "bedrock",
        "model": "meta.llama3-1-8b-instruct-v1:0",
        "aws_region_name": "us-west-2",
        "aws_access_key": "",
        "aws_secret_key": "",
        "price" : [0.003, 0.015]
    }
]
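A hypothetical sketch of the environment-variable fallback behaviour described in the comments above. The function name and exact precedence here are assumptions for illustration, not the client's actual code.

```python
import os

# Hypothetical helper illustrating the env-var fallbacks listed above;
# the real Bedrock client may resolve credentials differently.
ENV_FALLBACKS = {
    "aws_access_key": "AWS_ACCESS_KEY",
    "aws_secret_key": "AWS_SECRET_KEY",
    "aws_session_token": "AWS_SESSION_TOKEN",
}

def resolve_aws_auth(config: dict) -> dict:
    """Fill empty auth fields from the environment; the region is mandatory."""
    resolved = dict(config)
    for key, env_var in ENV_FALLBACKS.items():
        if not resolved.get(key):
            resolved[key] = os.environ.get(env_var)
    if not resolved.get("aws_region_name"):
        raise ValueError("aws_region_name is mandatory")
    return resolved
```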

Related issue number

Checks

  • [X] I've included any doc changes needed for https://microsoft.github.io/autogen/. See https://microsoft.github.io/autogen/docs/Contribute#documentation to build and test documentation locally.
  • [X] I've added tests (if relevant) corresponding to the changes introduced in this PR.
  • [X] I've made sure all auto checks have passed.

Hk669 avatar Jul 27 '24 18:07 Hk669

I've committed the first full draft of the client class, largely based on (Discord) @astroalek and @Christian T's code, thanks!

Still plenty of testing to do (I haven't tested images yet). Streaming is not currently supported, and I think we can go without it for the first round.

marklysze avatar Aug 01 '24 10:08 marklysze

Codecov Report

Attention: Patch coverage is 16.04938% with 204 lines in your changes missing coverage. Please review.

Project coverage is 20.05%. Comparing base (6279247) to head (467c5fe). Report is 111 commits behind head on main.

Files Patch % Lines
autogen/oai/bedrock.py 14.97% 193 Missing :warning:
autogen/oai/client.py 38.46% 7 Missing and 1 partial :warning:
autogen/logger/file_logger.py 0.00% 1 Missing :warning:
autogen/logger/sqlite_logger.py 0.00% 1 Missing :warning:
autogen/runtime_logging.py 0.00% 1 Missing :warning:
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #3232       +/-   ##
===========================================
- Coverage   32.90%   20.05%   -12.86%     
===========================================
  Files          94      102        +8     
  Lines       10235    11012      +777     
  Branches     2193     2526      +333     
===========================================
- Hits         3368     2208     -1160     
- Misses       6580     8582     +2002     
+ Partials      287      222       -65     
Flag Coverage Δ
unittests 20.01% <16.04%> (-12.90%) :arrow_down:

Flags with carried forward coverage won't be shown.


codecov-commenter avatar Aug 01 '24 10:08 codecov-commenter

Updated to support images in the request, example:

# This tests a model describing an image.

altmodel_llm_config = {
    "config_list":
    [
        {
            "api_type": "bedrock",
            "model": "anthropic.claude-3-sonnet-20240229-v1:0",
            "aws_region_name": "us-east-1",
            "aws_access_key_id": "",
            "aws_secret_access_key": "",
            "cache_seed": None
        }
    ]
}

import autogen
from autogen import Agent, AssistantAgent, ConversableAgent, UserProxyAgent
from autogen.agentchat.contrib.capabilities.vision_capability import VisionCapability
from autogen.agentchat.contrib.img_utils import get_pil_image, pil_to_data_uri
from autogen.agentchat.contrib.multimodal_conversable_agent import MultimodalConversableAgent
from autogen.code_utils import content_str

image_agent = MultimodalConversableAgent(
    name="image-explainer",
    max_consecutive_auto_reply=10,
    llm_config=altmodel_llm_config,
)

user_proxy = autogen.UserProxyAgent(
    name="User_proxy",
    system_message="A human admin.",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=0,
    code_execution_config={
        "use_docker": False
    },  # Please set use_docker=True if docker is available to run the generated code. Using docker is safer than running the generated code directly.
)

# Ask the question with an image
result = user_proxy.initiate_chat(
    image_agent,
    message="""What's the breed of this dog?
<img https://th.bing.com/th/id/R.422068ce8af4e15b0634fe2540adea7a?rik=y4OcXBE%2fqutDOw&pid=ImgRaw&r=0>.""",
)

print(result.summary)
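For context, the message above embeds an image with AutoGen's `<img URL>` tag. Here is a rough, hypothetical sketch of how such a message might be split into text and image parts; the real parsing lives in `autogen.agentchat.contrib.img_utils` and may differ.

```python
import re

def split_img_tags(message: str) -> list:
    """Split a message containing <img URL> tags into text and image parts."""
    parts = []
    # The capture group keeps the <img ...> tags in the split output.
    for piece in re.split(r"(<img [^>]+>)", message):
        if not piece:
            continue
        match = re.match(r"<img (.+)>", piece)
        if match:
            parts.append({"type": "image_url", "image_url": {"url": match.group(1)}})
        else:
            parts.append({"type": "text", "text": piece})
    return parts
```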

marklysze avatar Aug 07 '24 00:08 marklysze

@Hk669, I can't add you as a reviewer, but if you are able to review the code it would be great.

marklysze avatar Aug 08 '24 03:08 marklysze

Hey @wenngong, @joris-swapfiets, if you are able to help test this dedicated Amazon Bedrock client class it would be appreciated :).

marklysze avatar Aug 08 '24 03:08 marklysze

Hey @wenngong, @joris-swapfiets, if you are able to help test this dedicated Amazon Bedrock client class it would be appreciated :).

@marklysze, tested; the Bedrock client works fine in my testing.

Zizo-Vi avatar Aug 08 '24 10:08 Zizo-Vi

Oh, I just stumbled upon this PR after we finished implementing a custom Bedrock agent for a client (heavily modifying the anthropic.py file); our agents, running in Lambda, now use the Bedrock API to connect to any model without issues.
I will wait until this implementation is finished so we can compare approaches 💪🏼

Bateristico avatar Aug 08 '24 18:08 Bateristico

Oh, I just stumbled upon this PR after we finished implementing a custom Bedrock agent for a client (heavily modifying the anthropic.py file); our agents, running in Lambda, now use the Bedrock API to connect to any model without issues.
I will wait until this implementation is finished so we can compare approaches 💪🏼

Hey @Bateristico, thanks for the comment. Sounds good, if you do notice any areas of improvement, please feel free to shout them out. :)

marklysze avatar Aug 08 '24 19:08 marklysze

Hey @wenngong, @joris-swapfiets, if you are able to help test this dedicated Amazon Bedrock client class it would be appreciated :).

@marklysze, tested; the Bedrock client works fine in my testing.

Thanks so much @wenngong! If you get a chance to approve it, that would be great :)

marklysze avatar Aug 08 '24 22:08 marklysze

@wenngong, thanks for your review, I've updated accordingly. @Hk669, I'll review tests again when you have had a chance to note which ones can be removed.

marklysze avatar Aug 09 '24 20:08 marklysze

Thanks for approving @wenngong, @Hk669 - are you happy to keep tests as is or would you like to have some removed? If you are happy to keep as is I'll mark as approved :).

marklysze avatar Aug 12 '24 06:08 marklysze

looks good to me 👍, thanks for the efforts @marklysze

Hk669 avatar Aug 16 '24 13:08 Hk669

looks good to me 👍, thanks for the efforts @marklysze

Thanks @Hk669! I'll approve on your behalf :)

marklysze avatar Aug 16 '24 23:08 marklysze