CrewAI-Studio

Added Amazon Bedrock integration with flagship models to CrewAI-Studio

Open taanisshhaa opened this pull request 5 months ago • 0 comments

This PR introduces Amazon Bedrock support to CrewAI-Studio, enabling users to run AWS-hosted foundation models seamlessly alongside the existing providers.

-> What’s New:

  • Added a new Bedrock provider in LLM_CONFIG (a rough sketch of the entry follows this list).
  • Exposed 5 widely used flagship models:
    • amazon.nova-pro-v1:0
    • amazon.titan-text-express-v1
    • anthropic.claude-3-5-sonnet-20240620-v1:0
    • meta.llama3-1-70b-instruct-v1:0
    • mistral.mixtral-8x7b-instruct-v0:1
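
For illustration, here is roughly how the Bedrock entry could look in LLM_CONFIG. The exact keys depend on CrewAI-Studio's existing config layout, so treat the structure below as an assumption rather than the literal diff; only the five model IDs come from this PR:

```python
# Illustrative sketch only: the real LLM_CONFIG in CrewAI-Studio may use
# different keys; the model IDs are the five exposed by this PR.
LLM_CONFIG = {
    # ... existing providers ...
    "Bedrock": {
        "models": [
            "amazon.nova-pro-v1:0",
            "amazon.titan-text-express-v1",
            "anthropic.claude-3-5-sonnet-20240620-v1:0",
            "meta.llama3-1-70b-instruct-v1:0",
            "mistral.mixtral-8x7b-instruct-v0:1",
        ],
    },
}
```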

-> Implementation:

  • Added a create_bedrock_llm function for Bedrock model instantiation (a minimal sketch follows this list).
  • Updated LLM_CONFIG to include Bedrock provider and its models.
  • Enabled support for AWS environment variables:
    • AWS_ACCESS_KEY_ID
    • AWS_SECRET_ACCESS_KEY
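
A minimal sketch of what create_bedrock_llm can look like, assuming CrewAI's LLM class and LiteLLM's "bedrock/<model_id>" routing and checking the two AWS environment variables listed above; the actual signature, defaults, and error handling in this PR may differ:

```python
import os

from crewai import LLM


def create_bedrock_llm(model: str, temperature: float = 0.1) -> LLM:
    """Instantiate a Bedrock-backed LLM for use in CrewAI-Studio (sketch)."""
    # The integration relies on the standard AWS environment variables;
    # fail early with a clear message if they are missing.
    if not (os.getenv("AWS_ACCESS_KEY_ID") and os.getenv("AWS_SECRET_ACCESS_KEY")):
        raise ValueError("AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY must be set")

    # CrewAI dispatches model strings through LiteLLM, which routes the
    # "bedrock/" prefix to Amazon Bedrock.
    return LLM(model=f"bedrock/{model}", temperature=temperature)
```

Feeding the model IDs from the Bedrock entry in LLM_CONFIG through a helper like this keeps the provider consistent with how the other providers are instantiated.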

-> Testing:

  • Verified locally by running CrewAI-Studio with AWS credentials.
  • Successfully ran completions across the Nova, Titan, Claude, Llama, and Mixtral models (a standalone smoke test is sketched below).
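
For anyone reproducing the test, a quick way to check Bedrock access outside the Studio UI is a direct LiteLLM completion call. This is just a convenience check, not the exact procedure used above; it assumes the AWS credentials are exported and that Bedrock access is enabled for the chosen model:

```python
import litellm

# Requires AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and a region
# (e.g. AWS_REGION_NAME) in the environment. Any of the five model IDs
# above works here, prefixed with "bedrock/".
response = litellm.completion(
    model="bedrock/anthropic.claude-3-5-sonnet-20240620-v1:0",
    messages=[{"role": "user", "content": "Reply with a one-line greeting."}],
)
print(response.choices[0].message.content)
```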

-> Why This Matters:

  • Amazon Bedrock is becoming a key entry point for production-ready LLMs from Amazon, Anthropic, Meta, and Mistral.
  • Adding this integration:
    1. Expands CrewAI-Studio’s supported ecosystem.
    2. Gives users access to both proprietary and open-source models with a single, consistent interface.

-> Personal Note:

  • While working with CrewAI-Studio, I wanted to use Amazon Bedrock but noticed it wasn't available. Since Bedrock is becoming a major provider, I thought it would be useful to add support. This PR is my attempt to fill that gap and hopefully make CrewAI-Studio more helpful for others in the same situation.

taanisshhaa · Aug 18 '25 22:08