[REQUEST] Add Native LiteLLM Provider with Dynamic Model Discovery and MCP Gateway
Feature Request
Is your feature request related to a problem? Please describe.
Currently, Sim maintains its own list of supported models in the Agent block dropdown. This requires:
- Code updates to add new models
- Rebuilds/redeployments when models change
- Maintenance of model-specific configurations
LiteLLM has already solved model management for 100+ providers and handles:
- Model discovery
- Provider configuration
- Authentication
- Updates for new releases
Describe the solution you'd like
Add LiteLLM as a native provider and delegate model management to it:
Configuration:
LITELLM_PROVIDER=enabled
LITELLM_PROXY_URL=http://litellm:4000
LITELLM_API_KEY=sk-1234
How it works:
- Sim queries LiteLLM's /v1/models endpoint
- Agent block dropdown shows all models configured in LiteLLM
- Model list updates automatically when LiteLLM config changes
- No Sim code changes needed to support new models
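A minimal sketch of the discovery step. The /v1/models response shape is LiteLLM's standard OpenAI-compatible model list; the Sim-side function and option names (`toDropdownOptions`, `fetchLiteLLMModels`, `DropdownOption`) are hypothetical, just to show how little glue code this would need:

```typescript
// Shape of LiteLLM's OpenAI-compatible GET /v1/models response.
interface ModelListResponse {
  object: string;
  data: { id: string; object: string; owned_by?: string }[];
}

// Hypothetical Sim-side dropdown option shape.
interface DropdownOption {
  label: string;
  value: string;
}

// Pure mapping step: turn the proxy's model list into dropdown options,
// sorted alphabetically so the dropdown is stable across refreshes.
function toDropdownOptions(payload: ModelListResponse): DropdownOption[] {
  return payload.data
    .map((m) => ({ label: m.id, value: m.id }))
    .sort((a, b) => a.label.localeCompare(b.label));
}

// Fetch step: query the proxy configured via LITELLM_PROXY_URL / LITELLM_API_KEY.
async function fetchLiteLLMModels(
  proxyUrl: string,
  apiKey: string,
): Promise<DropdownOption[]> {
  const res = await fetch(`${proxyUrl}/v1/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`LiteLLM /v1/models failed: ${res.status}`);
  return toDropdownOptions((await res.json()) as ModelListResponse);
}
```

Because the mapping is a pure function of the proxy response, refreshing the dropdown is just re-running the fetch; no model metadata has to live in Sim.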
Benefits:
Model Management:
- Decoupled model management - LiteLLM handles model catalog
- Instant access - New models available immediately after LiteLLM config update
- No Sim updates needed - Model list managed externally
- Broader model support - Access to 100+ providers via LiteLLM
MCP Gateway Integration:
- Centralized MCP management - Configure MCP servers once in LiteLLM, accessible across all Sim workspaces
- Multi-transport support - HTTP, SSE, stdio without Sim code changes
- Access control - Team-based and key-level MCP permissions via LiteLLM
- Tool filtering - Control which MCP tools are available per team
- Cost tracking - Track MCP tool call costs per team/project
Prompt Management:
- Version control - Track prompt changes across deployments
- Template management - Centralized prompt templates and variables
- Audit trail - Standardized logging for prompt usage
Describe alternatives you've considered
Alternative 1: Continue maintaining models in Sim
- Requires ongoing maintenance for new models
- Slower to add new providers
- Duplicate effort since LiteLLM already manages this
Alternative 2: Use Azure OpenAI endpoint workaround
- Works but limited to models in Sim's dropdown
- Doesn't leverage LiteLLM's MCP Gateway or prompt management
Alternative 3: Allow custom model text input
Additional context
Architectural benefit: Separation of concerns - Sim focuses on workflow orchestration while LiteLLM handles model infrastructure and MCP gateway.
┌──────────────┐
│     Sim      │  Handles: Workflows, blocks, execution
│  Agent Block │───────┐
└──────────────┘       │
                       ▼
               ┌──────────────┐
               │   LiteLLM    │  Handles: Models, MCP, prompts
               │              │
               │ - Discovery  │  Features:
               │ - Routing    │  • 100+ model providers
               │ - MCP Gateway│  • MCP access control
               │ - Prompts    │  • Cost tracking
               └──────────────┘  • Guardrails
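The execution side of this split stays equally small, because LiteLLM's proxy exposes the OpenAI-compatible /v1/chat/completions endpoint for every upstream provider. A sketch under that assumption (the function names are hypothetical, not Sim's actual internals):

```typescript
// Message shape shared by the OpenAI-compatible chat API.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the request the Agent block would send through the proxy.
// The path and body are identical regardless of which provider the
// model id resolves to; LiteLLM routes on the `model` field.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    path: "/v1/chat/completions",
    body: { model, messages },
  };
}

// Hypothetical execution step: POST the built request to the proxy.
async function runAgentStep(
  proxyUrl: string,
  apiKey: string,
  model: string,
  messages: ChatMessage[],
): Promise<string> {
  const req = buildChatRequest(model, messages);
  const res = await fetch(`${proxyUrl}${req.path}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(req.body),
  });
  if (!res.ok) throw new Error(`LiteLLM completion failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Swapping "bedrock/anthropic.claude-3-sonnet" for "gpt-4o" in the `model` field is the entire provider switch; Sim never touches provider-specific SDKs or auth.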
Examples unlocked:
Models:
- AWS Bedrock (Claude, Llama, Titan, etc.)
- Custom Azure deployments
- Self-hosted models (vLLM, TGI)
- Vertex AI, Replicate, Together AI
MCP:
- Team-based MCP server access
- OAuth 2.0 for MCP authentication
- Cost allocation per team
- Tool-level filtering
Reference: LiteLLM MCP Gateway
Impact: Reduces Sim's maintenance burden while expanding model access and enabling enterprise MCP governance.
This would be awesome, along with the inevitable follow-up request from anyone who's using it: "can we hide all the other models that aren't from LiteLLM?"