Some models are just better at certain things. Claude crushes long-form code edits. Gemini handles video attachments without breaking a sweat. GPT excels at complex reasoning. No single model wins at everything.
So instead of committing to one provider and hoping it works for every task, you set up multiple providers and route each request to whichever model fits best. Cost, capability, availability - you get to optimize for all three.
We'll create a model mapping system that makes provider management trivial. Initialize all your providers once, map them to simple slugs, and let your agent pick the right model for each request.
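To make that concrete before diving in, here's a dependency-free sketch of the kind of routing this enables. The task names and the default slug are our own illustrations, not part of any SDK:

```typescript
// Hypothetical task-to-slug routing table.
const slugForTask: Record<string, string> = {
  'code-edit': 'claude-sonnet-4-5',        // long-form code edits
  'video-analysis': 'gemini-3-flash',      // video attachments
  'complex-reasoning': 'gpt-5.1-codex-max',
};

function pickModel(task: string): string {
  // Fall back to a sensible default for tasks we haven't classified.
  return slugForTask[task] ?? 'claude-sonnet-4-5';
}
```

The slugs returned here are exactly what the mapping in the next section resolves to provider instances.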
The Mental Model
Here's how it works: you create a mapping object that connects model slugs (like claude-sonnet-4-5 or gemini-3-flash) to actual provider instances.
```typescript
private modelMapping: Record<string, any> = {
  'claude-sonnet-4-5': anthropic('claude-sonnet-4-5-20250929'),
  'gemini-3-flash': google('gemini-3-flash-preview'),
  'gpt-5.1-codex-max': openai('gpt-5.1-codex-max'),
};
```

Your agent code references models by slug. When it needs to make a request, it looks up the slug in the mapping and gets the provider instance. Simple, testable, and easy to modify.
Want to switch from direct Anthropic to Bedrock? Change the mapping. Want to A/B test different models? Add a slug and route requests to it. The rest of your code doesn't change.
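Here's a sketch of that swap, with the provider factories stubbed out as plain functions so it stands alone. ModelRef, the backend field, the stub factories, and the Bedrock model ID are all illustrative stand-ins for the real SDK instances:

```typescript
// Stand-ins for the real provider factories, so the swap can be
// shown without SDK dependencies.
type ModelRef = { backend: string; modelId: string };
const anthropic = (id: string): ModelRef => ({ backend: 'anthropic', modelId: id });
const bedrock = (id: string): ModelRef => ({ backend: 'bedrock', modelId: id });

// Before: the slug routes to direct Anthropic.
let modelMapping: Record<string, ModelRef> = {
  'claude-sonnet-4-5': anthropic('claude-sonnet-4-5-20250929'),
};

// After: same slug, Bedrock backend (illustrative Bedrock model ID).
// Callers never notice.
modelMapping = {
  'claude-sonnet-4-5': bedrock('global.anthropic.claude-sonnet-4-5-20250929-v1:0'),
};
```

The slug is the stable contract; everything behind it is swappable.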
Setting Up Providers
Let's build the actual implementation. We'll create a method that initializes all providers and builds the mapping.
The Basic Structure
Start with a method that lives in your agent class:
```typescript
// At the top of your file:
import { createOpenAI } from '@ai-sdk/openai';
import { createAnthropic } from '@ai-sdk/anthropic';
import { createGoogleGenerativeAI } from '@ai-sdk/google';

private initializeModelMapping() {
  // Initialize each provider
  const openai = createOpenAI({
    apiKey: process.env.OPENAI_API_KEY,
  });
  const anthropic = createAnthropic({
    apiKey: process.env.ANTHROPIC_API_KEY,
  });
  const google = createGoogleGenerativeAI({
    apiKey: process.env.GOOGLE_API_KEY,
  });

  // Map slugs to provider instances
  this.modelMapping = {
    'gpt-5.1-codex-max': openai('gpt-5.1-codex-max'),
    'claude-sonnet-4-5': anthropic('claude-sonnet-4-5-20250929'),
    'gemini-3-flash': google('gemini-3-flash-preview'),
  };
}
```

Call this from your constructor, before you do anything else. Your agent needs these providers initialized before it can handle requests.
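A minimal sketch of that wiring. The class name follows this guide; the provider calls are stubbed with plain objects so the skeleton stands alone, and availableModels is our own helper for inspection:

```typescript
// Skeleton of the agent class: the constructor builds the mapping
// before anything else runs. Stub objects stand in for the real
// anthropic(...) / openai(...) provider instances.
class CodingAgent {
  private modelMapping: Record<string, unknown> = {};

  constructor() {
    this.initializeModelMapping(); // first thing, before handling requests
  }

  private initializeModelMapping() {
    this.modelMapping = {
      'claude-sonnet-4-5': { stub: true }, // anthropic('claude-sonnet-4-5-20250929')
      'gpt-5.1-codex-max': { stub: true }, // openai('gpt-5.1-codex-max')
    };
  }

  availableModels(): string[] {
    return Object.keys(this.modelMapping);
  }
}
```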
OpenAI
```typescript
import { createOpenAI } from '@ai-sdk/openai';

const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
});

this.modelMapping['gpt-5.1-codex-max'] = openai('gpt-5.1-codex-max');
```

Anthropic
```typescript
import { anthropic } from '@ai-sdk/anthropic';

this.modelMapping['claude-sonnet-4-5'] = anthropic('claude-sonnet-4-5-20250929');
this.modelMapping['claude-opus-4.5'] = anthropic('claude-opus-4-5-20251101');
```

Google (AI Studio)
```typescript
import { google } from '@ai-sdk/google';

this.modelMapping['gemini-3-flash'] = google('gemini-3-flash-preview');
```

Google Vertex AI
For enterprise features (VPC-SC, data residency, better rate limits) or video attachments:
```typescript
import { createVertex } from '@ai-sdk/google-vertex';
import { createVertexAnthropic } from '@ai-sdk/google-vertex/anthropic';

const vertex = createVertex({
  project: process.env.GOOGLE_VERTEX_PROJECT,
  location: process.env.GOOGLE_VERTEX_LOCATION || 'us-central1',
});
const vertexAnthropic = createVertexAnthropic({
  project: process.env.GOOGLE_VERTEX_PROJECT,
  location: process.env.GOOGLE_VERTEX_LOCATION || 'us-central1',
});

this.modelMapping['gemini-3-flash-vertex'] = vertex('gemini-3-flash-preview');
this.modelMapping['claude-sonnet-vertex'] = vertexAnthropic('claude-sonnet-4-5@20250929');
```

AWS Bedrock
For AWS billing, IAM controls, and CloudWatch logging:
```typescript
import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock';

const bedrock = createAmazonBedrock({
  region: process.env.AWS_REGION_NAME,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
});

this.modelMapping['claude-opus-4.5'] = bedrock('global.anthropic.claude-opus-4-5-20251101-v1:0');
this.modelMapping['claude-haiku-4-5'] = bedrock('us.anthropic.claude-haiku-4-5-20251001-v1:0');
```

Note: Model IDs differ from direct Anthropic. Check AWS's model catalog for exact strings.
OpenRouter
Unified API for dozens of models. Good for testing, regional availability, and avoiding rate limits:
```typescript
import { createOpenRouter } from '@openrouter/ai-sdk-provider';

const openrouter = createOpenRouter({
  apiKey: process.env.OPENROUTER_API_KEY,
});

this.modelMapping['claude-sonnet-4-5'] = openrouter('anthropic/claude-sonnet-4-5');
this.modelMapping['gemini-3-flash'] = openrouter('google/gemini-3-flash-preview');
this.modelMapping['gpt-4o'] = openrouter('openai/gpt-4o');
```

Prefix model names with the provider (anthropic/, google/, openai/). No separate per-provider API keys needed; one OpenRouter key covers them all.
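Since OpenRouter reaches the same models, one use is as a fallback when a slug isn't wired up in your primary mapping. A dependency-free sketch; ModelRef, the via field, and the resolve helper are our own illustrations:

```typescript
// Hypothetical fallback wiring: check the primary mapping first,
// then a secondary mapping routed through OpenRouter.
type ModelRef = { via: string; modelId: string };

const primary: Record<string, ModelRef> = {
  'claude-sonnet-4-5': { via: 'anthropic', modelId: 'claude-sonnet-4-5-20250929' },
};
const fallback: Record<string, ModelRef> = {
  'claude-sonnet-4-5': { via: 'openrouter', modelId: 'anthropic/claude-sonnet-4-5' },
  'gemini-3-flash': { via: 'openrouter', modelId: 'google/gemini-3-flash-preview' },
};

function resolve(slug: string): ModelRef | undefined {
  return primary[slug] ?? fallback[slug];
}
```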
Connecting to Your Agent
Call initializeModelMapping() in your constructor. When building your agent, look up the model from the mapping:
```typescript
private buildAgent(modelSlug: string) {
  const model = this.modelMapping[modelSlug];
  if (!model) {
    throw new Error(`Unknown model slug: ${modelSlug}`);
  }

  return new Agent({
    name: 'Coding Agent',
    instructions: this.createPrompt(),
    tools: this.loadTools(modelSlug),
    model,
  });
}
```

Environment Variables
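Every provider reads its credentials from the environment, so it helps to fail fast at startup when a key is missing rather than getting opaque auth errors mid-request. A minimal sketch; requireEnv is our own helper name, not part of any SDK:

```typescript
// Throw at startup if any required environment variable is unset.
function requireEnv(names: string[]): void {
  const missing = names.filter((name) => !process.env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(', ')}`);
  }
}
```

Call it once before initializeModelMapping(), listing only the keys for the providers you actually initialize, e.g. requireEnv(['OPENAI_API_KEY', 'ANTHROPIC_API_KEY']).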
```shell
# .env
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
OPENROUTER_API_KEY=sk-or-...

# AWS Bedrock
AWS_REGION_NAME=us-east-1
AWS_ACCESS_KEY_ID=...
AWS_SECRET_ACCESS_KEY=...

# Google Vertex AI
GOOGLE_VERTEX_PROJECT=my-project
GOOGLE_VERTEX_LOCATION=us-central1
```

What's Next
You have providers set up. Next guide: building tools your agent can actually call - reading files, searching code, executing commands.
That's where things get interesting.