AI Gateway setup
AI Gateway sits between your app and AI providers (Anthropic, OpenAI, etc.). It handles authentication, rate limiting, retries, cost tracking, and model fallbacks—all the production concerns you don't want to build yourself.
Outcome
Configure AI Gateway with an API key and install the AI SDK so you're ready to generate AI summaries.
Fast Track
- Vercel Dashboard → AI Gateway → API Keys → Create Key named `ai-review-summary-key`
- Add `AI_GATEWAY_API_KEY=gw_xxx` to Vercel env vars (all 3 environments) AND `.env.local`
- Run `pnpm add ai`, push to trigger redeploy with new env vars
Hands-on Exercise 2.1
Set up AI Gateway for your deployed app:
Requirements:
- Create an AI Gateway API key in the Vercel dashboard
- Add `AI_GATEWAY_API_KEY` to Vercel environment variables
- Add the key to your local `.env.local` file
- Install the `ai` package (Vercel AI SDK)
- Redeploy to apply environment variables
Implementation hints:
- AI Gateway is in the Vercel dashboard sidebar
- Environment variables need to be added for all environments (Production, Preview, Development)
- Use `.env.local.example` as a template
- The `ai` package is version 5.0+ (unified SDK)
- After adding env vars, trigger a new deployment
Step 1: Create AI Gateway API Key
- Go to your Vercel Dashboard
- Click AI Gateway in the top navigation bar
- Click Create API Key in the sidebar
- Click Create Key
- Name it `ai-review-summary-key`
- Copy the key (you'll need it in the next steps)
Key format:
AI_GATEWAY_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxx
This key provides access to AI models. Never commit it to git or share it publicly.
Step 2: Add to Vercel Environment Variables
- Go to your project in Vercel dashboard
- Navigate to Settings → Environment Variables
- Click Add New
- Enter details:
  - Name: `AI_GATEWAY_API_KEY`
  - Value: (paste your key)
  - Environments: Use the default "All Environments" to include Production, Preview, and Development.
- Click Save
Why all three environments?
- Production: Live site uses this
- Preview: Branch deployments use this
- Development: Vercel CLI (`vercel dev`) uses this
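Once the variable is present in each environment, server code reads it from `process.env`. As a minimal sketch, assuming Node.js and a hypothetical `requireEnv` helper (not part of any Vercel API), you can validate the key up front so a misconfigured environment fails at startup rather than on the first AI request:

```typescript
// Hypothetical helper: fail fast when a required env var is missing.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Usage (in a route handler or server startup):
// const apiKey = requireEnv("AI_GATEWAY_API_KEY");
```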
Step 3: Configure Local Environment
Create `.env.local` in your project root:

```
# .env.local
AI_GATEWAY_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxx
```

Verify it's in `.gitignore`:

```shell
cat .gitignore | grep .env.local
```

Should output: `.env.local`
If not, add it:
```shell
echo ".env.local" >> .gitignore
```

Step 4: Install AI SDK
```shell
pnpm add ai
```

The `ai` package is Vercel's unified AI SDK. It works with AI Gateway and supports multiple providers (Anthropic, OpenAI, Google, etc.).
What you get:
- `generateText()` - One-shot text generation
- `streamText()` - Streaming responses
- `generateObject()` - Structured output with Zod
- Provider-agnostic API
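As a preview of what these functions look like in use (a sketch only, not the implementation you'll write in the next lesson), a `generateText()` call takes a model string and a prompt. The `summarizeReviews` wrapper and the prompt wording are assumptions, and running it requires `AI_GATEWAY_API_KEY` to be set:

```typescript
import { generateText } from "ai";

// Hypothetical wrapper; the prompt text and model choice are illustrative.
export async function summarizeReviews(reviews: string[]): Promise<string> {
  const { text } = await generateText({
    // Model string routed through AI Gateway ("provider/model-name")
    model: "anthropic/claude-sonnet-4.5",
    prompt: `Summarize these product reviews in 2-3 sentences:\n\n${reviews.join("\n")}`,
  });
  return text;
}
```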
Version check:
```shell
pnpm list ai
```

Should show `ai@5.x.x` or newer.
Step 5: Redeploy with Environment Variables
Environment variables only apply to new deployments. Trigger a redeploy:
```shell
git add .
git commit -m "chore: configure AI Gateway environment variables"
git push
```

Vercel automatically deploys. The new deployment will have access to `AI_GATEWAY_API_KEY`.
Verify deployment:
- Wait for deployment to complete
- Check the deployment logs for warnings about missing environment variables (there should be none)
- Your app is ready for AI features (we'll add them in the next lesson)
Understanding AI Gateway
Without AI Gateway:
```
Your App → Anthropic API
```
- Manage API keys yourself
- Handle rate limits manually
- No cost tracking
- No failover
- Direct billing from Anthropic
With AI Gateway:
```
Your App → AI Gateway → Anthropic API
```
- Single API key
- Auto rate limiting
- Cost dashboard
- Automatic retries
- Model fallbacks
- Vercel billing
Key benefits:
- Unified interface - One API key for all AI providers
- Production resilience - Retries, timeouts, fallbacks
- Cost tracking - Dashboard shows usage per model
- Rate limiting - Prevents runaway costs
- Provider flexibility - Switch models without code changes
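The last point can be made concrete: if the model string lives in configuration rather than at call sites, switching providers is an environment change, not a code change. A minimal sketch, where the `AI_MODEL` variable name and `resolveModel` helper are assumptions, not part of the AI SDK:

```typescript
// Default model; override by setting a hypothetical AI_MODEL variable.
const DEFAULT_MODEL = "anthropic/claude-sonnet-4.5";

// Pure function so the fallback logic is easy to test.
function resolveModel(env: Record<string, string | undefined>): string {
  // Fall back to the default when AI_MODEL is unset or blank
  return env.AI_MODEL?.trim() || DEFAULT_MODEL;
}

// At a call site you would pass process.env:
console.log(resolveModel({}));                                  // falls back to default
console.log(resolveModel({ AI_MODEL: "openai/gpt-4-turbo" }));  // override wins
```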
AI Gateway Dashboard
Visit your AI Gateway dashboard to see:
- API Calls - Total requests
- Token Usage - Input + output tokens
- Cost - Estimated spend
- Models Used - Which models are being called
- Errors - Failed requests
Currently:
- 0 API calls (we haven't made any yet)
- $0.00 cost
We'll generate our first AI summaries in the next lesson and watch these numbers update.
Model Strings
AI Gateway identifies each model with a plain string:

```typescript
// Anthropic Claude
model: "anthropic/claude-sonnet-4.5"

// OpenAI GPT
model: "openai/gpt-4-turbo"

// Google Gemini
model: "google/gemini-2.0-flash-001"
```

Format: `provider/model-name`
No provider-specific packages needed. The AI SDK handles everything.
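To illustrate the format (this is not an AI SDK API; `parseModelString` is a hypothetical helper), the string splits at the first slash into a provider and a model name:

```typescript
// Hypothetical helper: split "provider/model-name" at the first slash.
function parseModelString(model: string): { provider: string; name: string } {
  const slash = model.indexOf("/");
  if (slash === -1) throw new Error(`Invalid model string: ${model}`);
  return { provider: model.slice(0, slash), name: model.slice(slash + 1) };
}

console.log(parseModelString("anthropic/claude-sonnet-4.5"));
```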
Environment Variable Best Practices
Local development:
- `.env.local` - Your actual key (git-ignored)
- `.env.local.example` - Template (committed)
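A minimal `.env.local.example` to commit as the template (the placeholder value is illustrative):

```
# .env.local.example: copy to .env.local and paste your own AI Gateway key
AI_GATEWAY_API_KEY=your-key-here
```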
Production:
- Vercel Environment Variables - Encrypted, accessible to your app
- Never hardcode keys in source code
Team workflow:
- Team member clones the repo
- Copies `.env.local.example` to `.env.local`
- Adds their own AI Gateway key
- Runs `pnpm dev`
Commit
```shell
git add .env.local.example .gitignore package.json pnpm-lock.yaml
git commit -m "chore: set up AI Gateway and install AI SDK"
git push
```

Done-When
- AI Gateway API key created
- Key added to Vercel environment variables (all 3 environments)
- Local `.env.local` file created with key
- AI SDK (`ai` package) installed
- Redeployed with environment variables
- `.env.local.example` created for team
What's Next
Your app is configured for AI. In the next lesson, you'll write your first AI-powered feature: a `summarizeReviews` function that uses Claude to generate review summaries. You'll see `generateText` in action and watch your AI Gateway dashboard track usage.