
AI Gateway setup

AI Gateway sits between your app and AI providers (Anthropic, OpenAI, etc.). It handles authentication, rate limiting, retries, cost tracking, and model fallbacks—all the production concerns you don't want to build yourself.

Outcome

Configure AI Gateway with an API key and install the AI SDK so you're ready to generate AI summaries.

Fast Track

  1. Vercel Dashboard → AI Gateway → API Keys → Create Key named ai-review-summary-key
  2. Add AI_GATEWAY_API_KEY=gw_xxx to Vercel env vars (all 3 environments) AND .env.local
  3. Run pnpm add ai, push to trigger redeploy with new env vars

Hands-on Exercise 2.1

Set up AI Gateway for your deployed app:

Requirements:

  1. Create an AI Gateway API key in Vercel dashboard
  2. Add AI_GATEWAY_API_KEY to Vercel environment variables
  3. Add the key to your local .env.local file
  4. Install the ai package (Vercel AI SDK)
  5. Redeploy to apply environment variables

Implementation hints:

  • AI Gateway is in the Vercel dashboard sidebar
  • Environment variables need to be added for all environments (Production, Preview, Development)
  • Use .env.local.example as a template
  • The ai package is version 5.0+ (unified SDK)
  • After adding env vars, trigger a new deployment

Step 1: Create AI Gateway API Key

  1. Go to your Vercel Dashboard
  2. Click AI Gateway in the top navigation bar
  3. Click Create API Key in the sidebar
  4. Click Create Key
  5. Name it ai-review-summary-key
  6. Copy the key (you'll need it in the next steps)

Key format:

AI_GATEWAY_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxx
Keep Your Key Secret

This key provides access to AI models. Never commit it to git or share it publicly.

Step 2: Add to Vercel Environment Variables

  1. Go to your project in Vercel dashboard
  2. Navigate to Settings → Environment Variables
  3. Click Add New
  4. Enter details:
    • Name: AI_GATEWAY_API_KEY
    • Value: (paste your key)
    • Environments: Use the default "All Environments" to include Production, Preview, and Development.
  5. Click Save

Why all three environments?

  • Production: Live site uses this
  • Preview: Branch deployments use this
  • Development: Vercel CLI (vercel dev) uses this

Step 3: Configure Local Environment

Create .env.local in your project root:

# .env.local
AI_GATEWAY_API_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxx

Verify it's in .gitignore:

grep ".env.local" .gitignore

Should output: .env.local

If not, add it:

echo ".env.local" >> .gitignore
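
In app code, it helps to fail fast when the key is missing rather than hitting an opaque authentication error later. A minimal sketch (requireEnv is a hypothetical helper, not part of the AI SDK):

```typescript
// Hypothetical helper: read a required environment variable or throw
// a clear error up front instead of failing downstream with a vague 401.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}

// Usage: const apiKey = requireEnv("AI_GATEWAY_API_KEY");
```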

Step 4: Install AI SDK

pnpm add ai

The ai package is Vercel's unified AI SDK. It works with AI Gateway and supports multiple providers (Anthropic, OpenAI, Google, etc.).

What you get:

  • generateText() - One-shot text generation
  • streamText() - Streaming responses
  • generateObject() - Structured output with Zod
  • Provider-agnostic API

Version check:

pnpm list ai

Should show ai@5.x.x or newer.

Step 5: Redeploy with Environment Variables

Environment variables only apply to new deployments. Trigger a redeploy:

git add .
git commit -m "chore: configure AI Gateway environment variables"
git push

Vercel automatically deploys. The new deployment will have access to AI_GATEWAY_API_KEY.

Verify deployment:

  1. Wait for deployment to complete
  2. Check deployment logs - no env var warnings
  3. Your app is ready for AI features (we'll add them in the next lesson)

Understanding AI Gateway

Without AI Gateway:

Your App → Anthropic API
        ↓
- Manage API keys yourself
- Handle rate limits manually
- No cost tracking
- No failover
- Direct billing from Anthropic

With AI Gateway:

Your App → AI Gateway → Anthropic API
          ↓
- Single API key
- Auto rate limiting
- Cost dashboard
- Automatic retries
- Model fallbacks
- Vercel billing

Key benefits:

  1. Unified interface - One API key for all AI providers
  2. Production resilience - Retries, timeouts, fallbacks
  3. Cost tracking - Dashboard shows usage per model
  4. Rate limiting - Prevents runaway costs
  5. Provider flexibility - Switch models without code changes

AI Gateway Dashboard

Visit your AI Gateway dashboard to see:

  • API Calls - Total requests
  • Token Usage - Input + output tokens
  • Cost - Estimated spend
  • Models Used - Which models are being called
  • Errors - Failed requests

Currently:

  • 0 API calls (we haven't made any yet)
  • $0.00 cost

We'll generate our first AI summaries in the next lesson and watch these numbers update.

Model Strings

AI Gateway identifies each model with a plain string:

// Anthropic Claude
model: "anthropic/claude-sonnet-4.5"
 
// OpenAI GPT
model: "openai/gpt-4-turbo"
 
// Google Gemini
model: "google/gemini-2.0-flash-001"

Format: provider/model-name

No provider-specific packages needed. The AI SDK handles everything.
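
Because every model string follows provider/model-name, a tiny helper can split it for logging or validation. This parseModelString function is hypothetical, not part of the SDK:

```typescript
// Hypothetical helper: split an AI Gateway model string into its
// provider and model parts, e.g. for logging or config validation.
export function parseModelString(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash === -1) {
    throw new Error(`Expected "provider/model-name", got "${id}"`);
  }
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}

// parseModelString("anthropic/claude-sonnet-4.5")
// → { provider: "anthropic", model: "claude-sonnet-4.5" }
```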

Environment Variable Best Practices

Local development:

  • .env.local - Your actual key (git-ignored)
  • .env.local.example - Template (committed)

Production:

  • Vercel Environment Variables - Encrypted, accessible to your app
  • Never hardcode keys in source code

Team workflow:

  1. Team member clones repo
  2. Copies .env.local.example to .env.local
  3. Adds their own AI Gateway key
  4. Runs pnpm dev
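
A minimal .env.local.example the team can copy (the value is a placeholder, never the real key):

```bash
# .env.local.example — copy to .env.local and fill in your own key
AI_GATEWAY_API_KEY=your-ai-gateway-key-here
```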

Commit

git add .env.local.example .gitignore package.json pnpm-lock.yaml
git commit -m "chore: set up AI Gateway and install AI SDK"
git push

Done-When

  • AI Gateway API key created
  • Key added to Vercel environment variables (all 3 environments)
  • Local .env.local file created with key
  • AI SDK (ai package) installed
  • Redeployed with environment variables
  • .env.local.example created for team

What's Next

Your app is configured for AI. In the next lesson, you'll write your first AI-powered feature: a summarizeReviews function that uses Claude to generate review summaries. You'll see generateText in action and watch your AI Gateway dashboard track usage.

