Morph + Vercel AI SDK
Stream code edits at 10,500+ tokens/second using the Vercel AI SDK with Morph’s fast apply model. Use Vercel’s AI Gateway for unified billing, rate limits, and failover across 100+ AI models.
Setup
Option 1: AI Gateway (Recommended)
- Get an AI Gateway API key from Vercel
- Add it to your environment variables as OPENAI_API_KEY
- Install the AI SDK:
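The install command itself is missing from the page; assuming npm, the two packages imported in the implementation below are the AI SDK core and the OpenAI-compatible provider:

```shell
npm install ai @ai-sdk/openai
```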
Option 2: Direct API
- Get a Morph API key from the Morph dashboard
- Add it to your environment variables as MORPH_API_KEY
- Install the AI SDK:
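As with Option 1, and again assuming npm, install the AI SDK core and the OpenAI-compatible provider:

```shell
npm install ai @ai-sdk/openai
```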
Implementation
import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  baseURL: 'https://gateway.ai.vercel.com/v1',
  headers: {
    'X-Vercel-AI-Provider': 'morph',
  },
})

export async function POST(req: Request) {
  const { editInstructions, originalCode, update } = await req.json()

  // Get the Morph model through AI Gateway
  const model = openai('morph-v3-fast')

  // Call the model with Morph's tagged apply-prompt format
  const result = streamText({
    model,
    messages: [
      {
        role: 'user',
        content: `<instruction>${editInstructions}</instruction>\n<code>${originalCode}</code>\n<update>${update}</update>`,
      },
    ],
    topP: 1,
  })

  // Respond with a streaming text response
  return result.toTextStreamResponse()
}
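The message content follows Morph's tagged apply-prompt format: the edit instruction, the original file, and the update snippet each wrapped in their own tag. As a minimal illustration of how the three request fields map into that string (buildApplyPrompt is a hypothetical helper for clarity, not part of the AI SDK):

```typescript
// Hypothetical helper: assembles Morph's apply-model prompt from its three parts.
function buildApplyPrompt(
  editInstructions: string,
  originalCode: string,
  update: string,
): string {
  return `<instruction>${editInstructions}</instruction>\n<code>${originalCode}</code>\n<update>${update}</update>`
}

// Example: asking the model to merge an update snippet into the original code.
console.log(buildApplyPrompt(
  'Add a null check to f',
  'function f(x) { return x.y }',
  'function f(x) { return x?.y }',
))
```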
That’s it! Stream fast code edits with Morph using the Vercel AI SDK.
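If you chose Option 2 (direct API), only the client configuration changes: point the provider at Morph's endpoint and authenticate with MORPH_API_KEY. A sketch, assuming Morph exposes an OpenAI-compatible API (the base URL below is an assumption — verify it in the Morph dashboard):

```typescript
import { createOpenAI } from '@ai-sdk/openai'

// Direct Morph client; the base URL here is an assumption, not confirmed above.
const morph = createOpenAI({
  apiKey: process.env.MORPH_API_KEY!,
  baseURL: 'https://api.morphllm.com/v1',
})

// The rest of the route handler is unchanged:
// const model = morph('morph-v3-fast')
```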