Integration · 60-second setup · Zero markup
Vercel AI SDK + OrcaRouter
The Vercel AI SDK is the de facto way to build streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL: point it at OrcaRouter and ship every model in the catalog with one key.
Setup
Five steps.
1. Install: npm install ai @ai-sdk/openai
2. Set ORCAROUTER_API_KEY in your .env.local
3. Create a provider with createOpenAI({ baseURL, apiKey }), using baseURL=https://api.orcarouter.ai/v1
4. Call streamText / generateText / generateObject as usual.
5. Every streaming token routes through the cheapest healthy backend.
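Step 4 applies beyond streaming: generateObject goes through the same provider once it points at OrcaRouter. A minimal sketch of a structured-output call; the model name and the zod schema here are illustrative assumptions, not part of the official setup:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

// Same provider setup as the streaming example; only the call changes.
const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const { object } = await generateObject({
  model: orca('claude-sonnet-4'), // assumed model id for illustration
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Suggest a title and tags for a post about TypeScript generics.',
});
// object is typed as { title: string; tags: string[] }
```

Because the schema is a zod object, the SDK both constrains the model's output and gives you a fully typed result.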
Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});

// In a Next.js route handler:
return result.toDataStreamResponse();

Why route Vercel AI SDK through OrcaRouter?
The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
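Put together in an actual Next.js App Router endpoint, the whole integration fits in one file. A sketch, assuming the standard useChat() request shape on the client and treating the file path and model name as illustrative:

```typescript
// app/api/chat/route.ts — pairs with the useChat() hook on the client
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

export async function POST(req: Request) {
  // useChat() sends the conversation as a `messages` array.
  const { messages } = await req.json();
  const result = await streamText({
    model: orca('claude-sonnet-4'), // assumed model id for illustration
    messages,
  });
  // Streams tokens back in the data-stream format useChat() expects.
  return result.toDataStreamResponse();
}
```

Nothing in the handler is OrcaRouter-specific; swapping routers back out later means changing only the baseURL.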
Other integrations
Route Vercel AI SDK through OrcaRouter today.
Sign up in under a minute, grab an sk-orca-… key, and paste it into your Vercel AI SDK provider config. Zero markup on tokens. Automatic failover across every provider.
