Integration · 60-second setup · No markup
Vercel AI SDK + OrcaRouter
The Vercel AI SDK is the de facto way to build streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL — wire it to OrcaRouter and ship every model on the catalog with one key.
Setup
Five steps.
1. Install: npm install ai @ai-sdk/openai
2. Set ORCAROUTER_API_KEY in your .env.local.
3. Create the provider with createOpenAI({ baseURL, apiKey }), using baseURL https://api.orcarouter.ai/v1.
4. Call streamText / generateText / generateObject as usual.
5. Every streaming token routes through the cheapest healthy backend.
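The variable from step 2 lives in an env file at the project root; the key value below is a placeholder — use the one from your OrcaRouter dashboard:

```
# .env.local — placeholder value, replace with your own key
ORCAROUTER_API_KEY=sk-orca-...
```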
Example
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

// Next.js route handler (e.g. app/api/chat/route.ts)
export async function POST() {
  const result = await streamText({
    model: orca('claude-sonnet-4'),
    prompt: 'Write a haiku about TypeScript.',
  });
  return result.toDataStreamResponse();
}

Why route the Vercel AI SDK through OrcaRouter?
The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
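Because the integration is nothing more than pointing the provider at a different base URL, it can help to see what a call reduces to on the wire. A sketch of the OpenAI-style request that a streamText call produces against OrcaRouter's endpoint (the field list is illustrative; the SDK may send additional fields):

```typescript
// Sketch: with a custom baseURL, each streamText call becomes an
// OpenAI-style HTTP request to `${baseURL}/chat/completions`.
const baseURL = "https://api.orcarouter.ai/v1";

// Request body in the OpenAI wire format that OrcaRouter accepts.
const body = {
  model: "claude-sonnet-4",
  messages: [
    { role: "user" as const, content: "Write a haiku about TypeScript." },
  ],
  stream: true, // streamText asks for a server-sent-event stream
};

const url = `${baseURL}/chat/completions`;
console.log(url);
```

This is why any OpenAI-compatible client — not just the Vercel AI SDK — can route through the same key and endpoint.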
Other integrations
Route the Vercel AI SDK through OrcaRouter today.
Sign up in under a minute, grab an sk-orca-… key, and paste it into the Vercel AI SDK. No markup on tokens. Automatic failover across all providers.
