Integration · 60-second setup · Zero markup
Vercel AI SDK + OrcaRouter
The Vercel AI SDK is the de facto way to build streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL; point it at OrcaRouter and ship every model in the catalog with a single key.
Setup
Ready in five steps.
- 1. Install: npm install ai @ai-sdk/openai
- 2. Set ORCAROUTER_API_KEY in your .env.local.
- 3. Use createOpenAI({ baseURL, apiKey }) with baseURL = https://api.orcarouter.ai/v1.
- 4. Call streamText / generateText / generateObject as usual.
- 5. Every streaming token routes through the cheapest healthy backend.
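Step 2 amounts to a single line in your environment file. A minimal .env.local sketch (the key value is a placeholder; real keys use the sk-orca- prefix):

```shell
# .env.local (read automatically by Next.js; keep it out of version control)
ORCAROUTER_API_KEY=sk-orca-xxxxxxxxxxxxxxxx
```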
Example
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});

// In a Next.js route handler:
return result.toDataStreamResponse();
Why route the Vercel AI SDK through OrcaRouter?
The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
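The same provider instance also works with the SDK's non-streaming helpers mentioned in step 4. A sketch using generateText and generateObject (the zod schema and prompts here are illustrative, not part of the OrcaRouter API):

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateText, generateObject } from 'ai';
import { z } from 'zod';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

// Plain text completion: same routing and failover, no streaming.
const { text } = await generateText({
  model: orca('claude-sonnet-4'),
  prompt: 'Summarize the Vercel AI SDK in one sentence.',
});

// Structured output, validated against a zod schema before it reaches you.
const { object } = await generateObject({
  model: orca('claude-sonnet-4'),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Suggest a title and tags for a blog post about LLM routing.',
});
```

Because routing happens behind the baseURL, switching between streaming and structured calls needs no extra configuration.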
Route the Vercel AI SDK through OrcaRouter now.
Sign up in a minute, get an sk-orca-… key, and paste it into the Vercel AI SDK. Zero markup on tokens, automatic failover across all providers.
