Integration · 60-second setup · Zero markup

Vercel AI SDK + OrcaRouter

The Vercel AI SDK is the de facto standard for building streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL: point it at OrcaRouter and ship every model in the catalog with one key.

Setup

Done in five steps.

  1. Install: npm install ai @ai-sdk/openai
  2. Set ORCAROUTER_API_KEY in your .env.local
  3. Use createOpenAI({ baseURL, apiKey }) with baseURL=https://api.orcarouter.ai/v1
  4. Call streamText / generateText / generateObject as usual.
  5. Every streaming token routes through the cheapest healthy backend.
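For step 2, the environment file could look like this. The variable name comes from the steps above; the key value is a placeholder, and the file location assumes a Next.js project root.

```shell
# .env.local — assumed to sit at the project root of a Next.js app
ORCAROUTER_API_KEY=sk-orca-xxxxxxxxxxxxxxxx   # placeholder, not a real key
```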

Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});

// In a Next.js route handler:
return result.toDataStreamResponse();
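Pieced together, the configuration above might live in a Next.js App Router handler like this. The file path, the request shape, and the client-side hook mentioned in the comment are assumptions for illustration, not part of the original.

```typescript
// app/api/chat/route.ts — hypothetical location in a Next.js App Router project
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

export async function POST(req: Request) {
  // Assumed request body shape: { messages: [...] }, e.g. from a chat UI.
  const { messages } = await req.json();

  const result = await streamText({
    model: orca('claude-sonnet-4'),
    messages,
  });

  // Streams tokens back to the browser as they arrive from the router.
  return result.toDataStreamResponse();
}
```

This sketch cannot run without the installed packages and a valid key, so treat it as a shape to adapt rather than a drop-in file.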

Why route the Vercel AI SDK through OrcaRouter?

The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
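Under the single-key model this paragraph describes, swapping backends becomes a one-string change rather than a new SDK and a new key. The model names below are illustrative, not a confirmed catalog listing.

```typescript
import { createOpenAI } from '@ai-sdk/openai';

// One provider instance, one ORCAROUTER_API_KEY.
const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

// Only the model string changes; no second SDK, key, or rate-limit pool.
const fast = orca('gpt-4o-mini');        // illustrative OpenAI-hosted model
const strong = orca('claude-sonnet-4');  // illustrative Anthropic-hosted model
```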


Route the Vercel AI SDK through OrcaRouter now.

Sign up in a minute, grab an sk-orca-… key, and drop it into the Vercel AI SDK. Tokens at zero markup, automatic failover across all providers.

Get your API key →