Integration · Connect in 60 seconds · Zero markup
Vercel AI SDK + OrcaRouter
The Vercel AI SDK is the de facto standard for building streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL: point it at OrcaRouter and ship every model in the catalog with a single key.
Setup
Done in five steps.
- 1. Install: npm install ai @ai-sdk/openai
- 2. Set ORCAROUTER_API_KEY in your .env.local
- 3. Use createOpenAI({ baseURL, apiKey }) with baseURL=https://api.orcarouter.ai/v1
- 4. Call streamText / generateText / generateObject as usual.
- 5. Every streaming token routes through the cheapest healthy backend.
Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});
// In a Next.js route handler:
return result.toDataStreamResponse();
Why route the Vercel AI SDK through OrcaRouter?
The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
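Because the routed endpoint is OpenAI-compatible, the SDK's structured-output helper works the same way as streamText above. A minimal sketch, assuming the routed model supports structured output; the schema and prompt are illustrative, and the model id is the one used in the configuration example:

```typescript
import { createOpenAI } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

// generateObject validates the model's JSON output against the zod schema,
// so `object` is fully typed when the call resolves.
const { object } = await generateObject({
  model: orca('claude-sonnet-4'),
  schema: z.object({
    title: z.string(),
    tags: z.array(z.string()),
  }),
  prompt: 'Summarize this changelog entry as a title plus tags.',
});
```

Because the provider instance is the only place the backend is named, swapping models (or letting the router fail over) never touches your generateObject call sites.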
Other integrations
Route the Vercel AI SDK through OrcaRouter today.
Sign up in a minute, grab an sk-orca-… key, and paste it into the Vercel AI SDK. Zero-markup tokens and automatic failover across all providers.
