Integration · 60-second setup · No markup

Vercel AI SDK + OrcaRouter

The Vercel AI SDK is the de facto standard for building streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL: point it at OrcaRouter and ship every model on the catalog with one key.

Setup

Five steps:

  1. Install the packages: npm install ai @ai-sdk/openai
  2. Set ORCAROUTER_API_KEY in your .env.local.
  3. Create a provider with createOpenAI({ baseURL, apiKey }), using baseURL: https://api.orcarouter.ai/v1.
  4. Call streamText / generateText / generateObject as usual.
  5. Done: every streaming token now routes through the cheapest healthy backend.
Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});

// In a Next.js route handler:
return result.toDataStreamResponse();
Why route the Vercel AI SDK through OrcaRouter?

The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.


Route the Vercel AI SDK through OrcaRouter today.

Sign up in under a minute, grab an sk-orca-… key, and paste it into the Vercel AI SDK. No markup on tokens, automatic failover across all providers.

Get an API key →