Integrations · 60-second setup · Zero markup

Vercel AI SDK + OrcaRouter

The Vercel AI SDK is the de-facto way to build streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL — wire it to OrcaRouter and ship every model on the catalog with one key.

Setup

Five steps.

  1. Install: npm install ai @ai-sdk/openai
  2. Set ORCAROUTER_API_KEY in your .env.local
  3. Use createOpenAI({ baseURL, apiKey }) with baseURL=https://api.orcarouter.ai/v1
  4. Call streamText / generateText / generateObject as usual.
  5. Every streaming token routes through the cheapest healthy backend.
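Step 2 amounts to a one-line env file; the variable name just has to match what your code reads (the snippet on this page uses ORCAROUTER_API_KEY, and the key value below is a placeholder):

```shell
# .env.local — loaded automatically by Next.js in development
ORCAROUTER_API_KEY=sk-orca-your-key-here
```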
Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

// app/api/chat/route.ts — a Next.js route handler
export async function POST() {
  const result = await streamText({
    model: orca('claude-sonnet-4'),
    prompt: 'Write a haiku about TypeScript.',
  });

  return result.toDataStreamResponse();
}
Why route the Vercel AI SDK through OrcaRouter?

The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.
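That portability boils down to swapping a single catalog slug through the one OrcaRouter-backed provider. A minimal sketch, assuming a hypothetical task-to-model mapping (the slugs and the pickModel helper are illustrative, not a guaranteed catalog listing):

```typescript
// With one provider instance, "switching models" is just switching a string.
type Task = 'chat' | 'extraction' | 'cheap-batch';

const MODEL_FOR_TASK: Record<Task, string> = {
  chat: 'claude-sonnet-4',
  extraction: 'gpt-4o',
  'cheap-batch': 'gpt-4o-mini',
};

function pickModel(task: Task): string {
  return MODEL_FOR_TASK[task];
}

// In app code: orca(pickModel('chat')) — same key, same baseURL, any model.
```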

Other integrations

Route the Vercel AI SDK through OrcaRouter today.

Sign up in a minute, grab an sk-orca-… key, and paste it into the Vercel AI SDK. Zero markup on tokens, automatic failover across every provider.

Get an API key →