Integration · Connect in 60 seconds · Zero markup

Vercel AI SDK + OrcaRouter

The Vercel AI SDK is the de facto standard for building streaming LLM apps in Next.js, SvelteKit, and Nuxt. Its OpenAI provider accepts a custom baseURL: point it at OrcaRouter and ship every model in the catalog with a single key.

Connection

Ready in five steps.

  1. Install: npm install ai @ai-sdk/openai
  2. Set ORCAROUTER_API_KEY in your .env.local
  3. Use createOpenAI({ baseURL, apiKey }) with baseURL = https://api.orcarouter.ai/v1
  4. Call streamText / generateText / generateObject as usual.
  5. Every streaming token routes through the cheapest healthy backend.
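For step 2, the key lives in a .env.local file at the project root, which Next.js reads automatically (a minimal sketch; the key value shown is a placeholder, not a real key):

```
# .env.local — never commit this file
ORCAROUTER_API_KEY=sk-orca-xxxxxxxxxxxx
```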
Configuration
import { createOpenAI } from '@ai-sdk/openai';
import { streamText } from 'ai';

const orca = createOpenAI({
  baseURL: 'https://api.orcarouter.ai/v1',
  apiKey: process.env.ORCAROUTER_API_KEY,
});

const result = await streamText({
  model: orca('claude-sonnet-4'),
  prompt: 'Write a haiku about TypeScript.',
});

// In a Next.js route handler:
return result.toDataStreamResponse();
Why route the Vercel AI SDK through OrcaRouter?

The Vercel AI SDK's model abstraction already gives you portability across OpenAI, Anthropic, etc. Adding OrcaRouter on top means you don't need to manage N API keys or hit N different rate limits — one key, one bill, automatic failover under the hood.


Route the Vercel AI SDK through OrcaRouter right now.

Sign up in a minute, get an sk-orca-… key, and paste it into the Vercel AI SDK. Zero-markup tokens and automatic failover across every provider.

Get an API key →