Integration · 60-second setup · Zero markup

LangChain + OrcaRouter

LangChain's ChatOpenAI class takes a base_url parameter. Pointing it at OrcaRouter gives every chain, agent, and retriever automatic failover and zero-markup pricing without touching the rest of your graph.

Setup

Five steps.

  1. Install: pip install langchain-openai
  2. Import ChatOpenAI from langchain_openai
  3. Construct with base_url='https://api.orcarouter.ai/v1' and api_key='sk-orca-…'
  4. Set the model to any OrcaRouter-supported model ID.
  5. Use it anywhere LangChain expects a chat model — chains, agents, tools.

Configuration
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
    model="claude-sonnet-4",
    temperature=0,
)

response = llm.invoke("Explain retrieval-augmented generation in one paragraph.")
print(response.content)
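Under the hood, llm.invoke(...) serializes your message into the standard OpenAI-compatible chat-completions request that OrcaRouter's endpoint accepts. A minimal stdlib sketch of that request shape (it builds the request but never sends it; the helper name build_chat_request is our own, not a LangChain API):

```python
import json
from urllib.request import Request

BASE_URL = "https://api.orcarouter.ai/v1"
API_KEY = "sk-orca-..."  # placeholder key, as in the config above


def build_chat_request(model: str, user_message: str, temperature: float = 0) -> Request:
    """Build (but do not send) the OpenAI-compatible request that a
    single llm.invoke(...) call translates into."""
    payload = {
        "model": model,
        "temperature": temperature,
        "messages": [{"role": "user", "content": user_message}],
    }
    return Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_chat_request("claude-sonnet-4", "Explain RAG in one paragraph.")
print(req.full_url)  # https://api.orcarouter.ai/v1/chat/completions
```

Because the wire format is unchanged, nothing else in your graph needs to know the base URL moved.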

Why route LangChain through OrcaRouter?

LangChain agents make many short, bursty calls that are sensitive to rate limits. OrcaRouter spreads those calls across healthy providers automatically and gives you a single cost-attribution view for the whole chain.


Route LangChain through OrcaRouter today.

Sign up in a minute, grab an sk-orca-… key, and paste it into LangChain. Zero markup on tokens, with automatic failover across all providers.

Get an API key →