Integration · 60-second setup · No markup
LangChain + OrcaRouter
LangChain's ChatOpenAI class takes a base_url parameter. Pointing it at OrcaRouter gives every chain, agent, and retriever automatic failover and zero-markup pricing without touching the rest of your graph.
Setup
Five steps.
- 1. Install: pip install langchain-openai
- 2. Import ChatOpenAI from langchain_openai
- 3. Construct with base_url='https://api.orcarouter.ai/v1' and api_key='sk-orca-…'
- 4. Set the model to any OrcaRouter-supported model ID.
- 5. Use it anywhere LangChain expects a chat model — chains, agents, tools.
Configuration
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.orcarouter.ai/v1",
    api_key="sk-orca-...",
    model="claude-sonnet-4",
    temperature=0,
)
response = llm.invoke("Explain retrieval-augmented generation in one paragraph.")
print(response.content)
Why route LangChain through OrcaRouter?
LangChain agents do lots of short, bursty calls that are sensitive to rate limits. OrcaRouter spreads those calls across healthy providers automatically and gives you one cost-attribution view for the whole chain.
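The spreading idea can be sketched in plain Python. This is a hypothetical illustration of picking a healthy provider per call, not OrcaRouter's actual routing logic; the provider names and health map are made up:

```python
import random

# Hypothetical sketch: route each call to a randomly chosen healthy
# provider, skipping any that are rate-limited or down.
providers = {"provider_a": True, "provider_b": False, "provider_c": True}

def pick_provider(health: dict) -> str:
    """Return the name of a healthy provider, chosen at random."""
    healthy = [name for name, ok in health.items() if ok]
    if not healthy:
        raise RuntimeError("no healthy providers available")
    return random.choice(healthy)

print(pick_provider(providers))  # one of provider_a / provider_c
```

Because every call goes through one endpoint, the same mechanism also yields a single cost-attribution view across all the providers a chain touches.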
Other integrations
Route LangChain through OrcaRouter today.
Sign up in under a minute, grab an sk-orca-… key, and paste it into LangChain. No markup on tokens. Automatic failover across all providers.
