Integrations · Connect in 60 seconds · No added markup
LangChain + OrcaRouter
LangChain's ChatOpenAI class takes a base_url parameter. Pointing it at OrcaRouter gives every chain, agent, and retriever automatic failover and zero-markup pricing without touching the rest of your graph.
Connect
Done in five steps.
1. Install: pip install langchain-openai
2. Import ChatOpenAI from langchain_openai.
3. Construct it with base_url="https://api.orcarouter.ai/v1" and api_key="sk-orca-…".
4. Set model to any OrcaRouter-supported model ID.
5. Use it anywhere LangChain expects a chat model: chains, agents, tools.
Setup
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
base_url="https://api.orcarouter.ai/v1",
api_key="sk-orca-...",
model="claude-sonnet-4",
temperature=0,
)
response = llm.invoke("Explain retrieval-augmented generation in one paragraph.")
print(response.content)

Why route LangChain through OrcaRouter?
LangChain agents make many short, bursty calls that are sensitive to per-provider rate limits. OrcaRouter spreads those calls across healthy providers automatically and gives you a single cost-attribution view for the whole chain.
Other integrations
Route LangChain through OrcaRouter now.
Sign up in a minute, grab your sk-orca-… key, and paste it into LangChain. You get automatic failover across every provider with no per-token markup.
