# LangChain

LangChain + TheRouter.ai for Python and TypeScript
LangChain's OpenAI integrations can target TheRouter.ai by changing the base URL and model IDs, enabling a quick migration without rewriting your chains.
## Overview
This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Use TheRouter.ai as your OpenAI-compatible endpoint and keep model IDs in `provider/model` format.
## Installation
Install the required SDKs and keep your TheRouter.ai key in environment variables.
```shell
# TypeScript
npm install @langchain/openai @langchain/core

# Python
pip install langchain-openai
```

## Configuration
Override the client's base URL to point at TheRouter.ai and pass your TheRouter.ai API key. Add attribution headers if you want your app to appear in rankings.
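Attribution headers can be supplied through the client's `defaultHeaders` option. A minimal sketch of a helper that assembles the constructor options; the `HTTP-Referer` / `X-Title` header names are an assumption carried over from the OpenRouter convention, so confirm the exact names in TheRouter.ai's documentation:

```typescript
// Hypothetical helper that assembles ChatOpenAI constructor options for
// TheRouter.ai, including attribution headers. Header names are assumed.
interface TheRouterOptions {
  model: string;
  apiKey: string;
  configuration: {
    baseURL: string;
    defaultHeaders: Record<string, string>;
  };
}

function therouterOptions(
  model: string,
  apiKey: string,
  appUrl: string,
  appName: string
): TheRouterOptions {
  return {
    model,
    apiKey,
    configuration: {
      baseURL: "https://api.therouter.ai/v1",
      defaultHeaders: {
        "HTTP-Referer": appUrl, // assumed attribution header (your site URL)
        "X-Title": appName,     // assumed attribution header (your app name)
      },
    },
  };
}
```

The result can be spread straight into the constructor, e.g. `new ChatOpenAI(therouterOptions("anthropic/claude-sonnet-4.5", process.env.THEROUTER_API_KEY!, "https://example.com", "My App"))`.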
```typescript
import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "anthropic/claude-sonnet-4.5",
  apiKey: process.env.THEROUTER_API_KEY,
  configuration: { baseURL: "https://api.therouter.ai/v1" },
});
```

## Caveats
**Integration note:** LangChain retries can stack with TheRouter.ai fallback retries. Tune retry counts on both sides to avoid excessive tail latency.
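The stacking is multiplicative in the worst case: every client-side attempt can itself trigger the router's full retry budget. A quick sketch of the arithmetic (the specific retry counts below are illustrative, not defaults documented by either side):

```typescript
// Worst-case upstream request attempts when client-side retries stack on
// router-side retries: each of the (1 + clientRetries) client attempts can
// itself trigger up to (1 + routerRetries) upstream attempts.
function worstCaseAttempts(clientRetries: number, routerRetries: number): number {
  return (1 + clientRetries) * (1 + routerRetries);
}

// e.g. 6 client-side retries stacked on 2 router-side fallback retries:
console.log(worstCaseAttempts(6, 2)); // 21
```

On the LangChain side, the client retry count is controlled by the `maxRetries` constructor option on `ChatOpenAI`; lowering it is usually the simplest way to cap tail latency when the router already retries for you.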
Before a broad production rollout, pin SDK versions and validate model compatibility in staging.
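One cheap check to fold into that staging validation: confirm every configured model ID uses the `provider/model` format noted above before deploying. A minimal sketch (format check only; actual model availability still needs to be verified against the live endpoint):

```typescript
// Pre-deploy sanity check: does the model ID look like "provider/model"?
// This validates shape only, not whether TheRouter.ai actually serves the model.
function isValidModelId(id: string): boolean {
  return /^[\w.-]+\/[\w.:-]+$/.test(id);
}

console.log(isValidModelId("anthropic/claude-sonnet-4.5")); // true
console.log(isValidModelId("claude-sonnet-4.5"));           // false (missing provider)
```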