Transparent Routing
Know Exactly Where Your Request Goes
Every response tells you which provider handled your request. Direct connections only — no aggregators, no middlemen, no black boxes.
The Industry Has a Trust Problem
Most API gateways route through chains of aggregators. You never know who actually served your request.
Typical API Proxy:
Your App → Proxy (unknown) → Aggregator (cached?) → Sub-aggregator (which model?) → ??? → Provider

TheRouter:
Your App → TheRouter (verified) → Provider (Direct API, x-router headers)
Three Pillars of Transparency
0 aggregators
Direct Connections Only
We connect to Anthropic, OpenAI, Google, and AWS Bedrock directly. No middlemen reselling API access.
- Every provider is a first-party API connection
- No resold, cached, or proxied responses
- Your data passes only between you and the model vendor
5 x-router-* headers
Verifiable Response Headers
Every API response carries headers that tell you exactly what happened: which model served the request, which provider handled it, whether a fallback fired, and how long the upstream call took.
- x-router-model: the model that served your request
- x-router-provider: the verified direct provider
- x-router-fallback: whether a fallback was used
100% of requests traced
Visual Routing Trace
Open any request in your dashboard and see the full routing graph: which providers were tried, which was selected, the timing of each attempt, and why that route was chosen.
- Flow graph: model → candidates → selected provider
- Per-attempt timing and failure reasons
- Full audit trail stored for every request
See It in Action
Every response carries routing proof. Here’s what a real API call looks like:
$ curl -s -D- https://api.therouter.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-your-key" \
  -H "Content-Type: application/json" \
  -d '{"model":"anthropic/claude-sonnet-4.6","messages":[{"role":"user","content":"Hi"}]}'

HTTP/2 200
x-router-request-id: req_a1b2c3d4e5
x-router-model: anthropic/claude-sonnet-4.6
x-router-provider: Anthropic
x-router-fallback: false
x-router-latency-ms: 847

{"id":"chatcmpl-...","model":"anthropic/claude-sonnet-4.6",...}
Start building in 3 lines of code
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://api.therouter.ai/v1",
apiKey: "sk-your-key",
});
const response = await client.chat.completions.create({
model: "anthropic/claude-sonnet-4.6",
messages: [{ role: "user", content: "Hello!" }],
});
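The SDK call above returns only the parsed completion. If you want the x-router-* headers described earlier, a plain `fetch` exposes the raw response. Below is a minimal sketch; the `extractRoutingInfo` helper is hypothetical (not part of any SDK) and assumes the header names shown in the curl example.

```typescript
// Hypothetical helper: pull TheRouter's routing headers out of a
// fetch Response so you can log or assert on where the request went.
interface RoutingInfo {
  model: string | null;      // x-router-model
  provider: string | null;   // x-router-provider
  fallback: boolean;         // x-router-fallback === "true"
  latencyMs: number | null;  // x-router-latency-ms, parsed as a number
}

function extractRoutingInfo(headers: Headers): RoutingInfo {
  const latency = headers.get("x-router-latency-ms");
  return {
    model: headers.get("x-router-model"),
    provider: headers.get("x-router-provider"),
    fallback: headers.get("x-router-fallback") === "true",
    latencyMs: latency === null ? null : Number(latency),
  };
}

// Usage (endpoint and key taken from the curl example above):
// const res = await fetch("https://api.therouter.ai/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     Authorization: "Bearer sk-your-key",
//     "Content-Type": "application/json",
//   },
//   body: JSON.stringify({
//     model: "anthropic/claude-sonnet-4.6",
//     messages: [{ role: "user", content: "Hi" }],
//   }),
// });
// console.log(extractRoutingInfo(res.headers));
```

Logging this object on every call gives you a client-side audit trail that you can cross-check against the dashboard's routing trace.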