Broadcast

Stream AI request traces to your observability stack

Broadcast forwards request and generation telemetry from TheRouter.ai to external systems without adding application-side instrumentation. You can fan out to multiple destinations and control sampling, privacy mode, and API key filters per destination.
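Per-destination controls are configured in the dashboard; as a rough sketch, each destination carries its own sampling, privacy, and key-filter settings. The field names below are illustrative assumptions, not TheRouter.ai's actual schema.

```python
# Hypothetical shape of two Broadcast destinations. Field names here are
# illustrative only -- the real settings live in the dashboard UI.
destinations = [
    {
        "type": "langfuse",
        "sample_rate": 1.0,      # forward every trace
        "privacy_mode": False,   # keep prompt/completion content
        "api_key_filter": [],    # empty = broadcast for all API keys
    },
    {
        "type": "webhook",
        "sample_rate": 0.1,      # sample roughly 10% of traces
        "privacy_mode": True,    # strip prompt/completion content
        "api_key_filter": ["key_prod"],  # only broadcast for this key
    },
]
```

Each destination is evaluated independently, so one trace can go to Langfuse in full while only a redacted sample reaches the webhook.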

Enable broadcast

  1. Open Settings → Observability in the TheRouter.ai dashboard.
  2. Toggle Broadcast on for your account or organization.
  3. Add and test one or more destinations.
  4. Optionally enforce privacy mode for sensitive workloads.

Requests can attach trace metadata in the body, and Broadcast forwards those fields to your destinations:

request-with-trace-metadata.json
{
  "model": "anthropic/claude-sonnet-4.5",
  "messages": [{ "role": "user", "content": "Summarize this incident report." }],
  "user": "user_314",
  "session_id": "sess_2026_02_24",
  "trace": {
    "trace_id": "incident_review_001",
    "trace_name": "Incident Review Pipeline",
    "generation_name": "Summarize Incident"
  }
}
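The same request can be assembled programmatically. This is a minimal sketch that builds the body and headers from the example above without sending anything; pass the result to the HTTP client of your choice.

```python
import json
import os

# Build the request from the JSON example above. Endpoint and header values
# follow the cURL example later in this doc; sending is left to your client.
def build_broadcast_request(prompt: str, trace_id: str, trace_name: str):
    body = {
        "model": "anthropic/claude-sonnet-4.5",
        "messages": [{"role": "user", "content": prompt}],
        "user": "user_314",
        "session_id": "sess_2026_02_24",
        "trace": {
            "trace_id": trace_id,
            "trace_name": trace_name,
            "generation_name": "Summarize Incident",
        },
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('THEROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    url = "https://api.therouter.ai/v1/chat/completions"
    return url, headers, json.dumps(body)

url, headers, payload = build_broadcast_request(
    "Summarize this incident report.",
    "incident_review_001",
    "Incident Review Pipeline",
)
```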

Supported destinations

Arize, Braintrust, ClickHouse, Datadog, Grafana, Langfuse, LangSmith, New Relic, Opik, OTel Collector, PostHog, S3, Sentry, Snowflake, Weave, and generic Webhook.
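For the generic Webhook destination, your service just needs to accept a POST with a JSON body. The event shape assumed below (an object carrying a "trace" field) is illustrative, not TheRouter.ai's documented schema.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Assumed event shape: a JSON object that may carry a "trace" field, as in
# this doc's request example. Adjust parsing to the real payload you receive.
def trace_name_of(raw: bytes) -> str:
    """Parse a broadcast event body and pull out its trace name."""
    event = json.loads(raw)
    return event.get("trace", {}).get("trace_name", "unknown")

class BroadcastHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        print("received trace:", trace_name_of(body))
        self.send_response(204)  # acknowledge receipt with an empty response
        self.end_headers()

# To run locally: HTTPServer(("", 8080), BroadcastHandler).serve_forever()
```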

cURL
curl https://api.therouter.ai/v1/chat/completions \
  -H "Authorization: Bearer $THEROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -H "X-TheRouter.ai-Broadcast: true" \
  -d '{
    "model": "openai/gpt-4.1",
    "messages": [{"role": "user", "content": "Hello"}],
    "trace": {"trace_name": "Broadcast Smoke Test"}
  }'

Use privacy mode per destination

Privacy mode strips prompt and completion content while preserving token usage, costs, timing, model metadata, and custom trace fields.
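To make the redaction concrete, here is a sketch of what privacy mode keeps versus strips. The event field names are assumptions for illustration; the redaction rule itself (drop prompt and completion content, keep usage, cost, timing, model, and trace fields) follows the sentence above.

```python
# Illustrative field names for a broadcast event; the real schema may differ.
# Privacy mode drops content-bearing fields and keeps the metadata.
REDACTED_FIELDS = ("messages", "completion")

def apply_privacy_mode(event: dict) -> dict:
    """Return a copy of the event with prompt/completion content removed."""
    return {k: v for k, v in event.items() if k not in REDACTED_FIELDS}

event = {
    "model": "openai/gpt-4.1",
    "messages": [{"role": "user", "content": "sensitive prompt"}],
    "completion": "sensitive answer",
    "usage": {"prompt_tokens": 4, "completion_tokens": 3},
    "cost_usd": 0.0001,
    "latency_ms": 812,
    "trace": {"trace_name": "Broadcast Smoke Test"},
}
safe = apply_privacy_mode(event)
```

After redaction, `safe` still carries usage, cost, timing, model, and trace metadata, but no prompt or completion text.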