Sentry
Add LLM traces to Sentry performance monitoring
The Sentry broadcast destination sends OTLP spans enriched with model-usage and finish-reason metadata, letting you correlate LLM calls with application errors and transaction traces in Sentry.
broadcast-sentry-config.json
{
  "destination": "sentry",
  "enabled": true,
  "otlp_traces_endpoint": "https://o123.ingest.us.sentry.io/api/456/integration/otlp/v1/traces",
  "dsn": "https://abc123@o123.ingest.us.sentry.io/456",
  "sampling_rate": 1
}

cURL

curl https://api.therouter.ai/v1/chat/completions \
  -H "Authorization: Bearer $THEROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -H "X-TheRouter.ai-Broadcast: true" \
  -d '{
    "model": "anthropic/claude-sonnet-4.5",
    "messages": [{"role": "user", "content": "Debug this stack trace"}],
    "trace": {"trace_name": "Sentry Trace Test", "release": "v2.1.0"}
  }'

Verification
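The same broadcast request can be built with Python's standard library; a minimal sketch mirroring the cURL call (the helper name `build_broadcast_request` is illustrative, not part of any SDK):

```python
import json
import os
import urllib.request

API_URL = "https://api.therouter.ai/v1/chat/completions"

def build_broadcast_request() -> urllib.request.Request:
    """Build the broadcast-enabled chat completion request shown in the cURL example."""
    payload = {
        "model": "anthropic/claude-sonnet-4.5",
        "messages": [{"role": "user", "content": "Debug this stack trace"}],
        # Custom trace metadata forwarded to Sentry alongside the span
        "trace": {"trace_name": "Sentry Trace Test", "release": "v2.1.0"},
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {os.environ.get('THEROUTER_API_KEY', '')}",
            "Content-Type": "application/json",
            "X-TheRouter.ai-Broadcast": "true",  # opt this request into broadcasting
        },
        method="POST",
    )

# To send: urllib.request.urlopen(build_broadcast_request())
```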
In Sentry Performance, confirm that a new transaction/span appears for the request and that it carries your custom trace metadata keys (here, trace_name and release).
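As the example config suggests, the otlp_traces_endpoint shares its host and project ID with the DSN, so one can be derived from the other; a hedged sketch, assuming Sentry keeps the path pattern shown above:

```python
from urllib.parse import urlparse

def otlp_endpoint_from_dsn(dsn: str) -> str:
    """Derive the OTLP traces endpoint from a Sentry DSN, assuming the
    path pattern shown in broadcast-sentry-config.json."""
    parsed = urlparse(dsn)                 # DSN: https://<public_key>@<host>/<project_id>
    project_id = parsed.path.lstrip("/")
    return f"https://{parsed.hostname}/api/{project_id}/integration/otlp/v1/traces"

# otlp_endpoint_from_dsn("https://abc123@o123.ingest.us.sentry.io/456")
# → "https://o123.ingest.us.sentry.io/api/456/integration/otlp/v1/traces"
```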