Langfuse

Capture production LLM traces and evaluation context in Langfuse

Langfuse is a strong default for prompt and trace management. TheRouter.ai forwards trace hierarchies, user/session fields, and cost/token metrics to your Langfuse project.

broadcast-langfuse-config.json
{
  "destination": "langfuse",
  "enabled": true,
  "secret_key": "sk-lf-...",
  "public_key": "pk-lf-...",
  "base_url": "https://us.cloud.langfuse.com",
  "sampling_rate": 1
}
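Before wiring the file into production, it can help to sanity-check the fields. The sketch below validates the sample config; the key-prefix and range checks are assumptions based on Langfuse's usual `sk-lf-`/`pk-lf-` key format, not documented TheRouter.ai constraints.

```python
# Minimal validation sketch for the broadcast config above.
# The allowed ranges and key prefixes are assumptions, not
# documented constraints of TheRouter.ai.

REQUIRED_KEYS = {"destination", "secret_key", "public_key", "base_url"}

def validate_langfuse_config(cfg: dict) -> list[str]:
    """Return a list of problems found in a Langfuse broadcast config."""
    problems = []
    missing = REQUIRED_KEYS - cfg.keys()
    if missing:
        problems.append(f"missing keys: {sorted(missing)}")
    if cfg.get("destination") != "langfuse":
        problems.append("destination must be 'langfuse'")
    if not str(cfg.get("secret_key", "")).startswith("sk-lf-"):
        problems.append("secret_key should start with 'sk-lf-'")
    if not str(cfg.get("public_key", "")).startswith("pk-lf-"):
        problems.append("public_key should start with 'pk-lf-'")
    rate = cfg.get("sampling_rate", 1)
    if not (0 <= rate <= 1):
        problems.append("sampling_rate must be between 0 and 1")
    return problems

config = {
    "destination": "langfuse",
    "enabled": True,
    "secret_key": "sk-lf-...",
    "public_key": "pk-lf-...",
    "base_url": "https://us.cloud.langfuse.com",
    "sampling_rate": 1,
}
print(validate_langfuse_config(config))  # → []
```

A `sampling_rate` of 1 broadcasts every request; lower it to sample a fraction of traffic.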
cURL
curl https://api.therouter.ai/v1/chat/completions \
  -H "Authorization: Bearer $THEROUTER_API_KEY" \
  -H "Content-Type: application/json" \
  -H "X-TheRouter.ai-Broadcast: true" \
  -d '{
    "model": "openai/gpt-4.1",
    "messages": [{"role": "user", "content": "Summarize this transcript"}],
    "session_id": "sess_1001",
    "trace": {"trace_name": "Langfuse Smoke Test", "generation_name": "Summary Step"}
  }'
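The same request can be sketched in Python using only the standard library. The header name and payload fields mirror the cURL example above; the send step is left commented so the request shape is easy to inspect first.

```python
import json
import os

def build_broadcast_request(api_key: str) -> tuple[str, dict, bytes]:
    """Assemble the broadcast request from the cURL example above."""
    url = "https://api.therouter.ai/v1/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
        # Opts this request into broadcasting to configured destinations.
        "X-TheRouter.ai-Broadcast": "true",
    }
    payload = {
        "model": "openai/gpt-4.1",
        "messages": [{"role": "user", "content": "Summarize this transcript"}],
        "session_id": "sess_1001",
        "trace": {
            "trace_name": "Langfuse Smoke Test",
            "generation_name": "Summary Step",
        },
    }
    return url, headers, json.dumps(payload).encode()

url, headers, body = build_broadcast_request(os.environ.get("THEROUTER_API_KEY", ""))
# To actually send:
# import urllib.request
# resp = urllib.request.urlopen(urllib.request.Request(url, body, headers))
print(url)
```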
Verification
Search Langfuse traces by `trace_name` and confirm the nested generation entry has model, latency, and usage metadata.
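Beyond the UI search, the check can be scripted against Langfuse's public API. The sketch below builds the lookup request; the `/api/public/traces` path, the `name` query filter, and Basic auth with the public/secret key pair follow the Langfuse public API as we understand it, so verify them against your project's API reference.

```python
import base64
import urllib.parse

def build_trace_lookup(base_url: str, public_key: str, secret_key: str,
                       trace_name: str) -> tuple[str, dict]:
    """Build a GET request for traces matching trace_name.

    Langfuse's public API authenticates with HTTP Basic auth using
    public_key:secret_key; the endpoint path and `name` filter are
    assumptions to verify against the Langfuse API reference.
    """
    query = urllib.parse.urlencode({"name": trace_name})
    url = f"{base_url}/api/public/traces?{query}"
    token = base64.b64encode(f"{public_key}:{secret_key}".encode()).decode()
    return url, {"Authorization": f"Basic {token}"}

url, headers = build_trace_lookup(
    "https://us.cloud.langfuse.com", "pk-lf-...", "sk-lf-...",
    "Langfuse Smoke Test",
)
print(url)
```

Fetch the URL with those headers (e.g. via `urllib.request`) and confirm the returned trace contains the nested generation with model, latency, and usage fields.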