Tool Calling
Give models structured tools and execute calls in your application loop
TheRouter.ai normalizes tool calling across providers. You pass OpenAI-compatible tool definitions, the model responds with `tool_calls`, and your application executes each call and sends the results back in a follow-up completion.
Tool definition and choices
tool-call-request.json
{
  "model": "openai/gpt-4.1",
  "messages": [{ "role": "user", "content": "Check weather in Tokyo and Seoul." }],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get current weather by city",
        "parameters": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    }
  ],
  "tool_choice": "auto",
  "parallel_tool_calls": true
}
Execution loop
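When the model elects to use a tool, the assistant message in the response carries a `tool_calls` array rather than text content. The shape below follows the OpenAI-compatible schema; the ID and argument values are made-up examples:

```typescript
// Illustrative assistant message containing a tool call.
// The id and argument values are made-up examples.
const message = {
  role: "assistant" as const,
  content: null,
  tool_calls: [
    {
      id: "call_abc123", // echo this back as tool_call_id in the tool message
      type: "function" as const,
      function: {
        name: "get_weather",
        // arguments arrive as a JSON string, not a parsed object
        arguments: '{"city":"Tokyo"}',
      },
    },
  ],
};

// Parse the argument string before passing it to your tool.
const args = JSON.parse(message.tool_calls[0].function.arguments);
```

Note that `function.arguments` is always a JSON-encoded string, which is why the loop below calls `JSON.parse` before executing the tool.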
TypeScript
const first = await client.chat.completions.create(request);
const message = first.choices[0]?.message;

if (message?.tool_calls?.length) {
  const toolMessages = await Promise.all(
    message.tool_calls.map(async (call) => {
      const args = JSON.parse(call.function.arguments || "{}");
      const result = await runTool(call.function.name, args);
      return {
        role: "tool" as const,
        tool_call_id: call.id,
        content: JSON.stringify(result),
      };
    })
  );

  const second = await client.chat.completions.create({
    ...request,
    messages: [...request.messages, message, ...toolMessages],
  });
  console.log(second.choices[0]?.message?.content);
}
Support matrix guidance
Check the models catalog and filter for `tools` support before forcing tool-choice behavior. Some models accept tool definitions but do not support parallel calls; for those, disable or omit `parallel_tool_calls`.
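A catalog check can be automated before enabling tools for a given model. The `supported_parameters` field and the example model IDs below are assumptions about the catalog's shape, not a documented schema; adapt the filter to whatever the actual catalog returns:

```typescript
// Assumed catalog entry shape: a model id plus a list of
// supported request parameters. Verify against the real schema.
interface CatalogModel {
  id: string;
  supported_parameters: string[];
}

// Split the catalog into models that accept tool definitions at all,
// and the subset that also supports parallel tool calls.
function toolCapable(models: CatalogModel[]) {
  const withTools = models.filter((m) =>
    m.supported_parameters.includes("tools")
  );
  return {
    tools: withTools.map((m) => m.id),
    parallel: withTools
      .filter((m) => m.supported_parameters.includes("parallel_tool_calls"))
      .map((m) => m.id),
  };
}
```

Running the filter once at startup and caching the result avoids per-request catalog lookups.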
Key parameters
Use `tool_choice: "auto"` to let the model decide, `"none"` to disable tool use, or force a specific function by name when deterministic behavior is required.
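Forcing a specific function takes an object rather than a string, following the OpenAI-compatible schema. A minimal sketch (other request fields elided for brevity):

```typescript
// Force the model to call get_weather instead of choosing freely.
const forced = {
  tool_choice: {
    type: "function" as const,
    function: { name: "get_weather" },
  },
  // When targeting a model that supports tools but not parallel
  // calls, disable parallelism explicitly.
  parallel_tool_calls: false,
};
```

Merge these fields into the request from the first example; forcing a function guarantees a `tool_calls` entry for it in the response, so the execution loop's branch always runs.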