Tool calling

Let models call functions in your application.

Tool calling lets models invoke functions you define. Gateway passes your tool definitions to the model, the model suggests which tool to call with what arguments, and your application executes it and sends back the result. This works consistently across vendors, with capability checks applied to the exact execution route before the request is sent upstream.

Define tools

Tools are defined as function objects with a name, description, and a JSON Schema for the parameters.

{
  "type": "function",
  "name": "get_weather",
  "description": "Get the current weather for a location.",
  "parameters": {
    "type": "object",
    "properties": {
      "location": { "type": "string", "description": "City name" }
    },
    "required": ["location"]
  }
}
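Your application supplies the matching implementation; Gateway never executes tools itself. A minimal stub for the schema above might look like this (the return shape is illustrative, not part of the Gateway contract):

```python
def get_weather(location: str) -> dict:
    """Toy implementation matching the get_weather schema above.

    A real version would call a weather service; the fixed values
    here are placeholders for illustration.
    """
    return {"location": location, "temperature_f": 72, "condition": "sunny"}
```

Whatever this function returns is what you later send back as the tool result.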

Send a request with tools

from merge_gateway import MergeGateway

client = MergeGateway(api_key="YOUR_API_KEY")

tools = [
    {
        "type": "function",
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    }
]

response = client.responses.create(
    model="openai/gpt-5.1",
    input=[
        {"type": "message", "role": "user", "content": "What's the weather in San Francisco?"},
    ],
    tools=tools,
    tool_choice="auto",
)

Handle the response

When the model calls a tool, the assistant message in the response has finish_reason: "tool_use" and contains a tool_use content block.

{
  "output": [
    {
      "role": "assistant",
      "finish_reason": "tool_use",
      "content": [
        {
          "type": "tool_use",
          "id": "call_abc123",
          "name": "get_weather",
          "input": { "location": "San Francisco" }
        }
      ]
    }
  ]
}
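Parsing a response like the one above amounts to filtering the assistant message's content for tool_use blocks. A small sketch over plain dicts (the SDK exposes the same fields as attributes):

```python
def extract_tool_calls(response: dict) -> list[dict]:
    """Collect every tool_use content block from a response payload
    shaped like the example above."""
    calls = []
    for message in response.get("output", []):
        for block in message.get("content", []):
            if block.get("type") == "tool_use":
                calls.append(block)
    return calls
```

Each returned block carries the id, name, and input you need to execute the call and report the result.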

Send tool results

After executing the function, send the result back with a tool_result input to continue the conversation.

Python
# 1. Extract the tool call from the response
tool_call = response.output[0].content[0]

# 2. Execute your function
weather_data = get_weather(tool_call.input["location"])

# 3. Send the result back
follow_up = client.responses.create(
    model="openai/gpt-5.1",
    input=[
        {"type": "message", "role": "user", "content": "What's the weather in San Francisco?"},
        {"type": "message", "role": "assistant", "content": [
            {"type": "tool_use", "id": tool_call.id, "name": tool_call.name, "input": tool_call.input},
        ]},
        {"type": "tool_result", "tool_use_id": tool_call.id, "content": weather_data},
    ],
    tools=tools,
)

print(follow_up.output[0].content[0].text)

Tool choice

Control whether and how the model uses tools.

Value | Behavior
"auto" | Model decides whether to call a tool (default)
"none" | Model will not call any tools
"required" | Model must call at least one tool
{"type": "function", "function": {"name": "get_weather"}} | Model must call the specified tool
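The string values can be passed as-is; only forcing a specific tool needs the object form. A small helper (hypothetical, not part of the SDK) that builds the right value either way:

```python
from typing import Optional


def make_tool_choice(name: Optional[str] = None, mode: str = "auto"):
    """Build a tool_choice value: the object form when forcing a named
    tool, otherwise one of the mode strings ("auto", "none", "required")."""
    if name is not None:
        return {"type": "function", "function": {"name": name}}
    if mode not in ("auto", "none", "required"):
        raise ValueError(f"unknown tool_choice mode: {mode}")
    return mode
```

For example, `make_tool_choice("get_weather")` produces the object form shown in the last table row.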

OpenAI SDK

Tool calling works through the OpenAI SDK; point it at Gateway by changing the base URL and API key.

OpenAI SDK
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api-gateway.merge.dev/v1/openai",
)

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[
        {"role": "user", "content": "What's the weather in San Francisco?"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ],
    tool_choice="auto",
)

AI SDK (Vercel)

Tool calling works with the Vercel AI SDK using the tool() helper and Zod schemas.

AI SDK
import { createOpenAI } from "@ai-sdk/openai";
import { generateText, tool } from "ai";
import { z } from "zod";

const gateway = createOpenAI({
  apiKey: "YOUR_API_KEY",
  baseURL: "https://api-gateway.merge.dev/v1/ai-sdk",
});

const { text, toolResults } = await generateText({
  model: gateway("openai/gpt-4o"),
  prompt: "What's the weather in San Francisco?",
  tools: {
    getWeather: tool({
      description: "Get the current weather for a location.",
      parameters: z.object({
        location: z.string().describe("City name"),
      }),
      execute: async ({ location }) => {
        return { temperature: 72, condition: "sunny" };
      },
    }),
  },
});

FAQ

How do I check whether a model supports tool calling?

Use GET /v1/models and inspect vendors.<vendor>.capabilities.supports_tool_calling for the exact route you plan to use.

Can the model call more than one tool in a single response?

Yes. The response may contain multiple tool_use content blocks. Send a tool_result for each one before continuing the conversation.
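Executing each call and pairing it with a tool_result might look like the following sketch, where the dispatch table mapping tool names to functions is your own (illustrative) code:

```python
def run_tool_calls(tool_calls: list[dict], handlers: dict) -> list[dict]:
    """Execute each tool_use block via a name -> function dispatch table
    and build the matching tool_result inputs for the follow-up request."""
    results = []
    for call in tool_calls:
        output = handlers[call["name"]](**call["input"])
        results.append({
            "type": "tool_result",
            "tool_use_id": call["id"],
            "content": output,
        })
    return results
```

Append the returned entries to the input list, after the assistant message that contained the tool_use blocks.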

Does Gateway validate tool arguments against my schema?

No. Tool parameter schemas are passed directly to the provider. Validation is handled by the model and your application.
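If you want a guard before executing, you can check the model's arguments yourself. A minimal sketch covering required keys and basic types (a full JSON Schema validator such as the jsonschema package would be more thorough):

```python
# Map JSON Schema type names to Python types for a shallow check.
_TYPES = {
    "string": str,
    "integer": int,
    "number": (int, float),
    "boolean": bool,
    "object": dict,
    "array": list,
}


def check_arguments(schema: dict, args: dict) -> list[str]:
    """Return a list of problems with args against a parameters schema
    like the get_weather one above; an empty list means it passed."""
    problems = []
    for key in schema.get("required", []):
        if key not in args:
            problems.append(f"missing required argument: {key}")
    for key, value in args.items():
        prop = schema.get("properties", {}).get(key)
        if prop is None:
            problems.append(f"unexpected argument: {key}")
        elif not isinstance(value, _TYPES.get(prop.get("type"), object)):
            problems.append(f"wrong type for {key}")
    return problems
```

Run it on tool_call.input before dispatching, and return an error message as the tool_result if it fails, so the model can retry.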

Capability checks

Gateway validates tool support against the exact vendor route that will serve the request. Use GET /v1/models and inspect vendors.<vendor>.capabilities.supports_tool_calling and vendors.<vendor>.capabilities.supports_tool_choice when deciding whether to send tools or a specific tool_choice value.
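Reading those flags from a model entry is a straightforward dictionary walk. A sketch over the documented key path (the shape of the payload beyond those keys is assumed):

```python
def supports_tools(model_entry: dict, vendor: str) -> bool:
    """Check vendors.<vendor>.capabilities.supports_tool_calling in a
    GET /v1/models entry; missing keys count as unsupported."""
    caps = model_entry.get("vendors", {}).get(vendor, {}).get("capabilities", {})
    return bool(caps.get("supports_tool_calling"))
```

Defaulting missing keys to unsupported errs on the safe side: you simply omit tools for routes you cannot confirm.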