Tool calling

Let models call functions in your application.

Tool calling lets models invoke functions you define. Gateway passes your tool definitions to the model, the model suggests which tool to call with what arguments, and your application executes it and sends back the result. This works identically across all providers — Gateway handles format translation.

Define tools

Tools are defined as function objects with a name, description, and a JSON Schema for the parameters.

```json
{
  "type": "function",
  "name": "get_weather",
  "description": "Get the current weather for a location.",
  "parameters": {
    "type": "object",
    "properties": {
      "location": { "type": "string", "description": "City name" }
    },
    "required": ["location"]
  }
}
```
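Since the schema travels with every request, it can also drive a lightweight app-side sanity check before you execute a tool call. The helper below is a minimal sketch, not part of the Gateway API: it only checks the schema's `required` list and property names, and `WEATHER_TOOL` simply restates the definition above as a Python dict.

```python
# Illustrative app-side check of tool-call arguments against the
# schema's "required" list and declared properties. Not part of
# Gateway; full validation would need a JSON Schema library.

WEATHER_TOOL = {
    "type": "function",
    "name": "get_weather",
    "description": "Get the current weather for a location.",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {"type": "string", "description": "City name"},
        },
        "required": ["location"],
    },
}

def check_arguments(tool: dict, arguments: dict) -> list[str]:
    """Return a list of problems; an empty list means the arguments look valid."""
    schema = tool["parameters"]
    problems = []
    for name in schema.get("required", []):
        if name not in arguments:
            problems.append(f"missing required argument: {name}")
    for name in arguments:
        if name not in schema.get("properties", {}):
            problems.append(f"unexpected argument: {name}")
    return problems
```

For example, `check_arguments(WEATHER_TOOL, {"location": "San Francisco"})` returns an empty list, while an empty arguments dict reports the missing `location`.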

Send a request with tools

```python
from merge_gateway import MergeGateway

client = MergeGateway(api_key="YOUR_API_KEY")

tools = [
    {
        "type": "function",
        "name": "get_weather",
        "description": "Get the current weather for a location.",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City name"},
            },
            "required": ["location"],
        },
    }
]

response = client.responses.create(
    model="openai/gpt-5.1",
    input=[
        {"type": "message", "role": "user", "content": "What's the weather in San Francisco?"},
    ],
    tools=tools,
    tool_choice="auto",
)
```

Handle the response

When the model calls a tool, the response contains a tool_use content block, and the message's finish_reason is "tool_use".

```json
{
  "output": [
    {
      "role": "assistant",
      "finish_reason": "tool_use",
      "content": [
        {
          "type": "tool_use",
          "id": "call_abc123",
          "name": "get_weather",
          "input": { "location": "San Francisco" }
        }
      ]
    }
  ]
}
```
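Responses can mix text and tool_use blocks, so it is worth collecting tool calls explicitly rather than indexing into `content[0]`. This sketch walks the dict layout shown above; if your SDK returns objects instead of dicts, adjust the attribute access accordingly.

```python
# Collect every tool_use block from a Gateway-style response dict.
# The sample response mirrors the JSON example above.

def extract_tool_calls(response: dict) -> list[dict]:
    calls = []
    for message in response.get("output", []):
        for block in message.get("content", []):
            if block.get("type") == "tool_use":
                calls.append(block)
    return calls

response = {
    "output": [
        {
            "role": "assistant",
            "finish_reason": "tool_use",
            "content": [
                {
                    "type": "tool_use",
                    "id": "call_abc123",
                    "name": "get_weather",
                    "input": {"location": "San Francisco"},
                }
            ],
        }
    ]
}

calls = extract_tool_calls(response)
# calls[0]["name"] == "get_weather"
```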

Send tool results

After executing the function, send the result back with a tool_result input to continue the conversation.

```python
# 1. Extract the tool call from the response
tool_call = response.output[0].content[0]

# 2. Execute your function
weather_data = get_weather(tool_call.input["location"])

# 3. Send the result back
follow_up = client.responses.create(
    model="openai/gpt-5.1",
    input=[
        {"type": "message", "role": "user", "content": "What's the weather in San Francisco?"},
        {"type": "message", "role": "assistant", "content": [
            {"type": "tool_use", "id": tool_call.id, "name": tool_call.name, "input": tool_call.input},
        ]},
        {"type": "tool_result", "tool_use_id": tool_call.id, "content": weather_data},
    ],
    tools=tools,
)

print(follow_up.output[0].content[0].text)
```
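Step 2 above is often generalized into a small dispatch table so one code path handles every tool. The sketch below is illustrative: `get_weather` is a stub, `TOOL_REGISTRY` is an assumed application-side convention, and the returned dict reuses the tool_result shape shown above.

```python
# Dispatch a tool_use block to a local function and wrap the output
# in a tool_result input item. Registry and stub are hypothetical.

def get_weather(location: str) -> str:
    return f"Sunny, 18°C in {location}"  # stub for the example

TOOL_REGISTRY = {
    "get_weather": lambda args: get_weather(args["location"]),
}

def run_tool(tool_call: dict) -> dict:
    """Execute the named tool and return a tool_result input item."""
    handler = TOOL_REGISTRY[tool_call["name"]]
    result = handler(tool_call["input"])
    return {
        "type": "tool_result",
        "tool_use_id": tool_call["id"],
        "content": result,
    }

tool_call = {
    "id": "call_abc123",
    "name": "get_weather",
    "input": {"location": "San Francisco"},
}
tool_result = run_tool(tool_call)
```

The resulting dict can be appended to `input` in the follow-up request exactly as in the example above.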

Tool choice

Control whether and how the model uses tools.

| Value | Behavior |
| --- | --- |
| `"auto"` | Model decides whether to call a tool (default) |
| `"none"` | Model will not call any tools |
| `"required"` | Model must call at least one tool |
| `{"type": "function", "function": {"name": "get_weather"}}` | Model must call the specified tool |

OpenAI SDK

Tool calling works through the OpenAI SDK; only the base URL and API key need to change. Note that the Chat Completions format nests the tool definition under a function key.

```python
from openai import OpenAI

client = OpenAI(
    api_key="YOUR_API_KEY",
    base_url="https://api-gateway.merge.dev/v1/openai",
)

response = client.chat.completions.create(
    model="gpt-5.1",
    messages=[
        {"role": "user", "content": "What's the weather in San Francisco?"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get the current weather for a location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {"type": "string", "description": "City name"},
                    },
                    "required": ["location"],
                },
            },
        }
    ],
    tool_choice="auto",
)
```
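One difference to watch for: in the Chat Completions format, tool-call arguments arrive as a JSON string rather than a dict, so they must be parsed before dispatch. The dict below mimics the relevant slice of a chat.completions response for illustration.

```python
import json

# Sample of the Chat Completions tool-call shape: note that
# "arguments" is a JSON-encoded string, not a dict.
completion = {
    "choices": [
        {
            "finish_reason": "tool_calls",
            "message": {
                "role": "assistant",
                "tool_calls": [
                    {
                        "id": "call_abc123",
                        "type": "function",
                        "function": {
                            "name": "get_weather",
                            "arguments": "{\"location\": \"San Francisco\"}",
                        },
                    }
                ],
            },
        }
    ]
}

tool_call = completion["choices"][0]["message"]["tool_calls"][0]
args = json.loads(tool_call["function"]["arguments"])
# args["location"] == "San Francisco"
```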

FAQ

Which models support tool calling?

Any model with capabilities.tools: true supports tool calling. Check via GET /v1/models or the Gateway dashboard. Most recent models from OpenAI, Anthropic, and Google support it.

Can the model call multiple tools in one response?

Yes. The response may contain multiple tool_use content blocks. Send a tool_result for each one before continuing the conversation.
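Handling parallel calls is a matter of building one tool_result per tool_use block, keeping the ids paired. A minimal sketch, where `execute_tool` stands in for your real dispatch logic:

```python
# One tool_result per tool_use block; execute_tool is a stand-in
# for your application's real tool dispatcher.

def execute_tool(name: str, arguments: dict) -> str:
    return f"{name} called with {arguments}"  # stub for the example

content = [
    {"type": "tool_use", "id": "call_1", "name": "get_weather", "input": {"location": "Paris"}},
    {"type": "tool_use", "id": "call_2", "name": "get_weather", "input": {"location": "Tokyo"}},
]

tool_results = [
    {
        "type": "tool_result",
        "tool_use_id": block["id"],
        "content": execute_tool(block["name"], block["input"]),
    }
    for block in content
    if block["type"] == "tool_use"
]
```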

Does Gateway validate tool parameter schemas?

No. Tool parameter schemas are passed directly to the provider. Validation is handled by the model and your application.