Continue.dev

Route Continue's chat, edit, and apply requests through Merge Gateway from VS Code or JetBrains.

Continue.dev accepts any OpenAI-compatible endpoint via a config.yaml model entry. Add one block pointed at Merge Gateway and every Gateway model (OpenAI, Anthropic, Google, Kimi, and more) becomes available across Continue’s chat, edit, and apply flows in VS Code or JetBrains.

Before you start

  • Grab an API key from gateway.merge.dev/settings/api-keys.
  • Decide which Continue roles (chat, edit, apply, summarize, autocomplete, embed, rerank) you want each model to serve.

Configure Continue

1. Open your Continue config

Open ~/.continue/config.yaml (create it if it doesn’t exist). In VS Code, you can also open it from the Continue side-panel → gear icon → Configure.

2. Add a Gateway model entry

Each Gateway model is one entry under models:. Use provider: openai (Continue’s OpenAI-compatible driver) with apiBase pointed at Gateway.

~/.continue/config.yaml

```yaml
name: My Config
version: 0.0.1
schema: v1
models:
  - name: Claude Opus 4.6 (Merge)
    provider: openai
    model: anthropic/claude-opus-4.6
    apiBase: https://api-gateway.merge.dev/v1/openai
    apiKey: ${{ secrets.MERGE_GATEWAY_API_KEY }}
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use
      - image_input

  - name: GPT-5.2 (Merge)
    provider: openai
    model: openai/gpt-5.2
    apiBase: https://api-gateway.merge.dev/v1/openai
    apiKey: ${{ secrets.MERGE_GATEWAY_API_KEY }}
    roles:
      - chat
      - edit
      - apply
    capabilities:
      - tool_use
      - image_input
```

Continue supports ${{ secrets.NAME }} interpolation. Set MERGE_GATEWAY_API_KEY in the Continue Hub secrets (or inline as a literal for local-only setups).
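For a local-only setup, the same entry also works with the key pasted in as a literal. A sketch (the key value is a placeholder; keep the file out of version control if you do this):

```yaml
# Local-only alternative: literal key instead of ${{ secrets.… }}
# Don't commit this file with a real key in it.
- name: Claude Opus 4.6 (Merge)
  provider: openai
  model: anthropic/claude-opus-4.6
  apiBase: https://api-gateway.merge.dev/v1/openai
  apiKey: your-gateway-api-key-here   # placeholder value
```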

3. Reload Continue and select the model

Reload the Continue extension (VS Code: command palette → Developer: Reload Window; JetBrains: restart the IDE). Open the Continue panel, pick one of your new Gateway models, and send a test message. The request will show up in your Gateway dashboard.
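If the test message fails, it helps to confirm the key and endpoint outside the editor first. A sketch of a direct smoke test, assuming the Gateway route follows the standard OpenAI-style chat completions path (swap the model slug for one from your config.yaml):

```shell
# Smoke-test the same route Continue will call.
# Assumes an OpenAI-compatible /chat/completions path under the Gateway base URL.
curl -s https://api-gateway.merge.dev/v1/openai/chat/completions \
  -H "Authorization: Bearer $MERGE_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "anthropic/claude-opus-4.6", "messages": [{"role": "user", "content": "ping"}]}'
```

A JSON completion back means the key and base URL are good, and any remaining problem is in the Continue config itself.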

Caveats

Continue’s roles list is a hint. Actual behavior depends on the model’s capabilities on the vendor route Gateway picks. Before assigning a model to autocomplete, confirm it’s a fast model (usually a smaller one) and that the vendor supports streaming.

Add tool_use to capabilities only if vendors.<vendor>.capabilities.supports_tool_calling is true for the route Gateway will use. Check via GET /v1/models.
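That capability check is easy to script against the /v1/models response. A sketch, assuming each model record nests per-vendor flags under vendors.&lt;vendor&gt;.capabilities as described above (the sample payload is illustrative, not real API output):

```python
# List the vendor routes for a model that advertise tool-calling support.
# Assumes the response shape described in the caveat:
#   vendors.<vendor>.capabilities.supports_tool_calling
# The sample record below is illustrative, not real /v1/models output.

def tool_calling_vendors(model_record: dict) -> list[str]:
    """Return vendor names whose route sets supports_tool_calling."""
    return [
        vendor
        for vendor, info in model_record.get("vendors", {}).items()
        if info.get("capabilities", {}).get("supports_tool_calling")
    ]

sample = {
    "id": "anthropic/claude-opus-4.6",
    "vendors": {
        "anthropic": {"capabilities": {"supports_tool_calling": True}},
        "bedrock": {"capabilities": {"supports_tool_calling": False}},
    },
}

print(tool_calling_vendors(sample))  # vendors safe to pair with tool_use
```

Only add tool_use to the Continue entry when every vendor Gateway might route to shows up in that list.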

You can wire dedicated models for autocomplete (e.g. a small fast model), embed, or rerank as separate entries. Continue will use each model for the role it’s assigned to. Gateway’s /v1/openai/embeddings endpoint is available for the embed role.
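Dedicated role entries follow the same pattern as the chat models. A sketch, with illustrative model slugs (use whatever GET /v1/models actually lists for your Gateway):

```yaml
models:
  - name: Fast Autocomplete (Merge)
    provider: openai
    model: openai/gpt-5.2                  # illustrative slug; pick a small, fast model
    apiBase: https://api-gateway.merge.dev/v1/openai
    apiKey: ${{ secrets.MERGE_GATEWAY_API_KEY }}
    roles:
      - autocomplete

  - name: Embeddings (Merge)
    provider: openai
    model: openai/text-embedding-3-small   # illustrative slug
    apiBase: https://api-gateway.merge.dev/v1/openai
    apiKey: ${{ secrets.MERGE_GATEWAY_API_KEY }}
    roles:
      - embed
```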

If you need to pin a routing project or attach tags, use Continue’s requestOptions.headers to send X-Project-Id or X-Merge-Tags.

```yaml
requestOptions:
  headers:
    X-Project-Id: prj_abc123
```
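In context, requestOptions sits on the individual model entry. A sketch with both headers (the project id and tag values are placeholders, and the comma-separated tag format is an assumption):

```yaml
- name: Claude Opus 4.6 (Merge)
  provider: openai
  model: anthropic/claude-opus-4.6
  apiBase: https://api-gateway.merge.dev/v1/openai
  apiKey: ${{ secrets.MERGE_GATEWAY_API_KEY }}
  requestOptions:
    headers:
      X-Project-Id: prj_abc123        # placeholder project id
      X-Merge-Tags: team-a,editor     # placeholder tags; format assumed
```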

Next steps