Integrations · Continue
Run task-compiled AI inside Continue (VS Code and JetBrains).
One command writes the kolm model entry into Continue's config.json. Reload the editor. Your compiled recipe shows up in the Continue model picker alongside whatever else you have set up. Local. Reproducible. Offline-capable.
Step 01
Install the integration.
The installer locates ~/.continue/config.json and adds a kolm model entry plus an MCP server entry. Existing models in your config stay in place, and the install is idempotent: re-running it changes nothing.
kolm install continue --apply
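A typical run, assuming the kolm CLI is already on your PATH; command -v just shows the binary whose absolute path the installer records.
command -v kolm                  # confirm the binary; its absolute path is what the installer writes
kolm install continue --apply    # merge the model + MCP entries into ~/.continue/config.json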
Step 02
Reload the editor.
Continue watches ~/.continue/config.json and hot-reloads on save. If your editor doesn't pick it up, run the Continue: Reload Window command. The kolm model appears in the picker.
# in VS Code > Continue: Reload Window
# or just quit and relaunch the editor
Step 03
Pick the kolm model.
Open the Continue panel. Pick the kolm model from the dropdown. Continue routes your prompts through the kolm MCP server. Recipes execute locally against the .kolm artifact. Zero outbound API calls on the recipe path.
# click the model picker in Continue
# select "kolm (your-recipe-id)"
# ask anything; the recipe runs locally
What lands in ~/.continue/config.json
Two entries: a model the picker can select, and an MCP server the agent can call as a tool. Everything else in your config stays untouched. The command field holds the absolute path of the kolm binary, so PATH changes won't break the setup.
~/.continue/config.json
{
  "models": [
    {
      "title": "kolm (local)",
      "provider": "openai",
      "model": "kolm",
      "apiBase": "http://127.0.0.1:8765/v1",
      "apiKey": "local"
    }
  ],
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "/usr/local/bin/kolm",
          "args": ["serve", "--mcp"]
        }
      }
    ]
  }
}
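To audit the result without opening the file, jq pulls both entries out, if you have it installed; the field names match the snippet above.
jq '.models[] | select(.model == "kolm")' ~/.continue/config.json
jq '.experimental.modelContextProtocolServers' ~/.continue/config.json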
What you get.
Model picker entry.
The kolm recipe shows up in the same dropdown as your hosted models. Switching is one click. Continue calls it over the OpenAI-compatible local endpoint.
MCP tool surface.
Continue agents can call kolm.run, kolm.compile, kolm.inspect, and kolm.query directly. Same tools as Claude Code and Cursor.
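The stdio command in the config above is that same server. If a tool call fails, launching it by hand is a quick sanity check that the path and flags resolve; it waits for MCP messages on stdin, so Ctrl-C to exit.
/usr/local/bin/kolm serve --mcp    # what Continue launches over stdio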
VS Code and JetBrains.
The Continue extension covers both. Same config file. Same install command. No JetBrains-specific setup.
Local OpenAI endpoint.
The kolm process binds to 127.0.0.1:8765 with an OpenAI-compatible /v1 shape. Useful if you want to point other tools at it.
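A quick smoke test from the terminal, assuming the endpoint exposes the standard /v1/chat/completions route:
curl -s http://127.0.0.1:8765/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "kolm", "messages": [{"role": "user", "content": "ping"}]}'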
Hot reload.
Continue watches the config file. After install you usually don't need to restart at all. The model just appears.
Dry-run first.
Without --apply the installer prints the JSON it would merge into your config. Audit it. Then re-run with the flag.
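The preview step is the same command, minus the flag:
kolm install continue    # no --apply: prints the JSON merge, writes nothing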
Ready to compile.
If the CLI isn't on your machine yet, the quickstart gets you to a compiled recipe in one page. The spec page is the canonical reference for the MCP surface kolm exposes.