Integrations · VS Code

Run your .kolm inside VS Code in 60 seconds.

One command writes the MCP server entry. Restart VS Code. Your compiled AI shows up as a tool the Copilot agent can call. The model runs on your machine. No API calls leave the box.

shipped · MCP · stdio · macOS · Linux · Windows · one command

Step 01

Install the integration.

One command. The installer detects VS Code, writes the MCP server entry to the user-scope mcp.json config, and pins the kolm binary by absolute path. Re-running is idempotent.

kolm install vscode --apply

Step 02

Restart VS Code.

VS Code (1.99+) reads the user MCP config at startup. After restart your compiled model shows up as an MCP tool inside Copilot Chat's agent panel. No further configuration. No keys to paste.

# macOS
osascript -e 'quit app "Visual Studio Code"' && open -a "Visual Studio Code"
# Linux / Windows
# just close and reopen VS Code
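Not sure your build is new enough? `code --version` prints the version on its first line. A minimal sketch of the 1.99 gate, comparing major/minor numerically so that 1.100 sorts after 1.99:

```python
# Sketch: gate on the VS Code version string, e.g. the first line of
# `code --version` ("1.99.0"). Compare major/minor as integers, since a
# plain string compare would put "1.100" before "1.99".
def supports_user_mcp(version: str) -> bool:
    major, minor = (int(p) for p in version.split(".")[:2])
    return (major, minor) >= (1, 99)

print(supports_user_mcp("1.99.0"))   # -> True
print(supports_user_mcp("1.98.2"))   # -> False
```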

Step 03

Use it. The model runs locally.

Ask Copilot Chat (agent mode) to call your kolm tool. The recipe runs in-process. Zero outbound API calls. The editor invokes the same byte-exact .kolm artifact you compiled on your laptop.

# in Copilot Chat (agent mode)
> #kolm.run support-triage classify this ticket
> #kolm.run pr-review score this PR
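Under the hood this is plain MCP: newline-delimited JSON-RPC 2.0 on the server's stdin/stdout. A rough sketch of the two messages involved; the tool name `kolm.run` and its argument names (`recipe`, `input`) are assumptions inferred from the chat syntax above, not a published schema:

```python
import json

# Sketch of the MCP wire traffic between VS Code and the kolm server.
# Framing: one JSON object per message, terminated by a newline.
def jsonrpc(method: str, params: dict, msg_id: int) -> str:
    """Frame one MCP stdio message as newline-delimited JSON-RPC 2.0."""
    return json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params}) + "\n"

# 1) Handshake: the client announces itself and a protocol version.
init = jsonrpc("initialize", {"protocolVersion": "2024-11-05",
                              "capabilities": {},
                              "clientInfo": {"name": "demo", "version": "0.0.1"}}, 1)

# 2) Tool call: roughly what '#kolm.run support-triage classify this
#    ticket' becomes on the wire (argument names are hypothetical).
call = jsonrpc("tools/call", {"name": "kolm.run",
                              "arguments": {"recipe": "support-triage",
                                            "input": "classify this ticket"}}, 2)
print(init + call, end="")
```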

What lands in the VS Code MCP config

Open the file after running the installer. This is the entire change. No telemetry. No phone-home. The command is the absolute path of the kolm binary the installer detected, so the entry stays valid even if your PATH changes.

user-scope mcp.json (Settings · MCP: User Settings)

{
  "servers": {
    "kolm": {
      "type": "stdio",
      "command": "/usr/local/bin/kolm",
      "args": ["serve", "--mcp"],
      "env": {
        "KOLM_HOME": "~/.kolm"
      }
    }
  }
}
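The entry is small enough to lint by hand if you want to sanity-check what the installer wrote. A quick sketch; the field expectations mirror the example above, and the absolute-path check is POSIX-style (a pinned Windows command would start with a drive letter instead):

```python
import json

# Sketch: lint the kolm server entry from the user-scope mcp.json.
# Returns a list of problems; an empty list means the entry looks right.
def lint_kolm_entry(cfg: dict) -> list:
    srv = cfg.get("servers", {}).get("kolm")
    if srv is None:
        return ["no 'kolm' entry under 'servers'"]
    problems = []
    if srv.get("type") != "stdio":
        problems.append("type should be 'stdio'")
    if not srv.get("command", "").startswith("/"):
        problems.append("command should be an absolute path")
    if srv.get("args") != ["serve", "--mcp"]:
        problems.append("args should be ['serve', '--mcp']")
    return problems

sample = json.loads('''{"servers": {"kolm": {
    "type": "stdio",
    "command": "/usr/local/bin/kolm",
    "args": ["serve", "--mcp"]}}}''')
print(lint_kolm_entry(sample))   # -> []
```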

What you get.

Local execution.

The MCP server runs in-process over stdio. No daemon. No HTTP unless you opt in with --http on port 8765.

Zero outbound calls.

VS Code talks to the kolm process. The kolm process executes the recipe against your local model. Nothing leaves the machine.

Reproducible.

Every tool call ships _kolm metadata (artifact hash, recipe id, K-score). VS Code logs it. You can replay any call.
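Replay starts with trust: the logged artifact hash should match the .kolm on disk. A sketch of that check; SHA-256 and the field names (`artifact_hash`, `recipe_id`, `k_score`) are assumptions based on the description above, so check your logs for the real schema:

```python
import hashlib

# Sketch: before replaying a logged tool call, confirm the artifact on
# disk is the one that produced it. Hash algorithm and field names are
# assumptions, not the documented _kolm schema.
def artifact_matches(meta: dict, artifact_bytes: bytes) -> bool:
    """True if the logged hash matches the local .kolm bytes."""
    return meta.get("artifact_hash") == hashlib.sha256(artifact_bytes).hexdigest()

artifact = b"stand-in for the compiled .kolm bytes"
logged = {"_kolm": {"artifact_hash": hashlib.sha256(artifact).hexdigest(),
                    "recipe_id": "support-triage",
                    "k_score": 0.97}}
print(artifact_matches(logged["_kolm"], artifact))   # -> True
```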

Idempotent install.

Re-running kolm install vscode --apply updates the entry in place. No duplicate servers. No drift.

Works offline.

Once the .kolm is on disk, the integration runs with no internet. Useful on a plane. Useful in a locked-down VPC.

Dry-run first.

Drop the --apply and the installer prints what it would write. Audit it. Then re-run with --apply.

Ready to compile.

If you don't have the CLI yet, the quickstart is one page. If you want the MCP surface itself, the spec page lists every tool the kolm server exposes.