A compiled .kolm answers today's task. kolm tune and kolm rag are what make it answer tomorrow's: every production run becomes a training signal, every promotion goes through the same K-score gate that gated the compile, and every lookup hits a local BM25 index that knows your policies, your runbooks, your wiki. The base model never moves. Nothing phones home.
Six verbs are the whole pipeline. init writes a zero-init LoRA so the cold start is identity to the base. capture-on tells the runner to record every successful call. step trains a candidate. eval recomputes K-score. promote flips HEAD only if the candidate clears the gate. rollback swaps HEAD with the snapshot. Each step is local, deterministic, logged.
Skeleton LoRA at rank 8, alpha 16. Behaves exactly like the base on day one. Cold start is identity, not surprise.
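Why zero-init means identity: LoRA adds a low-rank delta (α/r)·B·A to each base weight, and if B starts at zero the delta is zero everywhere. A toy sketch of that arithmetic (not kolm's trainer, just the math):

```javascript
// Illustrative only: a zero-init LoRA adapter changes nothing.
// deltaW = (alpha / r) * B * A; with B all zeros, deltaW is all zeros.
function matmul(B, A) {
  const out = Array.from({ length: B.length }, () => Array(A[0].length).fill(0));
  for (let i = 0; i < B.length; i++)
    for (let k = 0; k < A.length; k++)
      for (let j = 0; j < A[0].length; j++)
        out[i][j] += B[i][k] * A[k][j];
  return out;
}

const r = 8, alpha = 16, scale = alpha / r;   // rank 8, alpha 16, as above
const A = [[0.3, -0.1], [0.7, 0.2]];          // A: toy random init (2x2)
const B = [[0, 0], [0, 0]];                   // B: zero init
const deltaW = matmul(B, A).map(row => row.map(x => scale * x));
// W + deltaW === W, so day-one outputs match the base exactly.
```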
Every successful kolm run writes (input, output, recipe, latency_us) to captures.jsonl under ~/.kolm/tune/<slug>/. No upload. No network code touches the file. Failed runs are skipped, bench runs are skipped.
--airgap sets TRANSFORMERS_OFFLINE=1, HF_HUB_OFFLINE=1, and HF_DATASETS_OFFLINE=1, and refuses any base_model that is not an existing local path. Output: a new revision directory, revisions/v(n+1)/.
Same formula that gated the compile: K = 0.40·A + 0.15·(S+L+C+V). The eval set is the artifact's own held-out tests. One ruler.
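The formula as a function. The weights sum to 1.0 (0.40 + 4 × 0.15), so K stays in [0, 1] whenever the five components do; what A, S, L, C, V each measure is whatever the compile gate defined them as:

```javascript
// K = 0.40*A + 0.15*(S + L + C + V), exactly as stated above.
function kScore({ A, S, L, C, V }) {
  return 0.40 * A + 0.15 * (S + L + C + V);
}
```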
HEAD flips to the candidate only if its K-score is at least 0.85 and, when require_improvement is set, at least the current HEAD's K-score. A refusal logs gate_blocked and exits 2 with code K_GATE.
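The gate logic is small enough to state in full. A sketch under the doc's rules (the 0.85 constant and the reason strings come from the line above; the return shape is illustrative):

```javascript
// Promotion gate: candidate must clear the absolute bar, and with
// require_improvement it must also meet or beat the current HEAD.
const K_MIN = 0.85; // threshold stated in the doc

function gate(candidateK, headK, requireImprovement) {
  if (candidateK < K_MIN) return { promote: false, reason: 'gate_blocked' };
  if (requireImprovement && candidateK < headK)
    return { promote: false, reason: 'gate_blocked' };
  return { promote: true, reason: 'ok' };
}
```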
HEAD swaps with head.prev. No retraining, no data loss. If a promotion turns out to underperform in production runs, the prior revision is one command away.
"Airgapped" is not a marketing word here. Every claim below is a line of code your security team can grep for. The orchestrator imports no fetch and no http. The trainer refuses non-local base paths. The bench harness patches network primitives.
The orchestrator sets KOLM_AIRGAP=1, TRANSFORMERS_OFFLINE=1, HF_HUB_OFFLINE=1, HF_DATASETS_OFFLINE=1 in the Python env dict. See src/tune.js:runTuneStepWith.
Before any model load, the trainer checks base_model. If KOLM_AIRGAP=1 and the path contains :// or does not exist, tune-step.py exits with a clean message. No partial load. No DNS lookup.
The JS orchestrator only spawns Python and reads stdout. There is no fetch, no http import in src/tune.js. The only IO is local file IO under ~/.kolm/tune/.
The bench harness patches fetch, http, https, net, tls, dns. Any recipe that tries to call out during eval fails the benchmark, which fails the K-score, which fails promotion.
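One way such a patch can work: replace each primitive with a throwing stub before the recipe executes. A sketch, not kolm's actual harness (the patched method names are assumptions):

```javascript
// Fence off the network: every patched primitive throws on use.
function deny(name) {
  return () => { throw new Error(`network disabled in bench: ${name}`); };
}

function patchNetwork() {
  globalThis.fetch = deny('fetch');
  for (const mod of ['http', 'https', 'net', 'tls', 'dns']) {
    const m = require(mod);
    for (const key of ['request', 'get', 'connect', 'lookup', 'resolve']) {
      if (typeof m[key] === 'function') m[key] = deny(`${mod}.${key}`);
    }
  }
}
```

Because CommonJS caches module objects, patching http.request here poisons it for every later require('http') in the same process, which is exactly what a bench sandbox wants.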
The capture appender in src/tune.js:appendCapture is a pure file-append. No analytics. No telemetry. The training data physically cannot leave the box.
Promotion writes v1 into HEAD; the prior content moves to head.prev. Rollback swaps them. No retraining, no irrecoverable state.
Some facts change too often to bake into weights. Policies. Prices. Schemas. Runbooks. kolm rag ships the simplest thing that actually works: a deterministic, single-file BM25 index built in pure JavaScript. No embedder. No network at index time, query time, or attach time. The whole subsystem fits in ~250 lines.
Walk a directory deterministically, tokenize (lowercase, strip non-alphanumeric, drop stopwords), build an inverted index, write index.json + manifest.json under ~/.kolm/rag/<name>/.
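The tokenize-and-invert core fits in a screen. A sketch under stated assumptions (this stopword list is invented for illustration; kolm's real list and tokenizer may differ):

```javascript
// Deterministic tokenizer + plain inverted index, pure JS, no embedder.
const STOP = new Set(['the', 'a', 'an', 'of', 'to', 'and', 'in', 'is']); // assumed list

function tokenize(text) {
  return text
    .toLowerCase()
    .split(/[^a-z0-9]+/)   // strip non-alphanumeric
    .filter(t => t && !STOP.has(t));
}

function buildIndex(docs) { // docs: [{ id, text }]
  const index = {};         // term -> { docId: termFrequency }
  let totalLen = 0;
  for (const { id, text } of docs) {
    const toks = tokenize(text);
    totalLen += toks.length;
    for (const t of toks) {
      (index[t] = index[t] || {})[id] = (index[t][id] || 0) + 1;
    }
  }
  return { index, avgdl: totalLen / docs.length, N: docs.length };
}
```

Same input, same walk order, same index bytes: that determinism is what makes index.json diffable and auditable.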
```
$ kolm rag index ./docs --name internal-docs
ok indexed 247 docs
  name:  internal-docs
  avgdl: 1195 tokens/doc
  size:  115.8 KB
```
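Ranking at query time is classic BM25, which is one formula per (term, doc) pair. A sketch with the textbook constants k1 = 1.2 and b = 0.75 (assumed; the doc does not state kolm's choices):

```javascript
// BM25 per-term score: idf weight times a saturating term-frequency
// factor, length-normalized against the corpus average doc length.
function bm25Score(termFreq, docLen, avgdl, df, N, k1 = 1.2, b = 0.75) {
  const idf = Math.log(1 + (N - df + 0.5) / (df + 0.5));
  return idf * (termFreq * (k1 + 1)) /
         (termFreq + k1 * (1 - b + b * (docLen / avgdl)));
}
```

A document's score for a query is the sum of this over the query's terms; the saturation in the tf factor is why keyword stuffing stops paying off quickly.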
Attaching writes a sidecar (<art>.kolm.rag.json) next to the artifact. The artifact itself is not re-signed: the receipt stays byte-stable. The runner reads the sidecar at execution time and injects lib.rag.query() into the recipe sandbox.
```
$ kolm rag attach redactor.kolm --index internal-docs
ok wrote redactor.kolm.rag.json
```

```javascript
// inside the recipe:
var hits = lib.rag.query(q, 5).matches;
```
A daemon polls captures.jsonl. When the row count crosses threshold_rows (default 200), it auto-runs step, then eval, then gated promote. Every state change is one JSON line in ~/.kolm/logs/tune.jsonl. The gate refuses bad candidates without your help. The audit trail writes itself.
The .kolm file is signed at compile-time and never modified by the evolve loop. Every piece of evolving state lives in sidecar paths the runner reads at execution time. That is how the model evolves while the artifact's cryptographic receipt stays stable.