kolm / learn
Learn kolm.
Three step-by-step Golden Paths to a working .kolm artifact. Five minutes each. Build, distill, and deploy your first private AI.
golden path 1 . start here
First .kolm in 5 minutes.
Install the CLI, build a signed PII redactor, run it offline, eject and audit. No data leaves your machine. End with one .kolm file on your laptop.
golden path 2 . leave the API
Distill your OpenAI usage into a .kolm.
Capture prompt-response pairs from your live OpenAI traffic, distill them into a seed set, and compile a specialist .kolm that cuts the original's cost by 10× while running on your own hardware.
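The capture step above starts with prompt-response pairs landing in a seed set on disk. A minimal sketch of what that capture might look like, assuming a JSONL layout with "prompt" and "response" fields; the field names and file format are illustrative assumptions, not the documented .kolm seed format:

```python
# Sketch: appending captured prompt-response pairs to a JSONL seed set.
# The "prompt"/"response" keys and JSONL layout are assumptions for
# illustration -- not the documented .kolm seed format.
import json
from pathlib import Path


def append_pair(seed_file: Path, prompt: str, response: str) -> None:
    """Append one captured pair as a single JSONL record."""
    record = {"prompt": prompt, "response": response}
    with seed_file.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record, ensure_ascii=False) + "\n")


def load_seed_set(seed_file: Path) -> list[dict]:
    """Read the seed set back, one record per line, for distillation."""
    with seed_file.open(encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]
```

Appending one record per request keeps capture cheap and crash-safe; the distill step can then stream the file back without loading your whole traffic history into memory.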
golden path 3 . ship to a device
Deploy a .kolm to a phone.
Export the same artifact to iPhone (CoreML), Android (ONNX Mobile), or Apple Silicon (MLX). Forecast size, latency, and fit before shipping. Same K-score, smaller binary.
Companion reading
spec
The .kolm format (RS-1)
What's inside a .kolm: manifest, recipes, evals, and the signature chain.
architecture
Codebase in one sitting
Every module in the shipped runtime, with line counts and trust boundaries.
browse
Public artifact hub
Verified .kolm files to fork. Healthcare, legal, finance, code, edge.
Migration guides
migrating
From OpenPipe
Replace OpenPipe's hosted fine-tunes with one local .kolm. No SaaS bill.
migrating
From Predibase
Predibase adapters become standalone signed artifacts you own.
migrating
From raw LoRA
Wrap your existing PEFT adapter in a .kolm with a verifiable K-score and receipt chain.
Cookbook
Twenty short recipes for things .kolm is uniquely good at: PHI intake triage, legal clause extraction, support ticket classification, code review, voice transcription, multilingual greeters, regulatory keyword flagging, and more. Each recipe ships with a runnable spec, a seed set, and a target K-score.