kolm  /  community  /  hn launch

Show HN: kolm - compile any frontier model into a signed, verifiable artifact.

Template body below. Copy verbatim into the HN submission. Tweak the title only if HN flags the colon. After-post checklist below the body.

Post body.

word count 308 / paste into the comment field

Hi HN - I'm one of the founders of kolm (kolm.ai).

We built kolm because we kept hitting the same wall in shipping AI to regulated customers: every output from a closed-API model is unprovable. A hospital's compliance reviewer asks "what model produced this discharge summary on March 14 at 09:23?" and the honest answer is "the one OpenAI was hosting under gpt-4o at that time, which has since been updated." That is not a receipt you can defend.

kolm is a CLI and SDK that compiles a frontier model + a task + an evaluator + a compliance pack into a single signed .kolm artifact. The artifact is a zip with a canonical manifest, a CID (cidv1:sha256:<hex>, the SHA-256 of the canonical JSON), a signature, and an HMAC-SHA256 receipt chain. Every run emits a receipt over (cid, input_sha, output_sha, ts). Months later, anyone with the .kolm and the receipt can replay and verify.
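A minimal sketch of what that math could look like in pure-stdlib Python. The function names, the canonical-JSON settings, and the field order inside the HMAC are illustrative assumptions, not the RS-1 spec:

```python
import hashlib
import hmac
import json

def cid_of(manifest: dict) -> str:
    # Canonical JSON assumed here: sorted keys, compact separators.
    canonical = json.dumps(manifest, sort_keys=True, separators=(",", ":")).encode()
    return "cidv1:sha256:" + hashlib.sha256(canonical).hexdigest()

def receipt(key: bytes, prev: str, cid: str, input_sha: str, output_sha: str, ts: str) -> str:
    # HMAC-SHA256 over the previous receipt plus (cid, input_sha, output_sha, ts);
    # including `prev` is what chains each run to the one before it.
    msg = "\n".join([prev, cid, input_sha, output_sha, ts]).encode()
    return hmac.new(key, msg, hashlib.sha256).hexdigest()

# Verification months later is recomputation plus a constant-time compare:
# hmac.compare_digest(receipt(key, prev, cid, in_sha, out_sha, ts), stored_receipt)
```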

A few engineering choices that matter:

- The verifier SDK is pure-stdlib. No jszip, no third-party deps. The thing whose job is verifying the supply chain has no supply chain.
- 18 backends ship with the CLI: 6 local (CPU, CUDA, MPS, MLX, ROCm, DirectML), 8 remote (Modal, RunPod, Together, Vast, Lambda, Replicate, fal, SSH), 4 serving engines (vLLM, SGLang, TGI, TRT-LLM). The CLI picks the cheapest backend that meets your air-gap constraint.
- K-score is a 5-term weighted score (accuracy 0.40, safety 0.15, latency 0.15, cost 0.15, verifiability 0.15). The compile fails if you are below your k_min floor.
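The cheapest-backend rule above can be sketched in a few lines. The catalog shape, the prices, and the single air-gap flag are invented for illustration; the real CLI weighs more than price:

```python
from dataclasses import dataclass

@dataclass
class Backend:
    name: str
    usd_per_hour: float
    air_gapped: bool  # runs with no outbound network

# Illustrative catalog: backend names from the post, prices made up.
CATALOG = [
    Backend("cpu", 0.0, True),
    Backend("cuda", 0.0, True),
    Backend("runpod", 0.79, False),
    Backend("modal", 1.10, False),
]

def pick(require_air_gap: bool) -> Backend:
    # Filter to backends that satisfy the constraint, then take the cheapest.
    candidates = [b for b in CATALOG if b.air_gapped or not require_air_gap]
    return min(candidates, key=lambda b: b.usd_per_hour)
```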
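And the K-score floor check, using the five published weights. The assumption here (not stated in the post) is that each term is normalized to [0, 1] before weighting:

```python
WEIGHTS = {
    "accuracy": 0.40,
    "safety": 0.15,
    "latency": 0.15,
    "cost": 0.15,
    "verifiability": 0.15,
}

def k_score(terms: dict[str, float]) -> float:
    # Weighted sum of the five terms; weights total 1.0.
    return sum(WEIGHTS[name] * terms[name] for name in WEIGHTS)

def check_floor(terms: dict[str, float], k_min: float) -> float:
    # Mirrors the compile-time behavior: abort when below the k_min floor.
    score = k_score(terms)
    if score < k_min:
        raise SystemExit(f"compile failed: K-score {score:.3f} < k_min {k_min}")
    return score
```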

Numbers from our own runs: 7.42x faster local p50 vs the same task hitting OpenAI's API, 11.6x cheaper at sustained throughput. Both replayable from receipts.

Two things we'd love feedback on:

1. The RS-1 spec at kolm.ai/spec/rs-1. We are weeks from freezing v1.0 and we want the receipt-chain shape to be something other tools can adopt.

2. The compliance-pack model (HIPAA, SR 11-7, NIST AI RMF, EU AI Act). Are we naming and slicing these the way your auditor would want? kolm.ai/compliance.

Free tier is 50 compiles a month. founders@kolm.ai is the direct line.

After posting.

Five things in the first 90 minutes after the post goes live. Set a timer.

  • Reply to the first 10 comments within 1 hour. HN's algorithm rewards founder engagement in the first window. Short, specific replies. Acknowledge the criticism, give the number, link the page.
  • Share the permalink on X and LinkedIn. Once it's on the front page, drop the X thread (/community/twitter-thread) with the HN link in tweet 7.
  • Pin the post URL in /community. Update public/community.html card 03 with the actual permalink. Redeploy.
  • Archive the post permalink to /changelog. Add a one-line entry on the date with the HN URL. This is the canonical "what shipped, when" record.
  • Measure conversion via /signup?ref=hn. The signup form already reads the ref param. Check the dashboard at +1h, +6h, +24h. Capture the top three referring HN comment threads for the launch retro.

Tone notes.

HN sniffs out marketing. Three rules: lead with what we built and why, not what it could do; give one specific number with the methodology link; close with two questions that invite real engineering critique. No emojis, no hyperbole, no "we are excited to share" framing.