The biggest model on your phone, honestly.
Not a quantized 70B. A compiled personal model that behaves like a Hermes-class model on the tasks you compiled it for, because it was distilled on exactly those tasks, drafted by recipes that cover its structured outputs, and grounded in your own corpus. 3B base + LoRA + recipe pack + sqlite-vec, all running offline.
kolm.app is in development.
The CLI ships first; the cloud ships second; the boxed mobile app is the consumer endpoint.
Below is what it will look like when it lands.
Three Specialists out of the box.
Personal Assistant
Schedule, recall, summarize, draft. Grounded in your photos, voice memos, calendar, and inbox via on-device sqlite-vec.
Email Reply
Drafts replies in your voice. Distilled from your sent folder. Never auto-sends. Receipts visible on every draft.
Daily Recap
One paragraph at end of day, grounded in everything captured. Optional voice readout. Stays on device.
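All three Specialists lean on the same on-device retrieval step. sqlite-vec does this natively with a vec0 virtual table and a MATCH query; the stdlib-only sketch below approximates the idea with a cosine-distance function registered into plain SQLite. The table, column names, and toy embeddings are illustrative, not the shipped schema.

```python
import sqlite3
import json
import math

# Stdlib stand-in for the sqlite-vec lookup: rank stored memo
# embeddings by cosine distance to a query embedding.
def cosine_distance(a_json: str, b_json: str) -> float:
    a, b = json.loads(a_json), json.loads(b_json)
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (na * nb)

db = sqlite3.connect(":memory:")
db.create_function("cosine_distance", 2, cosine_distance)
db.execute("CREATE TABLE memos (id INTEGER PRIMARY KEY, text TEXT, embedding TEXT)")

# Toy 3-dim embeddings; the real pipeline would store model embeddings.
db.executemany(
    "INSERT INTO memos (text, embedding) VALUES (?, ?)",
    [
        ("dentist appointment Tuesday", json.dumps([0.9, 0.1, 0.0])),
        ("quarterly report draft", json.dumps([0.0, 0.2, 0.9])),
        ("call mom about flights", json.dumps([0.8, 0.3, 0.1])),
    ],
)

query = json.dumps([0.85, 0.2, 0.05])  # pretend: embedding of "what's on today?"
rows = db.execute(
    "SELECT text FROM memos ORDER BY cosine_distance(embedding, ?) LIMIT 2",
    (query,),
).fetchall()
top = [r[0] for r in rows]
print(top)  # the two calendar-ish memos rank ahead of the work doc
```

The grounding step is just this: nearest neighbors from your own corpus, fetched locally, fed into the model as context.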
Or compile your own from your phone.
The "Compile a new one" button drives the same cloud pipeline as the CLI, using your on-device corpus (photos, voice memos, screen captures) as the data source. The resulting .kolm pack downloads back onto the phone and runs locally, with no further API calls.
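Since the CLI ships first, the compile flow might look something like the sketch below. The command name comes from the product; every flag here is hypothetical, written only to show the shape of the pipeline (corpus in, cloud distillation, .kolm pack out).

```shell
# Hypothetical invocation; flags are illustrative, not the shipped CLI.
kolm compile \
  --corpus ~/kolm/corpus \      # photos, voice memos, screen captures
  --recipe email-reply \        # recipe pack covering structured outputs
  --out my-assistant.kolm       # pack that downloads back and runs offline
```

Whatever the final flags are, the contract is the same one the phone button uses: your corpus goes up once, a compiled pack comes back, and everything after that runs on device.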