v0.1.0 · alpha · local-only

A local memory palace. For everything you scattered.

SuperTon ingests your notes, PDFs, and AI-chat transcripts into a verbatim store, then a tiny local model answers your questions — grounded only in what you've fed it. Nothing leaves your laptop.


Two commands. No cloud account. No telemetry.

01 · Install SuperTon
$ uv tool install 'git+https://github.com/therahul-yo/Superton.git'
Requires uv. Or clone the repo and run uv run superton.
02 · Initialize the palace
$ superton init
Picks a model profile, installs Ollama if it's missing, and builds Miniton.
+Optional · install Ollama yourself
$ brew install ollama

Ingest. Ask. Cite.


Three opinions, kept honest.

I.

Verbatim beats summarized.

Drawers store the original text. A summary loses the half-remembered phrase you'll actually search for in six months.
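A minimal sketch of what a verbatim drawer could look like (the names here are illustrative, not SuperTon's actual API): originals are kept byte-for-byte, and search runs against the raw text rather than a lossy summary.

```python
# Hypothetical sketch of a verbatim "drawer" store, not SuperTon's real API:
# originals are stored exactly as ingested, and search matches the raw text.

class Drawer:
    def __init__(self):
        self._docs = {}  # source path -> original text, kept verbatim

    def ingest(self, source, text):
        # Store the exact text; never summarize or normalize it away.
        self._docs[source] = text

    def search(self, phrase):
        # Return every source whose original text contains the phrase.
        needle = phrase.lower()
        return [src for src, text in self._docs.items()
                if needle in text.lower()]

drawer = Drawer()
drawer.ingest("notes/ideas.md", "that weirdly specific 'lobster clock' metaphor")
print(drawer.search("lobster clock"))  # -> ['notes/ideas.md']
```

Because nothing is rewritten at ingest time, the exact half-remembered phrase is still there to be found.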

II.

Refuse over confabulate.

When the palace is silent, Miniton says so — and suggests the closest files. No invented dates, names, or numbers.
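The refusal behavior can be sketched as a simple threshold check (hypothetical names and scoring, not Miniton's implementation): if the best match falls below a confidence floor, return a refusal plus the nearest files instead of guessing.

```python
# Hypothetical sketch of "refuse over confabulate", not Miniton's real logic:
# when no stored text scores above a threshold, say so and suggest the
# closest files rather than inventing an answer.

from difflib import SequenceMatcher

def answer(question, docs, threshold=0.35):
    # Score every stored file against the question, best first.
    scored = sorted(
        ((SequenceMatcher(None, question.lower(), text.lower()).ratio(), src)
         for src, text in docs.items()),
        reverse=True,
    )
    best_score, best_src = scored[0]
    if best_score < threshold:
        nearest = [src for _, src in scored[:3]]
        return f"Not in the palace. Closest files: {', '.join(nearest)}"
    return f"Grounded in {best_src}: {docs[best_src]}"
```

The point of the sketch is the branch, not the scoring: below the floor, the only honest outputs are "not here" and pointers to where to look next.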

III.

Local until proven otherwise.

No API keys. No telemetry. No SaaS dependency. The palace lives on your laptop; the model lives in your Ollama.
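Concretely, "the model lives in your Ollama" means generation is an HTTP call to the local daemon. The endpoint and payload shape below are Ollama's documented /api/generate interface; the model name "miniton" is an assumption for illustration.

```python
# Local-only generation against the Ollama daemon on localhost.
# Endpoint and payload follow Ollama's /api/generate API; the model
# name "miniton" is assumed, not confirmed by the project.

import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # local daemon, no cloud

def build_request(prompt, model="miniton"):
    # stream=False asks Ollama for one JSON object instead of a token stream.
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

def generate(prompt):
    # Requires a running Ollama daemon; nothing here leaves the machine.
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.loads(resp.read())["response"]
```

No API key appears anywhere in the request: the only credential is having the daemon running on your own machine.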

drawers in, nothing forgotten.
github · docs · apache-2.0