A private Context OS that runs ambiently in the background, distilling your screen and audio into a structured knowledge graph of facts, entities, and decisions — accessible from MCP, a web UI, and a screen-recording-invisible HUD overlay, with peer-to-peer sharing between users.
npx @geravant/sinain@latest start
One knowledge graph, four uses. Sinain extracts atomic facts, entities, and decisions with an LLM — then integrates them through deterministic graph operations.
<private> tag stripping and auto-redaction of credentials and tokens happen before anything leaves your machine.
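The pre-upload scrubbing step can be pictured as two filters run in sequence: drop anything inside a <private> span, then mask credential-shaped strings. This is a minimal sketch of the idea only — the regex patterns and tag handling here are illustrative assumptions, not Sinain's actual pipeline.

```python
import re

# Hypothetical sketch of pre-upload scrubbing: strip <private>...</private>
# spans, then mask strings that look like credentials. The patterns below
# are illustrative assumptions, not Sinain's real redaction rules.

PRIVATE_TAG = re.compile(r"<private>.*?</private>", re.DOTALL)
CREDENTIAL = re.compile(
    r"(sk-[A-Za-z0-9]{20,}"      # OpenAI-style API keys
    r"|ghp_[A-Za-z0-9]{36}"      # GitHub personal access tokens
    r"|AKIA[0-9A-Z]{16})"        # AWS access key IDs
)

def scrub(text: str) -> str:
    """Remove <private> spans, then mask credential-shaped tokens."""
    text = PRIVATE_TAG.sub("", text)
    return CREDENTIAL.sub("[REDACTED]", text)

print(scrub("deploy key ghp_" + "a" * 36 + " <private>my salary</private> done"))
```

Order matters: stripping private spans first means a credential inside a <private> tag is removed outright rather than merely masked.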
Atomic claims; people, projects, and topics; what was chosen and why. An LLM extracts — a non-LLM pipeline integrates. Moving the integration step off the LLM was the single biggest gain: recall on our internal Meeting Memory benchmark rose from 37.9% to 62.7%, and IPR on LongMemEval (a public ICLR 2025 benchmark) reached 82.8%.
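The extract/integrate split can be sketched roughly like this: the LLM's only job is to emit candidate facts, and merging them into the graph is plain deterministic code — dedupe, then index by entity. A toy sketch under assumed data shapes; the field names and keying are not Sinain's actual schema.

```python
from dataclasses import dataclass, field

# Toy sketch of deterministic integration: the LLM (not shown) emits
# candidate facts; this non-LLM step dedupes them and indexes entities.
# Shapes here are assumptions, not Sinain's actual schema.

@dataclass(frozen=True)
class Fact:
    subject: str
    predicate: str
    obj: str

@dataclass
class Graph:
    facts: set = field(default_factory=set)
    entities: dict = field(default_factory=dict)  # entity name -> facts touching it

    def integrate(self, candidates):
        """Deterministically merge candidates: exact-duplicate drop + entity indexing."""
        for f in candidates:
            if f in self.facts:   # idempotent: re-extracting the same fact is a no-op
                continue
            self.facts.add(f)
            for name in (f.subject, f.obj):
                self.entities.setdefault(name, []).append(f)

g = Graph()
g.integrate([Fact("Ana", "leads", "Project X"), Fact("Ana", "leads", "Project X")])
print(len(g.facts))  # 1 — duplicates collapse without any model call
```

Because integration is deterministic, the same extraction output always yields the same graph — which is what makes the step cheap to re-run and easy to test.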
Agents read it via MCP. You browse it at localhost:9500/knowledge/ui/.
You glance at it through a HUD that is invisible to screen capture.
Send a slice of your graph to another person. Their machine pulls from yours directly over WebRTC; the data itself never touches a server. Their AI agents inherit the context via the same MCP.
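What "a slice of your graph" means can be illustrated as subgraph selection: pick every fact that touches a chosen entity and serialize it for transfer. The WebRTC transport itself is Sinain's; this sketch covers only slice selection, and the tuple shape is an assumption.

```python
import json

# Toy sketch of slicing a knowledge graph for peer-to-peer sharing:
# select the facts that mention a chosen entity, then serialize them.
# The (subject, predicate, object) tuple shape is an assumption.

def slice_for_sharing(facts, entity):
    """Pick every fact whose subject or object is the given entity."""
    picked = [f for f in facts if entity in (f[0], f[2])]
    return json.dumps(picked)

facts = [("Ana", "leads", "Project X"), ("Bo", "owns", "Budget")]
payload = slice_for_sharing(facts, "Ana")
```

The receiving peer integrates the payload through the same deterministic pipeline as locally extracted facts, so shared context and captured context end up in one graph.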
Sinain feeds the same screen and audio context to any MCP-compatible agent. Switch on the fly — no restart, no context loss.
By default, sinain uses cloud APIs (OpenRouter) for transcription and analysis. Need tighter control? Flip the mode.
The HUD never appears in screenshots, recordings, or screen shares.
off — all data flows freely; maximum insight quality.
standard — auto-redacts credentials before cloud APIs (wizard default).
strict — only summaries leave your machine; no raw text sent to cloud.
paranoid — Ollama + whisper.cpp. Zero network calls at runtime.
On first run, sinain launches an interactive setup wizard, auto-downloads the assets it needs (~112 MB total), and starts every service for you.
Pick transcription backend, paste an API key (or skip for fully local), choose your agent, set privacy mode.
Overlay app (~17 MB), sck-capture binary (~5 MB), embedding model (~90 MB), Python deps — all cached locally.
sinain-core, sense_client, overlay, agent — running. HUD appears, screen and audio capture begins.
Re-run the wizard with npx @geravant/sinain@latest start --setup. Stop, status, no-sense, and headless flags are also available.
sinain_* tools, one MCP install. The wizard detects which MCP agents you have installed and registers sinain for each one you select.
npx @geravant/sinain@latest mcp install
Re-run any time to update. Registers sinain with Claude Code, Claude Desktop, Cursor, Codex, Goose, and Junie.
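For agents that use a JSON MCP config (e.g. Claude Desktop's claude_desktop_config.json), registration typically amounts to an entry like the one below. The exact command and args Sinain registers are assumptions — a sketch of the shape, not the literal entry the installer writes.

```json
{
  "mcpServers": {
    "sinain": {
      "command": "npx",
      "args": ["@geravant/sinain@latest", "mcp", "serve"]
    }
  }
}
```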
Everything that didn't fit on this page. Each link goes to the canonical doc in the repo.
One install. macOS 12.3+. MIT-licensed. Private by default.