@tomdra01/kntxt v0.2.1
# Kntxt
A peer-to-peer terminal app for paired AI Agent coding sessions with AI summaries.
## The Problem
When two developers code on the same project from different laptops — especially using AI agents like Cursor or Claude Code — things get messy fast.
Your AI agent has no idea what your partner's AI agent is doing until someone finally pushes to Git. This context gap means agents routinely hallucinate or overwrite each other's work, because they're missing the why behind the changes happening on the other screen.
## What is Kntxt?
Kntxt bridges that gap during paired AI agent coding sessions. It gives your AI a live feed of what your partner is doing in real time — sharing the reasoning and intent behind every change, not just the diff.
It's not version control. It's about making sure your AI understands your partner's AI, so they can actually work together instead of against each other.
## How It Works
```
You save a file
 └─► Local AI reads the diff and generates intent ("Refactoring auth flow")
      └─► Intent is broadcast over an encrypted P2P tunnel
           └─► Partner's AI receives the context update in real time
                └─► Partner's trackpad taps to signal presence
```

## Key Features

### Real-Time Intent Broadcasting
A parser reads the "thoughts" of AI tools directly and broadcasts faithful summaries of what your partner is trying to achieve — so your own AI stays in the loop without waiting for a commit.
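The save-to-broadcast flow can be sketched in Rust. This is a minimal illustration, not Kntxt's actual code: the names (`IntentUpdate`, `summarize_diff`, `broadcast`) are hypothetical, and the real app uses a local LLM and an Iroh tunnel where this sketch uses stubs.

```rust
// Hypothetical sketch of the save -> summarize -> broadcast pipeline.
// None of these names come from Kntxt's source; they only illustrate the flow.

/// Context update sent to the peer after each file save.
#[derive(Debug, Clone, PartialEq)]
struct IntentUpdate {
    file: String,
    intent: String, // e.g. "Refactoring auth flow"
}

/// Stand-in for the local LLM: turn a diff into a one-line intent.
/// The real app feeds the diff to a local GGUF model instead.
fn summarize_diff(file: &str, diff: &str) -> IntentUpdate {
    let verb = if diff.contains("fn ") { "Changing functions in" } else { "Editing" };
    IntentUpdate { file: file.to_string(), intent: format!("{verb} {file}") }
}

/// Stand-in for the encrypted P2P tunnel: here we just collect updates.
fn broadcast(update: IntentUpdate, outbox: &mut Vec<IntentUpdate>) {
    outbox.push(update); // real app: send over the Iroh connection
}

fn main() {
    let mut outbox = Vec::new();
    // A save event arrives from the file watcher...
    let update = summarize_diff("src/auth.rs", "fn login() { /* ... */ }");
    // ...and is broadcast to the partner's side.
    broadcast(update, &mut outbox);
    println!("{}", outbox[0].intent);
}
```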
### Local AI Brain
A specialized LLM runs entirely on your machine. It analyzes live code diffs to explain the why behind every save. Your data stays private; your context stays deep.
### Knowledge Graph Radar
A live, force-directed visual map showing how your work and your partner's work are converging — a bird's-eye view of the entire session as it evolves.
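For intuition, a force-directed layout works by letting all nodes repel each other while edges pull their endpoints together like springs. The toy iteration below illustrates the idea only; it is not Kntxt's renderer, and the constants are arbitrary.

```rust
// Toy force-directed layout step: not Kntxt's actual radar code.
#[derive(Clone, Copy, Debug)]
struct Node { x: f64, y: f64 }

/// One iteration: all pairs repel (inverse-square-ish),
/// connected pairs attract like springs, then positions integrate.
fn step(nodes: &mut [Node], edges: &[(usize, usize)], dt: f64) {
    let n = nodes.len();
    let (mut fx, mut fy) = (vec![0.0; n], vec![0.0; n]);
    // Pairwise repulsion keeps unrelated nodes apart.
    for i in 0..n {
        for j in 0..n {
            if i == j { continue; }
            let dx = nodes[i].x - nodes[j].x;
            let dy = nodes[i].y - nodes[j].y;
            let d2 = (dx * dx + dy * dy).max(1e-6);
            fx[i] += dx / d2;
            fy[i] += dy / d2;
        }
    }
    // Spring attraction pulls connected work items together.
    for &(a, b) in edges {
        let dx = nodes[b].x - nodes[a].x;
        let dy = nodes[b].y - nodes[a].y;
        fx[a] += 0.1 * dx; fy[a] += 0.1 * dy;
        fx[b] -= 0.1 * dx; fy[b] -= 0.1 * dy;
    }
    for i in 0..n {
        nodes[i].x += dt * fx[i];
        nodes[i].y += dt * fy[i];
    }
}

fn main() {
    // Two connected nodes far apart drift toward each other.
    let mut nodes = vec![Node { x: 0.0, y: 0.0 }, Node { x: 10.0, y: 0.0 }];
    step(&mut nodes, &[(0, 1)], 0.5);
    println!("{:?}", nodes);
}
```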
### Zero-Server P2P
No accounts. No cloud. No lag. Built with Iroh to create a direct, encrypted tunnel between laptops using just a 4-digit session code.
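The 4-digit code is just a human-friendly rendezvous token; encryption and peer discovery are Iroh's job. A hypothetical sketch of generating and validating such a code (not Kntxt's actual scheme, and the time-based entropy is a toy stand-in for a real RNG):

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Generate a 4-digit session code (0000-9999).
/// Toy entropy source: real code would use a proper RNG.
fn new_session_code() -> String {
    let nanos = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("clock before 1970")
        .subsec_nanos();
    format!("{:04}", nanos % 10_000)
}

/// A valid code is exactly four ASCII digits.
fn is_valid_code(code: &str) -> bool {
    code.len() == 4 && code.chars().all(|c| c.is_ascii_digit())
}

fn main() {
    let code = new_session_code();
    assert!(is_valid_code(&code));
    println!("share this code with your peer: {code}");
}
```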
### Haptic Presence
macOS trackpad taps whenever your partner saves a file — you feel the collaboration without lifting your eyes from the screen.
## Quick Start
```shell
npm install -g @tomdra01/kntxt
kntxt
```

macOS only for haptic feedback. Compiles on Linux/Windows, but trackpad taps are no-ops. First run downloads the AI model (~300 MB) to `~/.cache/huggingface/` — takes about a minute.
## Connecting in 4 Steps
| Step | Host | Peer |
| ---- | ---- | ---- |
| 1 | Pick a character | Pick a character |
| 2 | Choose Host → note the 4-digit code | Choose Join |
| 3 | Share code out-of-band (Slack, voice, anything) | Enter the code |
| 4 | Wait for the connection animation | — |
That's it. Type to chat. Save a file — your peer sees the AI summary and feels a tap.
## Build from Source
```shell
git clone https://github.com/mirekondro/kntxt.git
cd kntxt
cargo run --release   # requires Rust nightly (edition 2024)
```

## Configuration
All configuration is via environment variables — no config file needed.
| Variable | Default | Description |
| -------- | ------- | ----------- |
| KNTXT_WATCH_EXTS | rs,md,ts,py,js,toml,go,json | File types to watch |
| KNTXT_WATCH_PATH | . | Directory to watch |
| KNTXT_MODEL_REPO | bartowski/Qwen2.5-0.5B-Instruct-GGUF | HuggingFace model repo |
| KNTXT_MODEL_FILE | Qwen2.5-0.5B-Instruct-Q4_K_M.gguf | GGUF model filename |
| KNTXT_DEBUG | (unset) | Show debug log panel |
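Reading these variables presumably follows the usual env-with-default pattern. A sketch, assuming nothing about Kntxt's internals (the function names here are illustrative, not from its source):

```rust
use std::env;

/// Parse a comma-separated extension list, trimming blanks.
/// Illustrative only: these names are not from Kntxt's source.
fn parse_exts(raw: &str) -> Vec<String> {
    raw.split(',')
        .map(|s| s.trim().to_string())
        .filter(|s| !s.is_empty())
        .collect()
}

/// KNTXT_WATCH_EXTS, falling back to the documented default.
fn watch_exts() -> Vec<String> {
    let raw = env::var("KNTXT_WATCH_EXTS")
        .unwrap_or_else(|_| "rs,md,ts,py,js,toml,go,json".to_string());
    parse_exts(&raw)
}

fn main() {
    println!("watching: {:?}", watch_exts());
}
```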
Example — watch only TypeScript with a larger model:
```shell
KNTXT_WATCH_EXTS=ts,tsx \
KNTXT_MODEL_REPO=your-org/your-model \
KNTXT_MODEL_FILE=model-q4_k_m.gguf \
kntxt
```

## Keyboard Shortcuts
| Key | Action |
|-----|--------|
| Enter | Send message / confirm selection |
| ↑ / ↓ | Scroll chat |
| PgUp / PgDn | Scroll chat by 10 lines |
| Ctrl+P | Ping peer (flashes on radar) |
| Esc | Go back |
| Ctrl+C | Quit |
In character selection: ← / → to cycle, Enter to confirm.
From main menu: 1 host · 2 join · 3 character · 4 quit.
## Tech Stack
- Rust — core runtime and systems layer
- Ratatui — terminal UI framework
- Iroh — peer-to-peer networking (encrypted, serverless)
- llama.cpp / GGUF — local LLM inference
## Docs
| Document | What's inside |
|----------|---------------|
| docs/architecture.md | How the pieces fit together |
| docs/modules.md | Per-module technical reference |
| docs/configuration.md | Full configuration & extension guide |
| docs/story.md | What Kntxt is (plain language) |
