@tjamescouch/gtui
v0.2.7
Terminal UI for gro, the provider-agnostic LLM runtime
gtui 👾
gtui is the high-performance Terminal User Interface (TUI) for gro. It provides a dedicated, low-latency environment for managing autonomous agents, monitoring tool execution, and visualizing the "Virtual Memory" systems of your LLM runtime.
Visualizing the context: Persistent agents in a dark-mode world.
Features
- Real-time Stream Monitoring: View live token generation and tool calls with the high-fidelity logging you expect from `gro --verbose`.
- Dynamic Context Visualization: Track active memory pages, importance weights, and "Virtual Memory" state in a clean, terminal-native layout.
- Provider Switching: Fast-toggle between Anthropic, OpenAI, xAI, and local models.
- Resource Intensity Control: Visual feedback for the `@@thinking(0.0-1.0)@@` lever, so you can see exactly how much compute your agent is burning.
- MCP Integration: A dedicated panel for connected MCP servers and available tools.
Installation
```bash
# Clone the repo
git clone https://github.com/tjamescouch/gtui.git
cd gtui

# Install dependencies
npm install

# Link for global usage
npm link
```
Usage
Launch gtui to wrap your existing gro sessions:
```bash
gtui --model grok-fast
```
Keybindings
- `Ctrl + T`: Toggle thinking intensity
- `Ctrl + M`: Cycle memory modes (Virtual/HNSW/Fragment)
- `Ctrl + L`: Clear active viewport
- `Tab`: Switch between chat and tool-logs
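A binding table like the one above is typically wired up as a small dispatch map inside the TUI. The sketch below is illustrative only: the key names follow the `C-x` convention used by libraries such as blessed, and the action names are placeholders, not gtui's actual internals.

```javascript
// Illustrative key-dispatch table for the bindings above.
// Key names follow the blessed-style "C-x" convention; the action
// return values are placeholders, not gtui's real handlers.
const keymap = {
  "C-t": () => "toggle-thinking",
  "C-m": () => "cycle-memory-mode",
  "C-l": () => "clear-viewport",
  tab: () => "switch-panel",
};

// Look up and run the action bound to a key name; ignore unbound keys.
function handleKey(name) {
  const action = keymap[name];
  return action ? action() : null;
}
```

A rendering library would call `handleKey` from its keypress event, leaving unbound keys to fall through to the focused widget.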
Architecture
gtui is built to be as efficient as the runtime it manages. It utilizes:
- Blessed / Ink: For the terminal rendering engine.
- Unix Domain Sockets / IPC: For low-latency communication with the `gro` background process.
- Metal Performance Shaders (MPS): (Experimental) Local visualization hooks for Mac Studio.
Security
gtui spawns gro as a subprocess with --bash enabled. The LLM has shell access, file read/write, and any configured MCP tools. gro's built-in tool approval prompts do not apply in this mode (no TTY — gro runs as a piped subprocess).
Recommendations:
- Do not run gtui with MCP servers that provide network access unless you trust the model.
- Use in containerized environments for untrusted workloads.
- Consider running without `--bash` by modifying the spawn arguments.
License
MIT
