zarz v0.5.1-alpha
ZarzCLI
ZarzCLI is a blazingly fast AI coding assistant for your terminal, built with Rust for maximum performance. It brings the power of Claude, GPT, and GLM models directly to your command line with intelligent context awareness and autonomous tool execution.
Features
Core Capabilities
- Interactive Chat - Real-time streaming responses with multiple AI models
- Multi-Provider Support - Claude (Anthropic), GPT (OpenAI), and GLM (Z.AI)
- Built-In Tools - read_file, list_dir, grep_files, and apply_patch run natively
- File Operations - Direct file editing, creation, and management
- Smart Context - Automatic symbol search and relevant file detection
- MCP Support - Model Context Protocol integration for extended capabilities
- ChatGPT OAuth (Codex backend) - /login signs into ChatGPT Plus/Pro, tokens auto-refresh, and GPT‑5 presets call the same Codex endpoint as OpenAI’s CLI
- Auto Update - Automatic update checks and notifications for new versions
- Cross-Platform - Works seamlessly on Windows, Linux, and macOS
Intelligent Context Understanding
ZarzCLI v0.3.4+ includes autonomous bash tool execution, allowing AI models to:
- Search files: find . -name "*.rs" or rg "pattern"
- Read contents: cat src/main.rs or head -n 20 file.py
- Grep code: grep -r "function_name" src/
- Navigate structure: ls -la src/ or tree -L 2
- Check git: git log --oneline -10 or git diff
User Experience
- Status Line - Shows current mode and notifications
- Double Ctrl+C - Confirmation before exit (prevents accidental exits)
- Colored Diff Display - Beautiful file change visualization with context
- Exploration Logs - File reads, directory listings, and searches are summarized concisely (no more full file dumps unless requested)
- Persistent Sessions - Resume previous conversations anytime
Built-In Tools
ZarzCLI now ships with a built-in tool set:
| Tool | Description |
|------|-------------|
| read_file | Reads files with optional line slices; stdout just shows a summary |
| list_dir | Returns file/dir counts with a short preview instead of dumping everything |
| grep_files | Greps inside a file (simple substring match) |
| apply_patch | Applies Zarz-style *** Begin Patch diffs directly on disk |
These tools run natively in Rust, so the terminal output is clean and the model still receives full context in the background.
No more artificial tool-call limits – ZarzCLI lets the agent keep digging until it’s satisfied. The old “Stopping tool execution after 3 calls” guardrails have been removed.
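The table above notes that grep_files does a simple substring match rather than full regex search. A minimal sketch of that behavior in Rust (the function name and signature here are illustrative, not ZarzCLI's actual API):

```rust
/// Return (1-based line number, line) pairs where the line contains `pattern`.
/// Illustrative sketch of a substring-match grep; not ZarzCLI's real tool code.
fn grep_file(contents: &str, pattern: &str) -> Vec<(usize, String)> {
    contents
        .lines()
        .enumerate()
        .filter(|(_, line)| line.contains(pattern))
        .map(|(i, line)| (i + 1, line.to_string()))
        .collect()
}

fn main() {
    let src = "fn main() {\n    println!(\"hi\");\n}\n";
    // Matches only the line containing "println"
    println!("{:?}", grep_file(src, "println"));
}
```

Because the match is a plain `contains`, patterns like `*.rs` are treated literally; the model falls back to the bash tool when it needs real regex search.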
Installation
Via NPM (Recommended)
npm install -g zarz
From Source
git clone https://github.com/zarzet/ZarzCLI.git
cd ZarzCLI
cargo build --release
Updating
ZarzCLI will automatically check for updates and notify you when a new version is available. To update manually:
npm update -g zarz
Quick Start
First Run Setup
On first run, you'll be prompted to enter your API keys interactively:
zarz
Or set manually via environment variables:
# For Anthropic Claude
export ANTHROPIC_API_KEY=sk-ant-...
# For OpenAI GPT
export OPENAI_API_KEY=sk-...
# For GLM (Z.AI)
export GLM_API_KEY=...
zarz
Your API keys are securely stored in ~/.zarz/config.toml.
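For reference, a sketch of what that file might contain (the key names here are illustrative assumptions; check your own ~/.zarz/config.toml for the actual schema):

```toml
# ~/.zarz/config.toml -- illustrative key names, not a guaranteed schema
anthropic_api_key = "sk-ant-..."
openai_api_key = "sk-..."
glm_api_key = "..."
```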
Basic Usage
# Start interactive chat (default)
zarz
# Quick one-shot question
zarz --message "fix this bug"
# Use specific model
zarz --model claude-sonnet-4-5-20250929
# Manage configuration
zarz config --show # Show current config
zarz config --reset # Reconfigure API keys
zarz config --login-chatgpt  # Sign in via ChatGPT OAuth to fetch an OpenAI key
ChatGPT OAuth (Codex-compatible)
When you run zarz config --login-chatgpt or /login → “Sign in with ChatGPT”, ZarzCLI mirrors the official Codex CLI flow:
- OpenAI’s OAuth screen appears (PKCE + originator=zarz_cli).
- After you approve, ZarzCLI shows the “Signed in to ZarzCLI” success page.
- The CLI stores the returned access_token, refresh_token, id_token, plus project_id, organization_id, and chatgpt_account_id in ~/.zarz/config.toml.
- Before every run, ZarzCLI automatically refreshes the token if it’s near expiry and exports the refreshed credentials.
All GPT‑5 presets then hit the ChatGPT Codex backend (OpenAI-Beta: responses=experimental, originator: codex_cli_rs) with the official Codex instructions so behavior matches OpenAI’s CLI exactly.
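The pre-run refresh described above boils down to an expiry-margin check on the stored token. A hedged sketch in Rust (the field names and the 5-minute margin are assumptions for illustration, not ZarzCLI's actual implementation):

```rust
/// Decide whether an OAuth access token should be refreshed before use.
/// `expires_at` and `now` are Unix timestamps in seconds; `margin_secs` is a
/// safety window so the token never expires mid-request (value is assumed).
fn needs_refresh(expires_at: u64, now: u64, margin_secs: u64) -> bool {
    // saturating_sub avoids underflow when the token has already expired
    expires_at.saturating_sub(now) <= margin_secs
}

fn main() {
    let now = 1_700_000_000;
    // Expires in 2 minutes with a 5-minute margin: refresh first
    println!("{}", needs_refresh(now + 120, now, 300));
    // Expires in an hour: keep using it
    println!("{}", needs_refresh(now + 3600, now, 300));
}
```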
Available Commands
Once inside the interactive chat:
| Command | Description |
|---------|-------------|
| /help | Show all available commands |
| /apply | Apply pending file changes |
| /diff | Show pending changes with colored diff |
| /undo | Clear pending changes |
| /edit <file> | Load a file for editing |
| /search <symbol> | Search for a symbol in codebase |
| /context <query> | Find relevant files for a query |
| /files | List currently loaded files |
| /model <name> | Switch to a different AI model |
| /login | Open auth wizard (API keys or ChatGPT OAuth) |
| /mcp | Show MCP servers and available tools |
| /resume | Resume a previous chat session |
| /clear | Clear conversation history |
| /exit | Exit the session |
Supported AI Models
Anthropic Claude
Best for coding tasks and autonomous agents:
- claude-sonnet-4-5-20250929 (Latest, most capable)
- claude-haiku-4-5 (Fast, cost-effective)
- claude-opus-4-1 (Most powerful)
OpenAI GPT (ChatGPT OAuth)
Run zarz config --login-chatgpt to fetch an OpenAI key, then choose any of these GPT‑5 variants optimized for OAuth access:
- gpt-5-codex – Default coding agent
- gpt-5-codex-low – Lower reasoning effort
- gpt-5-codex-medium – Balanced reasoning depth
- gpt-5-codex-high – High reasoning effort with detailed summaries
- gpt-5-minimal – Minimal reasoning, terse responses
- gpt-5-low – Low-effort general GPT-5
- gpt-5-medium – Balanced GPT-5 experience
- gpt-5-high – High reasoning-effort GPT-5
- gpt-5-mini – Lightweight GPT-5 for quick tasks
- gpt-5-nano – Fastest GPT-5 tier with minimal reasoning
When you run /model gpt-5-*, ZarzCLI now prompts you to pick a reasoning effort (Auto, Minimal, Low, Medium, High). The choice is saved to ~/.zarz/config.toml and applied to every Responses API call along with text.verbosity = "medium" and include = ["reasoning.encrypted_content"], matching the Codex OAuth defaults.
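Resolving a reasoning effort from a preset name like the ones above could be sketched as a suffix match (this mapping and its fallback are assumptions inferred from the preset names, not ZarzCLI's actual code):

```rust
/// Guess a reasoning effort from a GPT-5 preset name.
/// Purely illustrative; ZarzCLI's real lookup may differ.
fn effort_for_preset(model: &str) -> &'static str {
    if model.ends_with("-minimal") || model == "gpt-5-nano" {
        "minimal"
    } else if model.ends_with("-low") {
        "low"
    } else if model.ends_with("-high") {
        "high"
    } else if model.ends_with("-medium") {
        "medium"
    } else {
        // e.g. bare gpt-5-codex or gpt-5-mini: assume a balanced default
        "medium"
    }
}

fn main() {
    println!("{}", effort_for_preset("gpt-5-codex-high"));
    println!("{}", effort_for_preset("gpt-5-minimal"));
}
```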
GLM (Z.AI)
Cost-effective coding with 200K context window:
- glm-4.6 ($3/month subscription)
- 200,000 token context window
- Specialized for coding tasks
See MODELS.md for full model list and GLM-PROVIDER.md for GLM setup.
Advanced Features
MCP (Model Context Protocol)
ZarzCLI supports MCP servers for extended capabilities. Configure in ~/.zarz/config.toml:
[[mcp_servers]]
name = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
Bash Tool Integration
AI models can automatically execute bash commands when they need context:
> Tell me about the authentication implementation
# AI automatically executes:
$ find . -name "*auth*" -type f
$ grep -r "authenticate" src/
$ cat src/auth/login.rs
# Then provides informed response based on actual codebase
Automatic Updates
ZarzCLI automatically checks for updates on startup and notifies you when a new version is available. Updates are downloaded from npm registry and can be installed with a single command.
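An update check like the one described usually reduces to comparing the installed version against the latest one published on npm. A minimal sketch (not ZarzCLI's actual updater; it ignores semver pre-release precedence for simplicity):

```rust
/// Parse "MAJOR.MINOR.PATCH" into a comparable tuple, ignoring any leading
/// "v" and any pre-release suffix such as "-alpha". Illustrative only.
fn parse_semver(v: &str) -> Option<(u64, u64, u64)> {
    let core = v.trim_start_matches('v').split('-').next()?;
    let mut parts = core.split('.').map(|p| p.parse::<u64>().ok());
    Some((parts.next()??, parts.next()??, parts.next()??))
}

/// True when `latest` is strictly newer than `installed`.
fn update_available(installed: &str, latest: &str) -> bool {
    match (parse_semver(installed), parse_semver(latest)) {
        // Tuples compare lexicographically: major, then minor, then patch
        (Some(a), Some(b)) => b > a,
        _ => false, // unparseable version: fail closed, no update prompt
    }
}

fn main() {
    println!("{}", update_available("0.5.1-alpha", "0.6.0"));
}
```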
Requirements
- Node.js 14.0.0 or higher
- API Key from one of: Anthropic (Claude), OpenAI (GPT), or Z.AI (GLM)
Note: Rust is NOT required for installation. Pre-built binaries are automatically downloaded for your platform (Windows, macOS, Linux).
References workspace
Need to keep upstream repos (Codex, plugins, docs) handy? Drop them into References/:
git clone https://github.com/openai/codex References/codex-main
The folder is .gitignored, so you can mirror large sources locally without polluting commits. ZarzCLI’s Codex instructions and login success page were generated from those references.
Contributing
Contributions are welcome! ZarzCLI is now open source under MIT license.
Development Setup
For contributors who want to modify the source code:
Requirements:
- Node.js 14.0.0 or higher
- Rust toolchain (install from rustup.rs)
# Clone the repository
git clone https://github.com/zarzet/ZarzCLI.git
cd ZarzCLI
# Build the project
cargo build --release
# Run tests
cargo test
# Install locally for testing
npm install -g .
Note: Regular users don't need Rust installed. Pre-built binaries are automatically downloaded during npm install.
Contribution Guidelines
- Fork the repository
- Create a feature branch (
git checkout -b feature/amazing-feature) - Commit your changes (
git commit -m 'Add amazing feature') - Push to the branch (
git push origin feature/amazing-feature) - Open a Pull Request
Please ensure:
- Code compiles without warnings
- Tests pass
- Code follows the existing style
- Documentation is updated as needed
License
MIT License - see LICENSE file for details.
Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Author: @zarzet
Made with love by zarzet
