
zarz v0.5.1-alpha

Fast AI coding assistant for the terminal, built with Rust

ZarzCLI


ZarzCLI is a blazingly fast AI coding assistant for your terminal, built with Rust for maximum performance. It brings the power of Claude, GPT, and GLM models directly to your command line with intelligent context awareness and autonomous tool execution.

Features

Core Capabilities

  • Interactive Chat - Real-time streaming responses with multiple AI models
  • Multi-Provider Support - Claude (Anthropic), GPT (OpenAI), and GLM (Z.AI)
  • Built-In Tools - read_file, list_dir, grep_files, and apply_patch run natively
  • File Operations - Direct file editing, creation, and management
  • Smart Context - Automatic symbol search and relevant file detection
  • MCP Support - Model Context Protocol integration for extended capabilities
  • ChatGPT OAuth (Codex backend) - /login signs into ChatGPT Plus/Pro, tokens auto-refresh, and GPT‑5 presets call the same Codex endpoint as OpenAI’s CLI
  • Auto Update - Automatic update checks and notifications for new versions
  • Cross-Platform - Works seamlessly on Windows, Linux, and macOS

Intelligent Context Understanding

ZarzCLI v0.3.4+ includes autonomous bash tool execution, allowing AI models to:

  • Search files: find . -name "*.rs" or rg "pattern"
  • Read contents: cat src/main.rs or head -n 20 file.py
  • Grep code: grep -r "function_name" src/
  • Navigate structure: ls -la src/ or tree -L 2
  • Check git: git log --oneline -10 or git diff

User Experience

  • Status Line - Shows current mode and notifications
  • Double Ctrl+C - Confirmation before exit (prevents accidental exits)
  • Colored Diff Display - Beautiful file change visualization with context
  • Exploration Logs - File reads, directory listings, and searches are summarized concisely (no more full file dumps unless requested)
  • Persistent Sessions - Resume previous conversations anytime

Built-In Tools

ZarzCLI now ships with a built-in tool set:

| Tool | Description |
|------|-------------|
| read_file | Reads files with optional line slices; stdout just shows a summary |
| list_dir | Returns file/dir counts with a short preview instead of dumping everything |
| grep_files | Greps inside a file (simple substring match) |
| apply_patch | Applies Zarz-style *** Begin Patch diffs directly on disk |

These tools run natively in Rust, so the terminal output is clean and the model still receives full context in the background.

No more artificial tool-call limits – ZarzCLI lets the agent keep digging until it’s satisfied. The old “Stopping tool execution after 3 calls” guardrails have been removed.
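
For reference, a minimal sketch of a patch the apply_patch tool consumes, assuming it mirrors the Codex-style *** Begin Patch envelope the project references (the exact syntax may differ):

*** Begin Patch
*** Update File: src/main.rs
@@ fn main()
-    println!("Hello");
+    println!("Hello from ZarzCLI!");
*** End Patch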

Installation

Via NPM (Recommended)

npm install -g zarz

From Source

git clone https://github.com/zarzet/ZarzCLI.git
cd ZarzCLI
cargo build --release

Updating

ZarzCLI will automatically check for updates and notify you when a new version is available. To update manually:

npm update -g zarz
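
To confirm which version you have installed, or what the latest published release is, npm itself can report both; for example:

# show the globally installed version
npm ls -g zarz

# show the latest version published to the npm registry
npm view zarz version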

Quick Start

First Run Setup

On first run, you'll be prompted to enter your API keys interactively:

zarz

Or set manually via environment variables:

# For Anthropic Claude
export ANTHROPIC_API_KEY=sk-ant-...

# For OpenAI GPT
export OPENAI_API_KEY=sk-...

# For GLM (Z.AI)
export GLM_API_KEY=...

zarz

Your API keys are securely stored in ~/.zarz/config.toml.
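
If you prefer environment variables over the stored config, one common approach is to persist them in your shell profile so they are available on every run; a minimal sketch for bash (the key value is a placeholder):

# add the key to your shell profile, then reload it
echo 'export ANTHROPIC_API_KEY=sk-ant-...' >> ~/.bashrc
source ~/.bashrc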

Basic Usage

# Start interactive chat (default)
zarz

# Quick one-shot question
zarz --message "fix this bug"

# Use specific model
zarz --model claude-sonnet-4-5-20250929

# Manage configuration
zarz config --show     # Show current config
zarz config --reset    # Reconfigure API keys
zarz config --login-chatgpt  # Sign in via ChatGPT OAuth to fetch an OpenAI key

ChatGPT OAuth (Codex-compatible)

When you run zarz config --login-chatgpt or /login → “Sign in with ChatGPT”, ZarzCLI mirrors the official Codex CLI flow:

  1. OpenAI’s OAuth screen appears (PKCE + originator=zarz_cli).
  2. After you approve, ZarzCLI shows the “Signed in to ZarzCLI” success page.
  3. The CLI stores the returned access_token, refresh_token, id_token, plus project_id, organization_id, and chatgpt_account_id in ~/.zarz/config.toml.
  4. Before every run, ZarzCLI automatically refreshes the token if it’s near expiry and exports the refreshed credentials for that session.

All GPT‑5 presets then hit the ChatGPT Codex backend (OpenAI-Beta: responses=experimental, originator: codex_cli_rs) with the official Codex instructions so behavior matches OpenAI’s CLI exactly.
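
As a quick sketch, the whole flow uses only the commands documented above:

# start the ChatGPT OAuth flow; a browser window opens for approval
zarz config --login-chatgpt

# or, from inside an interactive session, open the auth wizard
/login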

Available Commands

Once inside the interactive chat:

| Command | Description |
|---------|-------------|
| /help | Show all available commands |
| /apply | Apply pending file changes |
| /diff | Show pending changes with colored diff |
| /undo | Clear pending changes |
| /edit <file> | Load a file for editing |
| /search <symbol> | Search for a symbol in codebase |
| /context <query> | Find relevant files for a query |
| /files | List currently loaded files |
| /model <name> | Switch to a different AI model |
| /login | Open auth wizard (API keys or ChatGPT OAuth) |
| /mcp | Show MCP servers and available tools |
| /resume | Resume a previous chat session |
| /clear | Clear conversation history |
| /exit | Exit the session |

Supported AI Models

Anthropic Claude

Best for coding tasks and autonomous agents:

  • claude-sonnet-4-5-20250929 (Latest, most capable)
  • claude-haiku-4-5 (Fast, cost-effective)
  • claude-opus-4-1 (Most powerful)

OpenAI GPT (ChatGPT OAuth)

Run zarz config --login-chatgpt to fetch an OpenAI key, then choose any of these GPT‑5 variants optimized for OAuth access:

  • gpt-5-codex – Default coding agent
  • gpt-5-codex-low – Lower reasoning effort
  • gpt-5-codex-medium – Balanced reasoning depth
  • gpt-5-codex-high – High reasoning effort with detailed summaries
  • gpt-5-minimal – Minimal reasoning, terse responses
  • gpt-5-low – Low-effort general GPT-5
  • gpt-5-medium – Balanced GPT-5 experience
  • gpt-5-high – High reasoning-effort GPT-5
  • gpt-5-mini – Lightweight GPT-5 for quick tasks
  • gpt-5-nano – Fastest GPT-5 tier with minimal reasoning

When you run /model gpt-5-*, ZarzCLI now prompts you to pick a reasoning effort (Auto, Minimal, Low, Medium, High). The choice is saved to ~/.zarz/config.toml and applied to every Responses API call along with text.verbosity = "medium" and include = ["reasoning.encrypted_content"], matching the Codex OAuth defaults.
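
For example, a typical way to pick a GPT‑5 preset, either at launch or mid-session:

# launch directly with a GPT-5 preset
zarz --model gpt-5-codex

# or switch inside the chat; ZarzCLI then asks for a reasoning effort
/model gpt-5-high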

GLM (Z.AI)

Cost-effective coding with 200K context window:

  • glm-4.6 ($3/month subscription)
  • 200,000 token context window
  • Specialized for coding tasks

See MODELS.md for full model list and GLM-PROVIDER.md for GLM setup.
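
As a short sketch, using GLM once the key is set (model name as listed above):

# configure the Z.AI key and start a session on GLM
export GLM_API_KEY=...
zarz --model glm-4.6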

Advanced Features

MCP (Model Context Protocol)

ZarzCLI supports MCP servers for extended capabilities. Configure in ~/.zarz/config.toml:

[[mcp_servers]]
name = "filesystem"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/project"]
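
After adding an entry, the /mcp command inside the chat lists the configured servers and the tools they expose; for example:

# start a session, then inspect MCP servers from the chat
zarz
/mcp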

Bash Tool Integration

AI models can automatically execute bash commands when they need context:

> Tell me about the authentication implementation

# AI automatically executes:
$ find . -name "*auth*" -type f
$ grep -r "authenticate" src/
$ cat src/auth/login.rs

# Then provides informed response based on actual codebase

Automatic Updates

ZarzCLI automatically checks for updates on startup and notifies you when a new version is available. Updates are downloaded from npm registry and can be installed with a single command.

Requirements

Note: Rust is NOT required for installation. Pre-built binaries are automatically downloaded for your platform (Windows, macOS, Linux).

References workspace

Need to keep upstream repos (Codex, plugins, docs) handy? Drop them into References/:

git clone https://github.com/openai/codex References/codex-main

The folder is .gitignored, so you can mirror large sources locally without polluting commits. ZarzCLI’s Codex instructions and login success page were generated from those references.

Contributing

Contributions are welcome! ZarzCLI is now open source under the MIT license.

Development Setup

For contributors who want to modify the source code:

Requirements: Git, a recent Rust toolchain (cargo), and Node.js with npm.

# Clone the repository
git clone https://github.com/zarzet/ZarzCLI.git
cd ZarzCLI

# Build the project
cargo build --release

# Run tests
cargo test

# Install locally for testing
npm install -g .

Note: Regular users don't need Rust installed. Pre-built binaries are automatically downloaded during npm install.

Contribution Guidelines

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Please ensure:

  • Code compiles without warnings
  • Tests pass
  • Code follows the existing style
  • Documentation is updated as needed

License

MIT License - see LICENSE file for details.

Support


Made with love by zarzet