Wren Coder

Wren Coder is a model-agnostic command-line AI workflow tool forked from Qwen CLI Coder, which was itself a fork of Gemini CLI (see this document for more details). It currently supports OpenAI-compatible APIs, with plans to expand support to non-OpenAI-compatible models in the future.

> [!WARNING]
> Wren Coder may issue multiple API calls per cycle, resulting in higher token usage, similar to Claude Code. We’re actively working to enhance API efficiency and improve the overall developer experience.
Key Features
- Model Agnostic - Currently supports OpenAI-compatible APIs with plans for broader model support
- Code Understanding & Editing - Query and edit large codebases beyond traditional context window limits
- Workflow Automation - Automate operational tasks like handling pull requests and complex rebases
- Enhanced Tool Support - Comprehensive tool ecosystem for file operations, shell commands, and web integration
Quick Start
Prerequisites
Ensure you have Node.js version 20 or higher installed.
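You can check the installed version with:

```bash
node -v   # should report v20.0.0 or newer
```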
```bash
curl -qL https://www.npmjs.com/install.sh | sh
```

Installation

```bash
npm install -g @wren-coder/wren-coder-cli
wren --version
```

Then run from anywhere:

```bash
wren
```

Or you can install it from source:
```bash
git clone https://github.com/wren-coder/wren-coder-cli.git
cd wren-coder-cli
npm install
npm install -g .
```

API Configuration
Wren Coder currently works with OpenAI-compatible APIs. Configure your API settings using environment variables or a .env file in your project root.
```bash
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="your_api_base_url_here" # e.g., https://api.openai.com/v1
export OPENAI_MODEL="your_model_name_here"      # e.g., gpt-4, deepseek-coder, etc.
```
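The same settings can also live in a .env file at your project root. A minimal sketch (the variable names match the exports above; the values shown are placeholders to replace with your own):

```bash
# .env (placeholder values; substitute your key, endpoint, and model)
OPENAI_API_KEY=your_api_key_here
OPENAI_BASE_URL=https://api.openai.com/v1
OPENAI_MODEL=gpt-4
```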
Currently Supported (OpenAI-Compatible):
- OpenAI (GPT-4, GPT-3.5)
- DeepSeek (deepseek-coder, deepseek-chat)
- Anthropic (via OpenAI-compatible proxy)
- Local models (Ollama, vLLM, etc.; see the example below)
- Any OpenAI-compatible API endpoint
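For example, to point Wren Coder at a local Ollama server, here is a sketch assuming Ollama's OpenAI-compatible endpoint on its default port and a locally pulled model (adjust the URL and model name for your setup):

```bash
# Local Ollama setup (assumed defaults; local servers typically ignore the API key)
export OPENAI_API_KEY="ollama"
export OPENAI_BASE_URL="http://localhost:11434/v1"
export OPENAI_MODEL="qwen2.5-coder"
```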
Future Support Planned:
- Native Ollama integration
- Direct Anthropic API support
- Additional model providers and protocols
See our ROADMAP.md for detailed plans on expanding model support.
Usage Examples
Explore Codebases
```bash
cd your-project/
wren
```

> Describe the main pieces of this system's architecture

Code Development

> Refactor this function to improve readability and performance

Automate Workflows

> Analyze git commits from the last 7 days, grouped by feature and team member

> Convert all images in this directory to PNG format

Popular Tasks
Understand New Codebases
> What are the core business logic components?
> What security mechanisms are in place?
> How does the data flow work?

Code Refactoring & Optimization
> What parts of this module can be optimized?
> Help me refactor this class to follow better design patterns
> Add proper error handling and logging

Documentation & Testing
> Generate comprehensive JSDoc comments for this function
> Write unit tests for this component
> Create API documentation

Benchmark Results
Terminal-Bench
| Agent | Model | Accuracy |
| ----- | ----- | -------- |
Project Structure
```
wren-coder/
├── packages/                  # Core packages
│   ├── cli/                   # Command-line interface
│   ├── core/                  # Core functionality
│   └── vscode-ide-companion/  # VS Code extension
├── docs/                      # Documentation
├── examples/                  # Example code
└── tests/                     # Test files
```

For detailed development plans and upcoming features, see our ROADMAP.md.
Development & Contributing
See CONTRIBUTING.md to learn how to contribute to the project.
Troubleshooting
If you encounter issues, check the troubleshooting guide.
Acknowledgments
This project is a fork of Qwen CLI Coder, which was originally forked from Google Gemini CLI. We acknowledge and appreciate the excellent work of both the Gemini CLI team and the Qwen team. Our main contribution focuses on creating a model-agnostic solution with enhanced tool support and improved compatibility across different AI providers.
