@vnet/vnet-cli v0.0.1
# Vnet CLI

Vnet CLI is a command-line AI workflow tool adapted from Gemini CLI (please refer to that document for more details), optimized for multiple AI models with enhanced parser and tool support.
> [!WARNING]
> Vnet CLI may issue multiple API calls per cycle, resulting in higher token usage, similar to Claude Code. We're actively working to improve API efficiency and the overall developer experience. ModelScope offers 2,000 free API calls if you are in mainland China; see the API Configuration section for details.
## Key Features
- Code Understanding & Editing - Query and edit large codebases beyond traditional context window limits
- Workflow Automation - Automate operational tasks like handling pull requests and complex rebases
- Enhanced Parser - Adapted parser specifically optimized for Vnet CLI models
## Quick Start

### Prerequisites
Ensure you have Node.js version 20 or higher installed.
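A quick way to check the major version from the shell (the sample version string below is a placeholder; on a real machine substitute the output of `node -v`):

```bash
# Minimal sketch: check a Node.js version string against the v20 minimum.
# On a real machine, use: version="$(node -v)"
version="v20.11.1"          # placeholder sample value
major="${version#v}"        # strip the leading "v"
major="${major%%.*}"        # keep only the major component
if [ "$major" -ge 20 ]; then
  echo "Node.js major version $major meets the requirement"
else
  echo "Node.js 20 or higher is required (found $major)" >&2
fi
```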
If you also need npm, you can install it with:

```bash
curl -qL https://www.npmjs.com/install.sh | sh
```

### Installation
```bash
npm install -g @vnet/vnet-cli
vnet --version
```

Then run from anywhere:

```bash
vnet
```

Or you can install it from source:
```bash
git clone https://github.com/litianc/vnet-cli.git
cd vnet-cli
npm install
npm install -g .
```

## API Configuration
Set your Qwen API key. In a Vnet CLI project, you can also set the API key in a `.env` file; place the `.env` file in the root directory of your current project.
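For example, a `.env` file in the project root might look like this (standard dotenv format, no `export` keyword; the key and model values are placeholders):

```bash
# .env — loaded from the project root (values are placeholders)
OPENAI_API_KEY=your_api_key_here
OPENAI_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
OPENAI_MODEL=qwen3-coder-plus
```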
⚠️ Notice: If you are in mainland China, apply for your API key at https://bailian.console.aliyun.com/ or https://modelscope.cn/docs/model-service/API-Inference/intro. If you are not in mainland China, apply for your API key at https://modelstudio.console.alibabacloud.com/.
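The configurations below differ only in the base URL and model name. As a minimal sketch, you could switch between the mainland-China and international DashScope endpoints with a shell variable (the `REGION` variable here is purely illustrative, not something Vnet CLI reads):

```bash
# Illustrative only: pick a DashScope endpoint by region.
# REGION is a hypothetical convenience variable, not a Vnet CLI feature.
REGION="cn"   # set to "intl" outside mainland China

if [ "$REGION" = "cn" ]; then
  export OPENAI_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
else
  export OPENAI_BASE_URL="https://dashscope-intl.aliyuncs.com/compatible-mode/v1"
fi
export OPENAI_MODEL="qwen3-coder-plus"
echo "Using endpoint: $OPENAI_BASE_URL"
```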
If you are in mainland China, you can use Qwen3-Coder through the Alibaba Cloud Bailian platform:
```bash
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
export OPENAI_MODEL="qwen3-coder-plus"
```

If you are in mainland China, ModelScope offers 2,000 free model inference API calls per day:
```bash
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://api-inference.modelscope.cn/v1"
export OPENAI_MODEL="Qwen/Qwen3-Coder-480B-A35B-Instruct"
```

If you are not in mainland China, you can use Qwen3-Coder through the Alibaba Cloud Model Studio platform:
```bash
export OPENAI_API_KEY="your_api_key_here"
export OPENAI_BASE_URL="https://dashscope-intl.aliyuncs.com/compatible-mode/v1"
export OPENAI_MODEL="qwen3-coder-plus"
```

## Usage Examples
### Explore Codebases

```bash
cd your-project/
vnet
> Describe the main pieces of this system's architecture
```

### Code Development
```bash
> Refactor this function to improve readability and performance
```

### Automate Workflows

```bash
> Analyze git commits from the last 7 days, grouped by feature and team member
```

```bash
> Convert all images in this directory to PNG format
```

## Popular Tasks
### Understand New Codebases

```bash
> What are the core business logic components?
> What security mechanisms are in place?
> How does the data flow work?
```

### Code Refactoring & Optimization
```bash
> What parts of this module can be optimized?
> Help me refactor this class to follow better design patterns
> Add proper error handling and logging
```

### Documentation & Testing
```bash
> Generate comprehensive JSDoc comments for this function
> Write unit tests for this component
> Create API documentation
```

## Benchmark Results
### Terminal-Bench

| Agent    | Model              | Accuracy (%) |
| -------- | ------------------ | ------------ |
| Vnet CLI | Qwen3-Coder-480A35 | 37.5         |
## Project Structure

```
vnet-cli/
├── packages/   # Core packages
├── docs/       # Documentation
├── examples/   # Example code
└── tests/      # Test files
```

## Development & Contributing
See CONTRIBUTING.md to learn how to contribute to the project.
## Troubleshooting
If you encounter issues, check the troubleshooting guide.
## Acknowledgments
This project is based on Google Gemini CLI. We acknowledge and appreciate the excellent work of the Gemini CLI team. Our main contribution focuses on parser-level adaptations to better support Vnet CLI models.
