# CoStrict CLI

> **Note**: CoStrict CLI is forked from [Gemini CLI](https://github.com/google-gemini/gemini-cli) by Google, with enhanced features and customizations.

CoStrict CLI is an open-source AI agent that brings the power of AI models directly into your terminal. It provides lightweight access to various LLM providers including Gemini, OpenAI, Claude, and more, giving you the most direct path from your prompt to AI assistance.
## 🚀 Why CoStrict CLI?
- 🎯 Multi-Provider Support: Works with Gemini, OpenAI, Claude, local models, and more.
- 🔌 Custom LLM Support: Use any OpenAI-compatible API (GPT-4, Claude, local models, Chinese providers).
- 🧠 Powerful Models: Access to Gemini 2.5 Pro, GPT-4, Claude 3.5, and other cutting-edge models.
- 🔧 Built-in tools: File operations, shell commands, web fetching, and more.
- 🔌 Extensible: MCP (Model Context Protocol) support for custom integrations.
- 💻 Terminal-first: Designed for developers who live in the command line.
- 🛡️ Open source: Apache 2.0 licensed.
## 📦 Installation

### Prerequisites

- Node.js version 20 or higher
- macOS, Linux, or Windows

### Quick Install

Install globally with npm:

```bash
npm install -g @zgsm/costrict-cli
```

> **Note**: The package includes native dependencies that may require compilation tools on some systems. If you encounter installation issues, refer to our Installation Guide for platform-specific instructions.
Or build from source (see Local Development section).
### Platform-Specific Notes
- Windows: May require Visual Studio Build Tools for native module compilation
- macOS: May require Xcode Command Line Tools
- Linux: May require build-essential and python3
For detailed troubleshooting and installation instructions, see INSTALL_GUIDE.md.
## Release Notes
This fork maintains compatibility with the original Gemini CLI while adding enhanced features for custom LLM support and additional integrations.
## 📋 Key Features

### Code Understanding & Generation
- Query and edit large codebases
- Generate new apps from PDFs, images, or sketches using multimodal capabilities
- Debug issues and troubleshoot with natural language
### Automation & Integration
- Automate operational tasks like querying pull requests or handling complex rebases
- Use MCP servers to connect new capabilities, including media generation with Imagen, Veo or Lyria
- Run non-interactively in scripts for workflow automation
### Advanced Capabilities
- Ground your queries with built-in Google Search for real-time information
- Conversation checkpointing to save and resume complex sessions
- Custom context files (COSTRICT.md) to tailor behavior for your projects (a minimal sketch follows this list)
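For illustration: a COSTRICT.md file is free-form Markdown that CoStrict CLI reads as persistent project context. The contents below are a hypothetical example, not a required schema:

```markdown
# Project context for my-service (hypothetical example)

## Conventions
- TypeScript with strict mode; avoid `any`
- All database access goes through src/db/repository.ts

## Useful commands
- Build: npm run build
- Test: npm test
```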
### GitHub Integration

Integrate CoStrict CLI directly into your GitHub workflows with the Gemini CLI GitHub Action:
- Pull Request Reviews: Automated code review with contextual feedback and suggestions
- Issue Triage: Automated labeling and prioritization of GitHub issues based on content analysis
- On-demand Assistance: Mention `@costrict-cli` in issues and pull requests for help with debugging, explanations, or task delegation
- Custom Workflows: Build automated, scheduled, and on-demand workflows tailored to your team's needs
## 🔐 Authentication Options

Choose the authentication method that best fits your needs:

### Option 1: Custom LLM (OpenAI Protocol)
✨ Best for: Using alternative LLM providers or self-hosted models
Benefits:
- Flexibility: Use any OpenAI-compatible API (GPT-4, Claude via OpenRouter, local models, etc.)
- Cost control: Choose providers based on pricing
- Privacy: Run models locally or on your own infrastructure
- Provider choice: Access to hundreds of models through services like OpenRouter
```bash
# Example: Using OpenAI GPT-4
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://api.openai.com/v1"
export CUSTOM_LLM_MODEL_NAME="gpt-4"
costrict-cli
```

Supported providers:
- OpenAI, Azure OpenAI, OpenRouter
- Self-hosted: Ollama, vLLM, text-generation-webui
- Chinese providers: Doubao, Qwen, Kimi, DeepSeek, etc.
See the Custom LLM Integration Guide for detailed setup instructions.
### Option 2: Login with Google (OAuth login using your Google Account)

✨ Best for: Individual developers and anyone with a Gemini Code Assist License (see quota limits and terms of service for details).
Benefits:
- Free tier: 60 requests/min and 1,000 requests/day
- Gemini 2.5 Pro with 1M token context window
- No API key management - just sign in with your Google account
- Automatic updates to latest models
Start CoStrict CLI, then choose Login with Google and follow the browser authentication flow when prompted:

```bash
costrict-cli
```

If you are using a paid Code Assist License from your organization, remember to set your Google Cloud Project:

```bash
# Set your Google Cloud Project
export GOOGLE_CLOUD_PROJECT="YOUR_PROJECT_ID"
costrict-cli
```

### Option 3: Gemini API Key
✨ Best for: Developers who need specific model control or paid tier access
Benefits:
- Free tier: 100 requests/day with Gemini 2.5 Pro
- Model selection: Choose specific Gemini models
- Usage-based billing: Upgrade for higher limits when needed
```bash
# Get your key from https://aistudio.google.com/apikey
export COSTRICT_API_KEY="YOUR_API_KEY"
costrict-cli
```

### Option 4: Vertex AI
✨ Best for: Enterprise teams and production workloads
Benefits:
- Enterprise features: Advanced security and compliance
- Scalable: Higher rate limits with billing account
- Integration: Works with existing Google Cloud infrastructure
```bash
# Get your key from Google Cloud Console
export GOOGLE_API_KEY="YOUR_API_KEY"
export GOOGLE_GENAI_USE_VERTEXAI=true
costrict-cli
```

### Configuration Details
All authentication credentials and settings are stored in:
`~/.costrict/settings.json`

You can also use environment variables for configuration, which take precedence over the settings file.
For Google Workspace accounts and other authentication methods, see the authentication guide.
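As a quick sketch of that precedence (using the Custom LLM variables covered in the next section; the overriding model name is purely illustrative), a one-off override looks like this:

```bash
# settings.json may pin "modelName": "gpt-4"; the environment variable
# takes precedence for this invocation only (illustrative model name):
CUSTOM_LLM_MODEL_NAME="qwen-plus" costrict-cli -p "Summarize this repo"
```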
## 🔌 Using Custom/Third-Party Models

CoStrict CLI supports any OpenAI-compatible API, allowing you to use alternative LLM providers or self-hosted models.

### Quick Setup

Set these environment variables before running `costrict-cli`:

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://api.provider.com/v1"
export CUSTOM_LLM_MODEL_NAME="model-name"
```

### Provider Examples
#### OpenAI GPT-4

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://api.openai.com/v1"
export CUSTOM_LLM_MODEL_NAME="gpt-4"
costrict-cli
```

#### Claude via OpenRouter

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-or-v1-..."
export CUSTOM_LLM_ENDPOINT="https://openrouter.ai/api/v1"
export CUSTOM_LLM_MODEL_NAME="anthropic/claude-3.5-sonnet"
costrict-cli
```

#### Ollama (Local)

```bash
# Start Ollama first: ollama run llama3
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="ollama"  # Can be any string
export CUSTOM_LLM_ENDPOINT="http://localhost:11434/v1"
export CUSTOM_LLM_MODEL_NAME="llama3"
costrict-cli
```

#### Chinese Providers

##### Zhipu AI (GLM)

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://open.bigmodel.cn/api/paas/v4"
export CUSTOM_LLM_MODEL_NAME="GLM-4-Flash"
costrict-cli
```

##### Doubao (ByteDance)

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://ark.cn-beijing.volces.com/api/v3"
export CUSTOM_LLM_MODEL_NAME="doubao-pro-32k"
costrict-cli
```

##### Qwen (Alibaba)

```bash
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://dashscope.aliyuncs.com/compatible-mode/v1"
export CUSTOM_LLM_MODEL_NAME="qwen-plus"
costrict-cli
```

### Advanced Configuration
#### Optional Parameters

```bash
# Set temperature (0.0-2.0)
export CUSTOM_LLM_TEMPERATURE="0.7"

# Set maximum tokens
export CUSTOM_LLM_MAX_TOKENS="4096"

# Set top-p sampling
export CUSTOM_LLM_TOP_P="0.95"
```

#### Persistent Configuration

Save settings to `~/.costrict/settings.json`:

```json
{
  "useCustomLLM": true,
  "customLLM": {
    "apiKey": "your-api-key",
    "endpoint": "https://api.provider.com/v1",
    "modelName": "gpt-4",
    "temperature": 0.7,
    "maxTokens": 4096
  }
}
```

> **Note**: Environment variables take precedence over settings in `settings.json`.
### Troubleshooting Custom Models

If you encounter issues:

- Check endpoint format: Must end with `/v1` for OpenAI compatibility
- Verify API key: Ensure it's valid and has the correct permissions
- Test connection: Use `curl` to verify the endpoint works (see the sketch after this list)
- Enable debug logging: Set the `COSTRICT_DEBUG_LOG_FILE` environment variable
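For the connection test, a quick sanity check is to list the endpoint's models. This is a sketch assuming the endpoint follows the standard OpenAI-compatible REST layout, where `GET /v1/models` is available:

```bash
# A valid endpoint and key should return a JSON list of models
curl -H "Authorization: Bearer $CUSTOM_LLM_API_KEY" \
  "$CUSTOM_LLM_ENDPOINT/models"
```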
To enable debug logging:

```bash
export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-debug.log
costrict-cli
```

See the Custom LLM Integration Guide for more details.
## 🚀 Getting Started

### Basic Usage

Start in the current directory:

```bash
costrict-cli
```

Include multiple directories:

```bash
costrict-cli --include-directories ../lib,../docs
```

Use a specific model:

```bash
costrict-cli -m gemini-2.5-flash
```

#### Non-interactive mode for scripts

Get a simple text response:

```bash
costrict-cli -p "Explain the architecture of this codebase"
```

For more advanced scripting, including how to parse JSON and handle errors, use the `--output-format json` flag to get structured output:

```bash
costrict-cli -p "Explain the architecture of this codebase" --output-format json
```

For real-time event streaming (useful for monitoring long-running operations), use `--output-format stream-json` to get newline-delimited JSON events:

```bash
costrict-cli -p "Run tests and deploy" --output-format stream-json
```
### Quick Examples

Start a new project:

```bash
cd new-project/
costrict-cli
> Write me a Discord bot that answers questions using a FAQ.md file I will provide
```

Analyze existing code:
```bash
git clone https://github.com/zgsm-ai/costrict-cli
cd costrict-cli
costrict-cli
> Give me a summary of all of the changes that went in yesterday
```

## 📚 Documentation
### Getting Started
- Quickstart Guide - Get up and running quickly.
- Authentication Setup - Detailed auth configuration.
- Configuration Guide - Settings and customization.
- Custom LLM Integration - Use third-party models.
- Keyboard Shortcuts - Productivity tips.
### Core Features

- Commands Reference - All slash commands (`/help`, `/chat`, etc.).
- Custom Commands - Create your own reusable commands.
- Context Files (COSTRICT.md) - Provide persistent context to CoStrict CLI.
- Checkpointing - Save and resume conversations.
- Token Caching - Optimize token usage.
Tools & Extensions
- Built-in Tools Overview
- MCP Server Integration - Extend with custom tools.
- Custom Extensions - Build and share your own commands.
### Advanced Topics
- Headless Mode (Scripting) - Use CoStrict CLI in automated workflows.
- Architecture Overview - How CoStrict CLI works.
- IDE Integration - VS Code companion.
- Sandboxing & Security - Safe execution environments.
- Trusted Folders - Control execution policies by folder.
- Enterprise Guide - Deploy and manage in a corporate environment.
- Telemetry & Monitoring - Usage tracking.
- Tools API Development - Create custom tools.
- Local development - Local development tooling.
Troubleshooting & Support
- Troubleshooting Guide - Common issues and solutions.
- FAQ - Frequently asked questions.
- Use
/bugcommand to report issues directly from the CLI.
### Using MCP Servers

Configure MCP servers in `~/.costrict/settings.json` to extend CoStrict CLI with custom tools.
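A minimal sketch of such a configuration, reusing the `github` server entry from the example `settings.json` shown later in this README:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```

Once configured, invoke a server's tools with `@`-mentions: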
```text
> @github List my open pull requests
> @slack Send a summary of today's commits to #dev channel
> @database Run a query to find inactive users
```

See the MCP Server Integration guide for setup instructions.
## 🛠️ Local Development

### Setting Up Development Environment

#### Prerequisites

- Node.js 20.19.2 or higher (check with `node --version`)
- npm package manager
- Git

#### Clone and Install

```bash
# Clone the repository
git clone https://github.com/zgsm-ai/costrict-cli.git
cd costrict-cli

# Install dependencies
npm install

# Build the project
npm run build
```

### Development Workflow
#### Build and Link for Local Testing

To test your changes locally without publishing:

```bash
# Build the project
npm run build

# Link globally (makes the 'costrict-cli' command use your local version)
npm link

# Now run CoStrict CLI with your changes
costrict-cli
```

After linking, any `costrict-cli` command will use your local development version.
#### Watch Mode for Development

For continuous development with automatic rebuilds:

```bash
# Start watch mode (rebuilds on file changes)
npm run build:watch

# In another terminal, test your changes
costrict-cli
```

#### Unlink Development Version
When you're done developing and want to use the published version:

```bash
# Unlink local version
npm unlink -g costrict-cli

# Install published version
npm install -g costrict-cli
```

### Project Structure
```text
costrict-cli/
├── packages/
│   ├── cli/                  # CLI application (entry point)
│   │   ├── src/
│   │   │   └── gemini.tsx    # Main CLI component
│   │   └── package.json
│   └── core/                 # Core functionality
│       ├── src/
│       │   ├── custom_llm/   # Custom LLM integration
│       │   ├── tools/        # Built-in tools
│       │   └── types/        # TypeScript types
│       └── package.json
├── docs/                     # Documentation
├── tests/                    # Test files
└── package.json              # Root package.json
```

### Running Tests
```bash
# Run all tests
npm test

# Run tests in watch mode
npm run test:watch

# Run specific test file
npm test -- converter.test.ts

# Run tests with coverage
npm run test:coverage
```

### Testing Custom LLM Changes
When developing custom LLM integration:

```bash
# Set up environment variables
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-test-key"
export CUSTOM_LLM_ENDPOINT="http://localhost:11434/v1"
export CUSTOM_LLM_MODEL_NAME="test-model"

# Enable debug logging
export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-dev.log

# Run your local build
npm run build && costrict-cli

# Check debug logs
tail -f /tmp/costrict-dev.log
```

### Common Development Tasks
#### Adding a New Feature

1. Create a feature branch:

   ```bash
   git checkout -b feature/my-new-feature
   ```

2. Make your changes in the appropriate package (`packages/cli` or `packages/core`).

3. Build and test:

   ```bash
   npm run build
   npm test
   npm link       # Test locally
   costrict-cli   # Try your feature
   ```

4. Commit and push:

   ```bash
   git add .
   git commit -m "feat: add new feature"
   git push origin feature/my-new-feature
   ```

#### Debugging Tips
Enable verbose logging:

```bash
export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-debug.log
export DEBUG=*
costrict-cli
```

Use the Node.js debugger:

```bash
# Add a debugger statement in your code, then:
node --inspect-brk $(which costrict-cli)
```

Check TypeScript types:

```bash
npm run check-types
```

#### Code Quality
```bash
# Run linter
npm run lint

# Auto-fix linting issues
npm run lint:fix

# Format code
npm run format
```

### Configuration File Locations
During development, CoStrict CLI uses these locations:

- Settings: `~/.costrict/settings.json`
- Cache: `~/.costrict/cache/`
- Checkpoints: `~/.costrict/checkpoints/`
- Logs: Location specified by the `COSTRICT_DEBUG_LOG_FILE` environment variable
You can modify `settings.json` to configure your development instance:

```json
{
  "useCustomLLM": true,
  "customLLM": {
    "endpoint": "http://localhost:11434/v1",
    "modelName": "llama3",
    "apiKey": "dev-key"
  },
  "telemetry": false
}
```

### Helpful Development Commands
```bash
# Clean build artifacts
npm run clean

# Full clean and rebuild
npm run clean && npm install && npm run build

# Check for outdated dependencies
npm outdated

# Update dependencies
npm update

# Publish to npm (maintainers only)
npm publish
```

### Working with Monorepo Packages
This project uses a monorepo structure with multiple packages:

```bash
# Install dependencies for all packages
npm install

# Build a specific package
npm run build --workspace=packages/core

# Run a command in a specific package
npm run test --workspace=packages/cli
```

### Tips for Contributing
- Always build before testing: `npm run build`
- Use `npm link` for local testing: Faster than reinstalling
- Enable debug logging: Helps track down issues
- Test with multiple providers: Custom LLM, Gemini API, OAuth
- Check TypeScript types: `npm run check-types` before committing
- Write tests: Add tests for new features in `tests/`
See the Local Development Guide for more advanced topics.
## Package Publishing (For Maintainers)

### Step 1: Prepare the Release Package

This will automatically execute the build and precompilation process:

```bash
npm run prepare:package
```

What it does:

- Copies files to subpackages (core, cli)
- Runs `npm run build --workspaces` - builds all subpackages first to ensure the latest versions
- Runs `npm run bundle` - builds the main package (referencing the latest subpackages)
- Runs `npm run prebuild` - prepares precompiled binary files
### Step 2: Verify Package Contents

```bash
# Test packaging (won't create actual files)
npm pack --dry-run

# Actually package
npm pack
```

Check the generated `.tgz` file:

```bash
tar -tzf zgsm-costrict-cli-0.0.2.tgz | grep -E "(bundle|prebuilds|README|LICENSE)"
```

You should see:

- `package/bundle/gemini.js` - Main program
- `package/prebuilds/darwin-arm64/` - macOS ARM precompiled files
- `package/prebuilds/darwin-x64/` - macOS Intel precompiled files
- `package/prebuilds/linux-x64/` - Linux precompiled files
- `package/prebuilds/win32-x64/` - Windows precompiled files
- `package/README.md` - Documentation
- `package/LICENSE` - License
### Step 3: Publish to npm

```bash
cd costrict-cli

# Login to npm (if not already logged in)
npm whoami || npm login

# Simulate publishing (won't actually publish)
npm publish --access public --dry-run

# After confirming everything is correct, publish for real
npm publish --access public
```

Notes:

- Scoped packages (`@zgsm/xxx`) require `--access public` to be publicly published
- Ensure you have publishing permissions for the `@zgsm` scope
### Subpackage Dependency Management

#### Problem Description

In the monorepo structure:

- The main package (the root directory's `package.json`) is bundled into `bundle/gemini.js` via esbuild
- During bundling, it references subpackages like `@zgsm/costrict-cli-core`, `@zgsm/costrict-cli`, etc.
- The bundling step must reference the latest built subpackage code
#### Solution

`prepare-package.js` is configured to execute in the following order:

```bash
# 1. First build all subpackages
npm run build --workspaces

# 2. Then build the main package (now referencing the latest subpackages)
npm run bundle

# 3. Finally prepare precompiled files
npm run prebuild
```

#### Verification Method
Check that the build order is correct:

```bash
npm run prepare:package 2>&1 | grep "Building"

# Should see output indicating:
# 1️⃣ Building all subpackages...
# ✓ Subpackages built
# 2️⃣ Building main package...
# ✓ Main package built
# 3️⃣ Preparing precompiled binaries...
```

#### Manual Verification
If you need to manually verify that subpackages are up-to-date:

```bash
# 1. Check subpackage build time
ls -la packages/core/dist/
ls -la packages/cli/dist/

# 2. Check main package build time
ls -la bundle/

# bundle/ modification time should be later than packages/*/dist/
```

## 🤝 Contributing
We welcome contributions! CoStrict CLI is fully open source (Apache 2.0), and we encourage the community to:
- Report bugs and suggest features.
- Improve documentation.
- Submit code improvements.
- Share your MCP servers and extensions.
See our Contributing Guide for development setup, coding standards, and how to submit pull requests.
## 📖 Resources
- Official Roadmap - See what's coming next.
- Changelog - See recent notable updates.
- NPM Package - Package registry.
- GitHub Repository - Source code and issues.
## Uninstall
See the Uninstall Guide for removal instructions.
## 📂 Configuration File Locations
CoStrict CLI stores its configuration and data in the following locations:
| Type            | Location                          | Description                                              |
| --------------- | --------------------------------- | -------------------------------------------------------- |
| Settings        | `~/.costrict/settings.json`       | Main configuration file (auth, custom LLM, preferences)  |
| Cache           | `~/.costrict/cache/`              | Token cache and temporary data                           |
| Checkpoints     | `~/.costrict/checkpoints/`        | Saved conversation sessions                              |
| MCP Config      | `~/.costrict/settings.json`       | MCP server configurations (in settings file)             |
| Debug Logs      | Set via `COSTRICT_DEBUG_LOG_FILE` | Development and troubleshooting logs                     |
| Project Context | `./COSTRICT.md`                   | Per-project instructions (in your project root)          |
### Example `settings.json`

```json
{
  "authMethod": "oauth",
  "useCustomLLM": false,
  "customLLM": {
    "apiKey": "",
    "endpoint": "https://api.openai.com/v1",
    "modelName": "gpt-4"
  },
  "telemetry": true,
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}
```

> **Note**: Environment variables (e.g., `USE_CUSTOM_LLM`, `CUSTOM_LLM_API_KEY`) take precedence over values in `settings.json`.
## 📄 Legal
- License: Apache License 2.0
- Terms of Service: Terms & Privacy
- Security: Security Policy
