
CoStrict CLI

Note: CoStrict CLI is forked from Gemini CLI by Google, with enhanced features and customizations.

[Screenshot: CoStrict CLI]

CoStrict CLI is an open-source AI agent that brings the power of AI models directly into your terminal. It provides lightweight access to various LLM providers including Gemini, OpenAI, Claude, and more, giving you the most direct path from your prompt to AI assistance.

🚀 Why CoStrict CLI?

  • 🎯 Multi-Provider Support: Works with Gemini, OpenAI, Claude, local models, and more.
  • 🔌 Custom LLM Support: Use any OpenAI-compatible API (GPT-4, Claude, local models, Chinese providers).
  • 🧠 Powerful Models: Access to Gemini 2.5 Pro, GPT-4, Claude 3.5, and other cutting-edge models.
  • 🔧 Built-in tools: File operations, shell commands, web fetching, and more.
  • 🔌 Extensible: MCP (Model Context Protocol) support for custom integrations.
  • 💻 Terminal-first: Designed for developers who live in the command line.
  • 🛡️ Open source: Apache 2.0 licensed.

📦 Installation

Pre-requisites before installation

  • Node.js version 20 or higher
  • macOS, Linux, or Windows

Quick Install

Install globally with npm

npm install -g @zgsm/costrict-cli

Note: The package includes native dependencies that may require compilation tools on some systems. If you encounter installation issues, please refer to our Installation Guide for platform-specific instructions.

Or build from source (see Local Development section).

Platform-Specific Notes

  • Windows: May require Visual Studio Build Tools for native module compilation
  • macOS: May require Xcode Command Line Tools
  • Linux: May require build-essential and python3

For detailed troubleshooting and installation instructions, see INSTALL_GUIDE.md.
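
If native module compilation fails, installing the platform toolchain listed above usually resolves it. For example (Debian/Ubuntu package names shown; other distros use their own package manager):

# Linux (Debian/Ubuntu): compiler toolchain and Python 3
sudo apt-get update && sudo apt-get install -y build-essential python3

# macOS: Xcode Command Line Tools
xcode-select --install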

Release Notes

This fork maintains compatibility with the original Gemini CLI while adding enhanced features for custom LLM support and additional integrations.

📋 Key Features

Code Understanding & Generation

  • Query and edit large codebases
  • Generate new apps from PDFs, images, or sketches using multimodal capabilities
  • Debug issues and troubleshoot with natural language

Automation & Integration

  • Automate operational tasks like querying pull requests or handling complex rebases
  • Use MCP servers to connect new capabilities, including media generation with Imagen, Veo or Lyria
  • Run non-interactively in scripts for workflow automation

Advanced Capabilities

  • Ground your queries with built-in Google Search for real-time information
  • Conversation checkpointing to save and resume complex sessions
  • Custom context files (COSTRICT.md) to tailor behavior for your projects
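
As an illustration, a COSTRICT.md in the project root might carry instructions like the following; the contents are hypothetical and written as a shell heredoc for convenience:

# Hypothetical example context file; tailor to your project
cat > COSTRICT.md <<'EOF'
# Project context for CoStrict CLI
- This is a TypeScript monorepo using npm workspaces.
- Prefer small, focused diffs and run npm run check-types before suggesting commits.
EOF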

GitHub Integration

Integrate CoStrict CLI directly into your GitHub workflows with the Gemini CLI GitHub Action:

  • Pull Request Reviews: Automated code review with contextual feedback and suggestions
  • Issue Triage: Automated labeling and prioritization of GitHub issues based on content analysis
  • On-demand Assistance: Mention @costrict-cli in issues and pull requests for help with debugging, explanations, or task delegation
  • Custom Workflows: Build automated, scheduled and on-demand workflows tailored to your team's needs

🔐 Authentication Options

Choose the authentication method that best fits your needs:

Option 1: Custom LLM (OpenAI Protocol)

✨ Best for: Using alternative LLM providers or self-hosted models

Benefits:

  • Flexibility: Use any OpenAI-compatible API (GPT-4, Claude via OpenRouter, local models, etc.)
  • Cost control: Choose providers based on pricing
  • Privacy: Run models locally or on your own infrastructure
  • Provider choice: Access to hundreds of models through services like OpenRouter
# Example: Using OpenAI GPT-4
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://api.openai.com/v1"
export CUSTOM_LLM_MODEL_NAME="gpt-4"
costrict-cli

Supported providers:

  • OpenAI, Azure OpenAI, OpenRouter
  • Self-hosted: Ollama, vLLM, text-generation-webui
  • Chinese providers: Doubao, Qwen, Kimi, DeepSeek, etc.

See the Custom LLM Integration Guide for detailed setup instructions.

Option 2: Login with Google (OAuth login using your Google Account)

✨ Best for: Individual developers and anyone with a Gemini Code Assist license (see quota limits and terms of service for details)

Benefits:

  • Free tier: 60 requests/min and 1,000 requests/day
  • Gemini 2.5 Pro with 1M token context window
  • No API key management - just sign in with your Google account
  • Automatic updates to latest models

Start CoStrict CLI, then choose Login with Google and follow the browser authentication flow when prompted:

costrict-cli

If you are using a paid Code Assist license from your organization, remember to set your Google Cloud project:

# Set your Google Cloud Project
export GOOGLE_CLOUD_PROJECT="YOUR_PROJECT_ID"
costrict-cli

Option 3: Gemini API Key

✨ Best for: Developers who need specific model control or paid tier access

Benefits:

  • Free tier: 100 requests/day with Gemini 2.5 Pro
  • Model selection: Choose specific Gemini models
  • Usage-based billing: Upgrade for higher limits when needed
# Get your key from https://aistudio.google.com/apikey
export COSTRICT_API_KEY="YOUR_API_KEY"
costrict-cli

Option 4: Vertex AI

✨ Best for: Enterprise teams and production workloads

Benefits:

  • Enterprise features: Advanced security and compliance
  • Scalable: Higher rate limits with billing account
  • Integration: Works with existing Google Cloud infrastructure
# Get your key from Google Cloud Console
export GOOGLE_API_KEY="YOUR_API_KEY"
export GOOGLE_GENAI_USE_VERTEXAI=true
costrict-cli

Configuration Details

All authentication credentials and settings are stored in:

~/.costrict/settings.json

You can also use environment variables for configuration, which take precedence over the settings file.

For Google Workspace accounts and other authentication methods, see the authentication guide.

🔌 Using Custom/Third-Party Models

CoStrict CLI supports any OpenAI-compatible API, allowing you to use alternative LLM providers or self-hosted models.

Quick Setup

Set these environment variables before running costrict-cli:

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://api.provider.com/v1"
export CUSTOM_LLM_MODEL_NAME="model-name"

Provider Examples

OpenAI GPT-4

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://api.openai.com/v1"
export CUSTOM_LLM_MODEL_NAME="gpt-4"
costrict-cli

Claude via OpenRouter

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-or-v1-..."
export CUSTOM_LLM_ENDPOINT="https://openrouter.ai/api/v1"
export CUSTOM_LLM_MODEL_NAME="anthropic/claude-3.5-sonnet"
costrict-cli

Ollama (Local)

# Start Ollama first: ollama run llama3
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="ollama"  # Can be any string
export CUSTOM_LLM_ENDPOINT="http://localhost:11434/v1"
export CUSTOM_LLM_MODEL_NAME="llama3"
costrict-cli

Chinese Providers

Zhipu AI (GLM)

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://open.bigmodel.cn/api/paas/v4"
export CUSTOM_LLM_MODEL_NAME="GLM-4-Flash"
costrict-cli

Doubao (ByteDance)

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-api-key"
export CUSTOM_LLM_ENDPOINT="https://ark.cn-beijing.volces.com/api/v3"
export CUSTOM_LLM_MODEL_NAME="doubao-pro-32k"
costrict-cli

Qwen (Alibaba)

export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="sk-..."
export CUSTOM_LLM_ENDPOINT="https://dashscope.aliyuncs.com/compatible-mode/v1"
export CUSTOM_LLM_MODEL_NAME="qwen-plus"
costrict-cli

Advanced Configuration

Optional Parameters

# Set temperature (0.0-2.0)
export CUSTOM_LLM_TEMPERATURE="0.7"

# Set maximum tokens
export CUSTOM_LLM_MAX_TOKENS="4096"

# Set top-p sampling
export CUSTOM_LLM_TOP_P="0.95"

Persistent Configuration

Save settings to ~/.costrict/settings.json:

{
  "useCustomLLM": true,
  "customLLM": {
    "apiKey": "your-api-key",
    "endpoint": "https://api.provider.com/v1",
    "modelName": "gpt-4",
    "temperature": 0.7,
    "maxTokens": 4096
  }
}

Note: Environment variables take precedence over settings in settings.json.
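
For example, with the settings file above in place, exporting CUSTOM_LLM_MODEL_NAME overrides modelName for the current shell session only (the model name here is illustrative):

# Overrides "modelName": "gpt-4" from settings.json for this session
export CUSTOM_LLM_MODEL_NAME="gpt-4o"
costrict-cli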

Troubleshooting Custom Models

If you encounter issues:

  1. Check endpoint format: Must end with /v1 for OpenAI compatibility
  2. Verify API key: Ensure it's valid and has correct permissions
  3. Test connection: Use curl to verify the endpoint works (see the sketch below)
  4. Enable debug logging: Set COSTRICT_DEBUG_LOG_FILE environment variable
export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-debug.log
costrict-cli
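
For step 3 above, a minimal connectivity check against an OpenAI-compatible endpoint, assuming the provider implements the standard /models listing:

# A JSON model list in the response confirms the endpoint and key work
curl -s "$CUSTOM_LLM_ENDPOINT/models" \
  -H "Authorization: Bearer $CUSTOM_LLM_API_KEY"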

See Custom LLM Integration Guide for more details.

🚀 Getting Started

Basic Usage

Start in current directory

costrict-cli

Include multiple directories

costrict-cli --include-directories ../lib,../docs

Use specific model

costrict-cli -m gemini-2.5-flash

Non-interactive mode for scripts

Get a simple text response:

costrict-cli -p "Explain the architecture of this codebase"

For more advanced scripting, including how to parse JSON and handle errors, use the --output-format json flag to get structured output:

costrict-cli -p "Explain the architecture of this codebase" --output-format json
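
In a script, the answer text can then be extracted with a JSON tool such as jq; the response field name below is an assumption, so inspect the actual output from your version first:

# ".response" is an assumed field name; adjust to the real schema
costrict-cli -p "Explain the architecture of this codebase" --output-format json \
  | jq -r '.response'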

For real-time event streaming (useful for monitoring long-running operations), use --output-format stream-json to get newline-delimited JSON events:

costrict-cli -p "Run tests and deploy" --output-format stream-json
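
A sketch of consuming that stream line by line; each line is one JSON event whose exact shape varies, so this simply compacts and prints every event:

costrict-cli -p "Run tests and deploy" --output-format stream-json |
while IFS= read -r line; do
  # One JSON event per line; print it compactly
  printf '%s\n' "$line" | jq -c '.'
done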

Quick Examples

Start a new project

cd new-project/
costrict-cli
> Write me a Discord bot that answers questions using a FAQ.md file I will provide

Analyze existing code

git clone https://github.com/zgsm-ai/costrict-cli
cd costrict-cli
costrict-cli
> Give me a summary of all of the changes that went in yesterday

📚 Documentation

Getting Started

Core Features

Tools & Extensions

Advanced Topics

Troubleshooting & Support

  • Troubleshooting Guide - Common issues and solutions.
  • FAQ - Frequently asked questions.
  • Use the /bug command to report issues directly from the CLI.

Using MCP Servers

Configure MCP servers in ~/.costrict/settings.json to extend CoStrict CLI with custom tools:

> @github List my open pull requests
> @slack Send a summary of today's commits to #dev channel
> @database Run a query to find inactive users
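
A concrete mcpServers block appears in the example settings.json under Configuration File Locations below.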

See the MCP Server Integration guide for setup instructions.

🛠️ Local Development

Setting Up Development Environment

Prerequisites

  • Node.js 20.19.2 or higher (check with node --version)
  • npm package manager
  • Git

Clone and Install

# Clone the repository
git clone https://github.com/zgsm-ai/costrict-cli.git
cd costrict-cli

# Install dependencies
npm install

# Build the project
npm run build

Development Workflow

Build and Link for Local Testing

To test your changes locally without publishing:

# Build the project
npm run build

# Link globally (makes 'costrict-cli' command use your local version)
npm link

# Now run costrict CLI with your changes
costrict-cli

After linking, any costrict-cli command will use your local development version.
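
To confirm the link took effect, check where the command resolves; exact paths differ per machine:

# Should resolve inside your global npm prefix and point at this working copy
which costrict-cli
ls -l "$(npm prefix -g)/bin/costrict-cli"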

Watch Mode for Development

For continuous development with automatic rebuilds:

# Start watch mode (rebuilds on file changes)
npm run build:watch

# In another terminal, test your changes
costrict-cli

Unlink Development Version

When you're done developing and want to use the published version:

# Unlink local version
npm unlink -g @zgsm/costrict-cli

# Install published version
npm install -g @zgsm/costrict-cli

Project Structure

costrict-cli/
├── packages/
│   ├── cli/              # CLI application (entry point)
│   │   ├── src/
│   │   │   └── gemini.tsx  # Main CLI component
│   │   └── package.json
│   └── core/             # Core functionality
│       ├── src/
│       │   ├── custom_llm/  # Custom LLM integration
│       │   ├── tools/       # Built-in tools
│       │   └── types/       # TypeScript types
│       └── package.json
├── docs/                 # Documentation
├── tests/               # Test files
└── package.json         # Root package.json

Running Tests

# Run all tests
npm test

# Run tests in watch mode
npm run test:watch

# Run specific test file
npm test -- converter.test.ts

# Run tests with coverage
npm run test:coverage

Testing Custom LLM Changes

When developing custom LLM integration:

# Set up environment variables
export USE_CUSTOM_LLM=true
export CUSTOM_LLM_API_KEY="your-test-key"
export CUSTOM_LLM_ENDPOINT="http://localhost:11434/v1"
export CUSTOM_LLM_MODEL_NAME="test-model"

# Enable debug logging
export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-dev.log

# Run your local build
npm run build && costrict-cli

# Check debug logs
tail -f /tmp/costrict-dev.log

Common Development Tasks

Adding a New Feature

  1. Create a feature branch

git checkout -b feature/my-new-feature

  2. Make your changes in the appropriate package (packages/cli or packages/core)

  3. Build and test

npm run build
npm test
npm link  # Test locally
costrict-cli    # Try your feature

  4. Commit and push

git add .
git commit -m "feat: add new feature"
git push origin feature/my-new-feature

Debugging Tips

Enable verbose logging:

export COSTRICT_DEBUG_LOG_FILE=/tmp/costrict-debug.log
export DEBUG=*
costrict-cli

Use Node.js debugger:

# Add debugger statement in your code
node --inspect-brk $(which costrict-cli)
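
Then attach a debugger, for example via chrome://inspect in Chrome or your editor's Node.js attach configuration; with --inspect-brk, execution pauses before the first line until a debugger connects.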

Check TypeScript types:

npm run check-types

Code Quality

# Run linter
npm run lint

# Auto-fix linting issues
npm run lint:fix

# Format code
npm run format

Configuration File Locations

During development, CoStrict CLI uses these locations:

  • Settings: ~/.costrict/settings.json
  • Cache: ~/.costrict/cache/
  • Checkpoints: ~/.costrict/checkpoints/
  • Logs: Location specified by COSTRICT_DEBUG_LOG_FILE environment variable

You can modify settings.json to configure your development instance:

{
  "useCustomLLM": true,
  "customLLM": {
    "endpoint": "http://localhost:11434/v1",
    "modelName": "llama3",
    "apiKey": "dev-key"
  },
  "telemetry": false
}

Helpful Development Commands

# Clean build artifacts
npm run clean

# Full clean and rebuild
npm run clean && npm install && npm run build

# Check for outdated dependencies
npm outdated

# Update dependencies
npm update

# Publish to npm (maintainers only)
npm publish

Working with Monorepo Packages

This project uses a monorepo structure with multiple packages:

# Install dependencies for all packages
npm install

# Build specific package
npm run build --workspace=packages/core

# Run command in specific package
npm run test --workspace=packages/cli

Tips for Contributing

  1. Always build before testing: npm run build
  2. Use npm link for local testing: Faster than reinstalling
  3. Enable debug logging: Helps track down issues
  4. Test with multiple providers: Custom LLM, Gemini API, OAuth
  5. Check TypeScript types: npm run check-types before committing
  6. Write tests: Add tests for new features in tests/

See the Local Development Guide for more advanced topics.

Package Publishing (For Maintainers)

Step 1: Prepare the Release Package

This will automatically execute the build and precompilation process:

npm run prepare:package

What it does:

  1. Copy files to subpackages (core, cli)
  2. Run npm run build --workspaces - Build all subpackages first to ensure latest versions
  3. Run npm run bundle - Build main package (referencing the latest subpackages)
  4. Run npm run prebuild - Prepare precompiled binary files

Step 2: Verify Package Contents

# Test packaging (won't create actual files)
npm pack --dry-run

# Actually package
npm pack

Check the generated .tgz file:

tar -tzf zgsm-costrict-cli-0.0.2.tgz | grep -E "(bundle|prebuilds|README|LICENSE)"

You should see:

  • package/bundle/gemini.js - Main program
  • package/prebuilds/darwin-arm64/ - macOS ARM precompiled files
  • package/prebuilds/darwin-x64/ - macOS Intel precompiled files
  • package/prebuilds/linux-x64/ - Linux precompiled files
  • package/prebuilds/win32-x64/ - Windows precompiled files
  • package/README.md - Documentation
  • package/LICENSE - License

Step 3: Publish to npm

cd costrict-cli

# Login to npm (if not already logged in)
npm whoami || npm login

# Simulate publishing (won't actually publish)
npm publish --access public --dry-run

# After confirming everything is correct, publish for real
npm publish --access public

Notes:

  • Scoped packages (@zgsm/xxx) require --access public to be publicly published
  • Ensure you have publishing permissions for the @zgsm scope
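
Alternatively, the access level can be recorded in the root package.json via npm's standard publishConfig field, so a plain npm publish behaves the same way:

{
  "publishConfig": {
    "access": "public"
  }
}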

Subpackage Dependency Management

Problem Description

In the monorepo structure:

  • The main package (root directory's package.json) is bundled into bundle/gemini.js via esbuild
  • During bundling, it references subpackages like @zgsm/costrict-cli-core, @zgsm/costrict-cli, etc.
  • Must ensure that the bundling references the latest built subpackage code

Solution

prepare-package.js is configured to execute in the following order:

// 1. First build all subpackages
npm run build --workspaces

// 2. Then build the main package (now referencing the latest subpackages)
npm run bundle

// 3. Finally prepare precompiled files
npm run prebuild

Verification Method

Check that the build order is correct:

npm run prepare:package 2>&1 | grep "Building"

# Should see output indicating:
# 1️⃣ Building all subpackages...
# ✓ Subpackages built
# 2️⃣ Building main package...
# ✓ Main package built
# 3️⃣ Preparing precompiled binaries...

Manual Verification

If you need to manually verify that subpackages are up-to-date:

# 1. Check subpackage build time
ls -la packages/core/dist/
ls -la packages/cli/dist/

# 2. Check main package build time
ls -la bundle/

# bundle/ modification time should be later than packages/*/dist/

🤝 Contributing

We welcome contributions! CoStrict CLI is fully open source (Apache 2.0), and we encourage the community to:

  • Report bugs and suggest features.
  • Improve documentation.
  • Submit code improvements.
  • Share your MCP servers and extensions.

See our Contributing Guide for development setup, coding standards, and how to submit pull requests.

📖 Resources

Uninstall

See the Uninstall Guide for removal instructions.

📂 Configuration File Locations

CoStrict CLI stores its configuration and data in the following locations:

| Type            | Location                        | Description                                              |
| --------------- | ------------------------------- | -------------------------------------------------------- |
| Settings        | ~/.costrict/settings.json       | Main configuration file (auth, custom LLM, preferences)  |
| Cache           | ~/.costrict/cache/              | Token cache and temporary data                           |
| Checkpoints     | ~/.costrict/checkpoints/        | Saved conversation sessions                              |
| MCP Config      | ~/.costrict/settings.json       | MCP server configurations (in settings file)             |
| Debug Logs      | Set via COSTRICT_DEBUG_LOG_FILE | Development and troubleshooting logs                     |
| Project Context | ./COSTRICT.md                   | Per-project instructions (in your project root)          |

Example settings.json

{
  "authMethod": "oauth",
  "useCustomLLM": false,
  "customLLM": {
    "apiKey": "",
    "endpoint": "https://api.openai.com/v1",
    "modelName": "gpt-4"
  },
  "telemetry": true,
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"]
    }
  }
}

Note: Environment variables (e.g., USE_CUSTOM_LLM, CUSTOM_LLM_API_KEY) take precedence over values in settings.json.

📄 Legal