
@shunirr/cc-glm

v0.1.5

Claude Code proxy for switching between Anthropic API and z.ai GLM

cc-glm

Claude Code proxy for routing requests between Anthropic API and z.ai GLM.

Features

  • Configurable model routing: Route requests to different upstreams based on model name patterns with glob matching
  • Model name rewriting: Transparently rewrite model names (e.g., claude-sonnet-* → GLM-4.7)
  • Thinking block transformation: Convert z.ai thinking blocks to text blocks to avoid Anthropic signature validation issues
  • Singleton proxy: One proxy instance shared across multiple Claude Code sessions
  • Lifecycle management: Proxy starts/stops automatically with Claude Code
  • YAML configuration: Config file with ${VAR:-default} environment variable expansion
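The `${VAR:-default}` expansion above follows shell-style syntax. A minimal sketch of how such expansion might be implemented (this is an illustrative helper, not the package's actual code):

```typescript
// Expand ${NAME} and ${NAME:-fallback} placeholders using the given
// environment map. Unset variables resolve to the fallback, or "" if none.
function expandEnvVars(
  input: string,
  env: Record<string, string | undefined>,
): string {
  return input.replace(
    /\$\{([A-Za-z_][A-Za-z0-9_]*)(?::-([^}]*))?\}/g,
    (_match, name: string, fallback: string | undefined) =>
      env[name] ?? fallback ?? "",
  );
}
```

For example, `expandEnvVars("${TMPDIR:-/tmp}/claude-code-proxy", {})` yields `/tmp/claude-code-proxy` when `TMPDIR` is unset.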

Prerequisites

  • Node.js >= 18
  • Claude Code CLI installed and available in PATH
  • z.ai API key (ZAI_API_KEY env var or config file) if routing to z.ai

Installation

npm install -g @shunirr/cc-glm

Or use with npx:

npx @shunirr/cc-glm

Usage

Use cc-glm as a drop-in replacement for claude:

# Start Claude Code through the proxy
cc-glm

# Pass arguments to Claude Code
cc-glm -c
cc-glm -p "PROMPT"

The proxy automatically:

  1. Starts if not already running (singleton)
  2. Sets ANTHROPIC_BASE_URL to route requests through the proxy
  3. Routes requests based on model name matching rules
  4. Stops when all Claude Code sessions have exited (after a grace period)

Configuration

Create ~/.config/cc-glm/config.yml:

# Claude Code CLI command path (empty = auto-detect from PATH)
claude:
  path: ""

proxy:
  port: 8787
  host: "127.0.0.1"

upstream:
  # Anthropic API (OAuth, forwards authorization header as-is)
  anthropic:
    url: "https://api.anthropic.com"

  # z.ai GLM API
  zai:
    url: "https://api.z.ai/api/anthropic"
    apiKey: "YOUR_API_KEY" # Or falls back to ZAI_API_KEY env var

lifecycle:
  stopGraceSeconds: 8
  startWaitSeconds: 8
  stateDir: "${TMPDIR}/claude-code-proxy"

logging:
  level: "info"  # debug, info, warn, error

# Rules are evaluated top-to-bottom, first match wins
routing:
  rules:
    - match: "claude-sonnet-*"
      upstream: zai
      model: "GLM-4.7"

    - match: "claude-haiku-*"
      upstream: zai
      model: "GLM-4.7"

    - match: "glm-*"
      upstream: zai

  default: anthropic

Configuration Options

claude.path

Path to the Claude Code CLI executable. If empty or not specified, cc-glm will auto-detect the command from your PATH using which (Unix/macOS) or where (Windows).

claude:
  path: "/usr/local/bin/claude"  # Custom path
  # or
  path: ""  # Auto-detect (default)

Without a config file, all requests are routed to Anthropic API (OAuth).

Environment Variables

  • ZAI_API_KEY — z.ai API key (used when config apiKey is empty)
  • ANTHROPIC_BASE_URL — Automatically set by cc-glm to point to the proxy

Model Routing

Routing rules use glob patterns (* wildcard) and are evaluated top-to-bottom. The first matching rule wins. Each rule can optionally rewrite the model name sent to the upstream.

| Rule Pattern | Upstream | Model Sent |
|---|---|---|
| claude-sonnet-* | z.ai | GLM-4.7 |
| claude-haiku-* | z.ai | GLM-4.7 |
| glm-* | z.ai | (original) |
| (no match) | Anthropic | (original) |
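The first-match-wins evaluation can be sketched as follows (an illustrative implementation; the rule shape mirrors the YAML config above, but `route` and `globToRegExp` are hypothetical names):

```typescript
interface Rule {
  match: string;    // glob pattern; "*" matches any run of characters
  upstream: string;
  model?: string;   // optional model-name rewrite
}

// Convert a glob pattern to an anchored regular expression.
function globToRegExp(glob: string): RegExp {
  const escaped = glob
    .replace(/[.+^${}()|[\]\\]/g, "\\$&") // escape regex metacharacters
    .replace(/\*/g, ".*");                // then expand the glob wildcard
  return new RegExp(`^${escaped}$`);
}

// Evaluate rules top-to-bottom; the first match wins, else the default.
function route(
  model: string,
  rules: Rule[],
  fallback: string,
): { upstream: string; model: string } {
  for (const rule of rules) {
    if (globToRegExp(rule.match).test(model)) {
      return { upstream: rule.upstream, model: rule.model ?? model };
    }
  }
  return { upstream: fallback, model };
}
```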

How It Works

  1. cc-glm starts a local HTTP proxy at 127.0.0.1:8787 (singleton via atomic lock directory)
  2. Sets ANTHROPIC_BASE_URL so Claude Code sends API requests through the proxy
  3. The proxy extracts the model name from each request body
  4. Routing rules determine the upstream (Anthropic or z.ai) and optional model rewrite
  5. Auth headers are adjusted per upstream:
    • Anthropic: forwards the original OAuth authorization header
    • z.ai: replaces authorization with x-api-key
  6. z.ai responses have their thinking blocks sanitized (invalid signatures removed), and when later sent to Anthropic, z.ai-origin thinking blocks are converted to text blocks to avoid signature validation errors
  7. After Claude Code exits, the proxy waits a grace period (default 8s) and stops if no other sessions remain
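The per-upstream header adjustment in step 5 could be sketched like this (the `authorization` and `x-api-key` header names come from the steps above; the helper itself is illustrative):

```typescript
// Adjust auth headers depending on the chosen upstream:
// - anthropic: the original OAuth authorization header passes through as-is
// - zai: the authorization header is replaced with x-api-key
function adjustAuthHeaders(
  headers: Record<string, string>,
  upstream: "anthropic" | "zai",
  zaiApiKey?: string,
): Record<string, string> {
  const out = { ...headers };
  if (upstream === "zai") {
    delete out["authorization"];                   // drop the OAuth bearer token
    if (zaiApiKey) out["x-api-key"] = zaiApiKey;   // z.ai expects x-api-key
  }
  return out;
}
```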

Thinking Block Transformation

Why Transformation Is Needed

The Anthropic API validates thinking block signatures: each thinking block includes a cryptographic signature proving it was generated by Anthropic. z.ai thinking blocks lack valid Anthropic signatures, so the proxy must handle them differently to avoid API rejection.

Response Sanitization (z.ai → Claude Code)

When the proxy receives a response from z.ai, it sanitizes thinking blocks by removing invalid signature fields and normalizing the format. The signature store records valid Anthropic signatures so the proxy can distinguish Anthropic-origin thinking blocks from z.ai-origin ones.

Request Transformation (Claude Code → Anthropic)

When sending a request to Anthropic that contains thinking blocks from the conversation history, the proxy checks each block's signature against the signature store:

| Origin | Signature | Action |
|--------|-----------|--------|
| Anthropic-generated | Recorded in signature store | Passed through as-is |
| z.ai-generated | Not in signature store | Converted to text block |

The conversion wraps the thinking content in XML tags:

Before (z.ai thinking block):

{
  "type": "thinking",
  "thinking": "This is my reasoning process...",
  "signature": "invalid_signature_xyz"
}

After (converted to text block):

{
  "type": "text",
  "text": "<previous-glm-reasoning>\nThis is my reasoning process...\n</previous-glm-reasoning>"
}

This preserves the reasoning content while avoiding Anthropic's signature validation. The <previous-glm-reasoning> tags clearly mark the content as historical reasoning from z.ai.
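The before/after transformation above can be sketched as a small conversion function (illustrative; the real proxy also consults the signature store before deciding to convert):

```typescript
interface ThinkingBlock { type: "thinking"; thinking: string; signature?: string }
interface TextBlock { type: "text"; text: string }

// Wrap z.ai-origin reasoning in <previous-glm-reasoning> tags inside a
// plain text block, dropping the invalid signature entirely.
function thinkingToText(block: ThinkingBlock): TextBlock {
  return {
    type: "text",
    text: `<previous-glm-reasoning>\n${block.thinking}\n</previous-glm-reasoning>`,
  };
}
```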

Signature Store

The proxy maintains an in-memory signature store to track valid Anthropic signatures. Configure via signature_store in config:

signature_store:
  maxSize: 1000  # Maximum signatures to store (default: 1000, max: 100000)
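A bounded in-memory store with a `maxSize` cap could look like the sketch below. Note the eviction policy here (oldest-first) is an assumption; the README only specifies the size limit:

```typescript
// Bounded set of known-valid Anthropic signatures. When full, the oldest
// entry is evicted (Sets iterate in insertion order), which is an assumed
// policy for illustration.
class SignatureStore {
  private sigs = new Set<string>();
  constructor(private maxSize: number = 1000) {}

  record(signature: string): void {
    if (this.sigs.size >= this.maxSize) {
      const oldest = this.sigs.values().next().value;
      if (oldest !== undefined) this.sigs.delete(oldest);
    }
    this.sigs.add(signature);
  }

  isKnown(signature: string): boolean {
    return this.sigs.has(signature);
  }
}
```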

Development

npm install
npm run build       # Build with tsup
npm run dev         # Build in watch mode
npm run lint        # Type check (tsc --noEmit)
npm test            # Run tests (watch mode)
npm run test:run    # Run tests once

License

MIT