
ai-sdk-zai-provider

v0.1.1 · 7 downloads

Drop-in Z.AI/GLM provider for the Vercel AI SDK powered by ai-sdk-provider-claude-code with automatic env + MCP wiring.

ai-sdk-zai-provider

Expert-friendly Vercel AI SDK provider for Z.AI / GLM. It reuses the hardened ai-sdk-provider-claude-code transport, patches the Anthropic env vars that Z.AI expects, and wires the documented MCP servers automatically.

Capabilities

  • Claude Code CLI (zaiClaudeCode / createZaiClaudeCode): drop-in CLI parity with web-search, reader/WebFetch, and vision MCP servers enabled, including guardrails for custom tools.
  • Anthropic HTTP (zaiAnthropic / createZaiAnthropic): deterministic HTTP surface for custom tools, hosted agents, and CI harnesses.
  • Force-only custom tools: forceCustomTools helper simulates the assistant tool-call flow so the CLI only sees your tool.
  • Typed options: env merging, model remaps, MCP/vision process control, and permission defaults are all surfaced via TypeScript.

Install

npm install ai-sdk-zai-provider ai
# ensure ZAI_API_KEY is set in your shell or .env

Usage

Claude Code + MCP flows

import { generateText } from 'ai';
import { zaiClaudeCode } from 'ai-sdk-zai-provider';

const result = await generateText({
  model: zaiClaudeCode('glm-4.6'),
  prompt: 'List three threat-modeling checks we should automate.',
});

HTTP + deterministic custom tools

import { streamText } from 'ai';
import { zaiAnthropic } from 'ai-sdk-zai-provider';

const result = streamText({
  model: zaiAnthropic('glm-4.6'),
  tools: { /* Anthropic-style tool map */ },
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Summarize the latest GLM release.' },
  ],
});

Pass GLM SKUs (glm-4.6, glm-4.5-air, etc.). Until Z.AI exposes the spec v2 HTTP surface, the provider rewrites them to a Claude-compatible fallback id. Override the fallback via the glmFallbackModel option or the ZAI_HTTP_FALLBACK_MODEL env var, or set glmFallbackModel to false once Z.AI supports GLM ids natively.
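That precedence (explicit option, then env var, then built-in default) can be sketched as a pure function. `resolveHttpModelId` and the placeholder default id are illustrative, not package exports, and the real provider reads `process.env` itself rather than taking an `env` argument:

```typescript
// Illustrative sketch of the fallback remap described above; the actual
// provider internals may differ.
const DEFAULT_FALLBACK = 'claude-fallback-id'; // assumed placeholder, not a real model id

function resolveHttpModelId(
  requested: string,
  glmFallbackModel?: string | false,
  env: Record<string, string | undefined> = {},
): string {
  // `false` disables the rewrite entirely (use once Z.AI accepts GLM ids).
  if (glmFallbackModel === false) return requested;
  // Only GLM SKUs are rewritten; anything else passes through untouched.
  if (!requested.startsWith('glm-')) return requested;
  // Explicit option wins, then the env var, then the baked-in default.
  return glmFallbackModel ?? env.ZAI_HTTP_FALLBACK_MODEL ?? DEFAULT_FALLBACK;
}

resolveHttpModelId('glm-4.6', false);             // 'glm-4.6' (no rewrite)
resolveHttpModelId('glm-4.5-air', 'my-fallback'); // 'my-fallback'
```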

createZaiClaudeCode overview

import { createZaiClaudeCode } from 'ai-sdk-zai-provider';

const provider = createZaiClaudeCode({
  anthropicBaseUrl: 'https://api.z.ai/api/anthropic',
  includeDefaultAllowedTools: true,
  enableWebSearchMcp: true,
  enableWebReaderMcp: true,
  enableVisionMcp: true,
  customToolsOnly: {
    allowedTools: ['echo_tool'],
    appendSystemPrompt: 'Do not call Bash/Task.',
  },
});

const model = provider('glm-4.6');

Every model inherits:

  • Z.AI env vars (ANTHROPIC_*, API_TIMEOUT_MS, per-alias mappings).
  • Claude Code default tools + optional MCP tool ids (mcp__web-search-prime__webSearchPrime, mcp__web-reader__webReader, WebFetch, mcp__zai-vision__*).
  • MCP definitions pointing at the Z.AI endpoints or the stdio @z_ai/mcp-server. Override visionCommand to use a different runner.
  • permissionMode: 'bypassPermissions' unless overridden.
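The inherited env defaults merge with whatever you pass; a minimal sketch of that precedence, assuming caller-supplied values win over the Z.AI defaults (`mergeZaiEnv` and the timeout value are illustrative, not package exports):

```typescript
// Z.AI defaults as described above; the base URL matches the documented
// anthropicBaseUrl, the timeout is an assumed placeholder.
const ZAI_DEFAULT_ENV: Record<string, string> = {
  ANTHROPIC_BASE_URL: 'https://api.z.ai/api/anthropic',
  API_TIMEOUT_MS: '300000',
};

// Caller overrides are spread last, so they win over the defaults.
function mergeZaiEnv(
  overrides: Record<string, string> = {},
): Record<string, string> {
  return { ...ZAI_DEFAULT_ENV, ...overrides };
}

const env = mergeZaiEnv({ API_TIMEOUT_MS: '60000' });
// env.API_TIMEOUT_MS === '60000', env.ANTHROPIC_BASE_URL keeps the default
```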

Streaming & telemetry parity

  • Emits the same stream structure (start, text-delta, tool-input-*, tool-call, tool-result, finish).
  • result.usage and result.providerMetadata['claude-code'] stay intact for billing/analytics.
  • HTTP provider mirrors CLI blocks so swapping surfaces doesn’t require schema changes.
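Swapping surfaces relies on those part types staying stable. A minimal sketch of consuming them, with simplified mock shapes (real parts come from the AI SDK's `result.fullStream` and carry more fields than shown here):

```typescript
// Simplified union over the part types listed above.
type StreamPart =
  | { type: 'start' }
  | { type: 'text-delta'; text: string }
  | { type: 'tool-call'; toolName: string }
  | { type: 'tool-result'; toolName: string }
  | { type: 'finish' };

// Accumulate only the text deltas, ignoring lifecycle and tool parts.
function collectText(parts: StreamPart[]): string {
  let text = '';
  for (const part of parts) {
    if (part.type === 'text-delta') text += part.text;
  }
  return text;
}

const mock: StreamPart[] = [
  { type: 'start' },
  { type: 'text-delta', text: 'Hello ' },
  { type: 'text-delta', text: 'world' },
  { type: 'finish' },
];
// collectText(mock) === 'Hello world'
```

Because both surfaces emit the same part types, this consumer works unchanged whether the model came from zaiClaudeCode or zaiAnthropic.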

Tests

npm test                 # unit + mocked integration suites (requires ZAI_API_KEY)
npm run test:integration # focus on the integration spec

Coverage includes env merging, MCP wiring, deterministic custom tools via HTTP, and stream-shape validation. Set ZAI_API_KEY (and optionally ZAI_HTTP_FALLBACK_MODEL if you need a different Claude-compatible fallback). All integration specs run against local fixtures—no flaky external dependencies.

Custom tools

HTTP (recommended)

import { z } from 'zod';
import { streamText } from 'ai';
import { zaiAnthropic } from 'ai-sdk-zai-provider';

const tools = {
  repo_search: {
    description: 'Search GitHub repositories for a keyword.',
    parameters: z.object({ query: z.string() }),
    execute: async ({ query }) => {
      const res = await fetch(`https://api.github.com/search/repositories?q=${encodeURIComponent(query)}&per_page=3`);
      const data = await res.json();
      return data.items.map((repo) => ({ name: repo.full_name, stars: repo.stargazers_count }));
    },
  },
};

const result = streamText({
  model: zaiAnthropic('glm-4.6'),
  tools,
  messages: [
    { role: 'system', content: 'Use repo_search exactly once before answering.' },
    { role: 'user', content: 'Find popular TypeScript AI SDK repositories.' },
  ],
});

Claude Code guardrail mode

import { createZaiClaudeCode, forceCustomTools } from 'ai-sdk-zai-provider';
import { z } from 'zod';

const provider = createZaiClaudeCode({
  customToolsOnly: { allowedTools: ['echo_tool'] },
  defaultSettings: { allowedTools: ['echo_tool'] },
});

const tools = {
  echo_tool: {
    description: 'Echo text back for verification.',
    parameters: z.object({ text: z.string() }),
    execute: async ({ text }, options) => ({ echoed: text, callId: options.toolCallId }),
  },
};

await forceCustomTools({
  model: provider('glm-4.6'),
  tools,
  toolName: 'echo_tool',
  toolInput: { text: 'custom_check' },
  messages: [{ role: 'user', content: 'Confirm the custom tool output.' }],
});

forceCustomTools runs your tool once, injects the tool-call/tool-result, and hands control back to the CLI model with Bash/Task disabled.

License

MIT © 2025