neurospark-coder (v1.2.12)

Run Claude Code with NeuroSpark and other OpenAI-compatible providers.

Use Claude Code with OpenAI, Google, xAI, and other providers.

  • Extremely simple setup - just a basic command wrapper
  • Uses the AI SDK for simple support of new providers
  • Works with Claude Code GitHub Actions
  • Optimized for OpenAI's gpt-5 series

Get Started

# Use your favorite package manager (bun, pnpm, and npm are supported)
$ pnpm install -g neurospark-coder

# neurospark-coder is a wrapper for the Claude CLI
# `neurospark/`, `google/`, `xai/`, and `anthropic/` are supported
$ neurospark-coder --model neurospark/GLM5-FP8

Switch models in the Claude UI with /model neurospark/GLM5-FP8 (or /model neurospark/GLM5.1-FP8 for the 5.1 checkpoint).

Bundled Skills

neurospark-coder also ships a set of sanitized, reusable skills alongside the CLI. List the packaged skills with:

neurospark-coder --list-skills

Install them into the default local skills directory:

neurospark-coder --install-skills

By default this installs into an NScoder-skills folder in your current working directory. To install somewhere else:

neurospark-coder --install-skills --skills-dir /path/to/skills

Existing skill folders are left untouched unless you explicitly ask to replace them:

neurospark-coder --install-skills --force

The bundled FHW skills in this repo are intentionally sanitized; you still need to fill in your own internal hostnames, credentials, and foundry/library paths before using these workflows in a real lab environment.

Using the Skills

neurospark-coder can distribute a set of reusable skills alongside the CLI. These skills are mainly intended for hardware-design workflows, for example:

  • ssh-server
  • compile-design
  • design-lint
  • run-sim
  • genus-synthesis

First, list the currently packaged skills:

neurospark-coder --list-skills

Install the skills into the default NScoder-skills folder under the current directory:

neurospark-coder --install-skills

To install into a specific directory instead:

neurospark-coder --install-skills --skills-dir /path/to/skills

If a skill with the same name already exists in the target directory, it is skipped by default rather than overwritten. To force an overwrite:

neurospark-coder --install-skills --force

How to use these skills

After installation, the target directory contains one folder per skill, for example:

NScoder-skills/
├── ssh-server/
├── compile-design/
├── design-lint/
├── run-sim/
└── genus-synthesis/

Each skill directory contains at least a SKILL.md file. Read it first to understand which tasks the skill is for, which input parameters it needs, and its default workflow.

For example:

sed -n '1,120p' NScoder-skills/genus-synthesis/SKILL.md

The typical way to use these skills is to describe your task directly in a Codex or Claude Code conversation and name the skill you want to use. The model then follows the instructions in SKILL.md.

For example:

Use the genus-synthesis skill to run Genus synthesis for this design on the remote server.
Use the design-lint skill: first find the lint flow this repo actually uses, then check the RTL I just modified.
Use the run-sim skill: first search for the top testbench this project actually uses; if there are multiple candidates, list them and let me choose.

Recommended workflow:

  1. Run --list-skills to confirm which skills are available.
  2. Run --install-skills to install the skills into a local directory.
  3. Open the target skill's SKILL.md and confirm the inputs it requires.
  4. In the conversation, state explicitly which skill to use and provide the repo path, remote path, top module, testbench, tool paths, and any other required information.
  5. For skills that involve a remote server, first fill in your own host address, username, authentication method, library paths, and other environment settings.

These generalized hardware skills do not blindly assume design names, top modules, testbenches, or script paths; they first search the current repo for the real entry points, and if multiple candidates exist they should list them for you to confirm.

The skills bundled in this repository have been sanitized and contain no real internal network addresses, account credentials, EDA licenses, or foundry library paths. To use them in a real environment, first add these settings for your own lab or server setup.

One-Command Install

For a private GitHub repo with a public customer install flow, publish the neurospark-coder package to the public npm registry and host scripts/install-neurospark-coder.sh at a stable public URL such as https://install.neuro-spark.ai/install.sh.

Then customers can install both Claude Code and neurospark-coder with:

curl -fsSL https://install.neuro-spark.ai/install.sh | bash

By default, the installer runs Anthropic's official Claude Code installer and then:

npm install -g neurospark-coder

If you ever need to install from a tarball or alternate package source, set PACKAGE_SOURCE:

PACKAGE_SOURCE=https://your-download-url/neurospark-coder-1.2.2.tgz \
curl -fsSL https://install.neuro-spark.ai/install.sh | bash
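The PACKAGE_SOURCE fallback can be sketched as a small shell function. This is illustrative only; the function name is hypothetical and the real install-neurospark-coder.sh script may differ:

```shell
# Minimal sketch of the package-source fallback an installer like this might use.
# resolve_package_spec is a hypothetical helper, not part of the actual script.
resolve_package_spec() {
  if [ -n "${PACKAGE_SOURCE:-}" ]; then
    echo "$PACKAGE_SOURCE"          # tarball URL or alternate package spec
  else
    echo "neurospark-coder@latest"  # default: public npm registry
  fi
}

# The installer would then run: npm install -g "$(resolve_package_spec)"
```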

Publish To npm

This repo includes publish-npm.yml for publishing the package from a private GitHub repository to the public npm registry.

One-time setup

  1. Create an npm access token with publish permissions.
  2. Add it to this GitHub repository as the NPM_TOKEN Actions secret.
  3. Host scripts/install-neurospark-coder.sh at your public install URL.

Release flow

  1. Bump the version in package.json.
  2. Push a tag that matches the package version, for example:

     git tag v1.2.2
     git push origin v1.2.2

  3. GitHub Actions will run bun install, bun run typecheck, bun run build, and npm publish --access public.

After the workflow finishes, new installs and upgrades will use the latest npm package automatically:

npm install -g neurospark-coder@latest

If you prefer to publish manually from a machine that already has npm access:

bun install
bun run typecheck
bun run build
npm publish --access public

Optional GitHub Release Assets

This repo also includes release.yml if you want a GitHub Releases-based distribution path. It is optional for the public npm install flow and is not required for customers using a Cloudflare-hosted (or otherwise hosted) install.sh URL.

Configure API Key

For the NeuroSpark deployment, users only need to set OPENAI_API_KEY. The proxy already defaults to https://api.neuro-spark.ai/v1, so OPENAI_API_URL is optional.

export OPENAI_API_KEY="your-neurospark-api-key"
neurospark-coder --model neurospark/GLM5-FP8

To restrict the proxy to a single NeuroSpark model, set SUPPORTED_MODELS:

export OPENAI_API_KEY="your-neurospark-api-key"
export SUPPORTED_MODELS="neurospark/GLM5-FP8"
neurospark-coder --model neurospark/GLM5-FP8

By default, the proxy exposes both neurospark/GLM5-FP8 and neurospark/GLM5.1-FP8. Users only need to set SUPPORTED_MODELS if they want to override that default.

GPT-5 Support

Use --reasoning-effort (alias: -e) to control OpenAI reasoning.effort. Allowed values: minimal, low, medium, high.

neurospark-coder --model neurospark/GLM5-FP8 -e high

Use --service-tier (alias: -t) to control OpenAI service tier. Allowed values: flex, priority.

neurospark-coder --model neurospark/GLM5-FP8 -t priority

Note: these flags may be extended to other providers in the future.

FAQ


What providers are supported?

See the provider implementations in the source for details.

  • GOOGLE_API_KEY supports google/* models.
  • OPENAI_API_KEY supports neurospark/* models through the OpenAI-compatible adapter.
  • XAI_API_KEY supports xai/* models.

Set a custom OpenAI endpoint with OPENAI_API_URL to use OpenRouter.
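For example (the base URL below is OpenRouter's public OpenAI-compatible endpoint; the key value is a placeholder):

```shell
# Point the OpenAI-compatible adapter at OpenRouter instead of NeuroSpark.
export OPENAI_API_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="your-openrouter-api-key"  # placeholder value
```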

ANTHROPIC_MODEL and ANTHROPIC_SMALL_MODEL are supported with the <provider>/ syntax.
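For example, pinning Claude Code's default and small models to the NeuroSpark model IDs used elsewhere in this README (values illustrative):

```shell
# Both variables use the <provider>/<model> syntax.
export ANTHROPIC_MODEL="neurospark/GLM5-FP8"
export ANTHROPIC_SMALL_MODEL="neurospark/GLM5-FP8"
```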

How do I restrict which models clients can use?

Set SUPPORTED_MODELS to a comma-separated list of full model IDs exposed by the proxy.

SUPPORTED_MODELS=neurospark/GLM5-FP8,neurospark/GLM5.1-FP8

With this set:

  • GET /v1/models returns only the configured models.
  • POST /v1/messages returns a 400 invalid_request_error if model is not in the allowlist.
  • If SUPPORTED_MODELS is unset, the proxy defaults to neurospark/GLM5-FP8 and neurospark/GLM5.1-FP8.
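The allowlist behavior above can be sketched in shell for illustration. The real proxy implements this check in its HTTP layer; the function name and status strings here are hypothetical:

```shell
# Default allowlist when SUPPORTED_MODELS is unset (matches the README).
DEFAULT_MODELS="neurospark/GLM5-FP8,neurospark/GLM5.1-FP8"

# check_model is a hypothetical sketch of the proxy's allowlist check:
# echoes "200 ok" for allowed models, "400 invalid_request_error" otherwise.
check_model() {
  model="$1"
  allowlist="${SUPPORTED_MODELS:-$DEFAULT_MODELS}"
  case ",$allowlist," in
    *",$model,"*) echo "200 ok" ;;
    *)            echo "400 invalid_request_error" ;;
  esac
}
```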

How does this work?

Claude Code has added support for customizing the Anthropic endpoint with ANTHROPIC_BASE_URL.

neurospark-coder spawns a simple HTTP server that translates between Anthropic's API format and the AI SDK format, enabling support for any AI SDK provider (e.g., Google, OpenAI, etc.).

When launching Claude Code, neurospark-coder also sets a placeholder ANTHROPIC_AUTH_TOKEN if you have not already configured ANTHROPIC_API_KEY or ANTHROPIC_AUTH_TOKEN, and defaults CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1. This matches the gateway/proxy auth pattern supported by Claude Code and avoids unnecessary account login prompts for local proxy usage.
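Roughly, the environment neurospark-coder launches Claude Code with looks like the sketch below. The values are illustrative: the proxy chooses its own local port, and the auth token is only set if you have not configured one yourself.

```shell
# Sketch of the launch environment (illustrative values, not the actual ones):
export ANTHROPIC_BASE_URL="http://localhost:PORT"     # PORT is chosen by the local proxy
export ANTHROPIC_AUTH_TOKEN="placeholder"             # only set if you haven't set your own
export CLAUDE_CODE_DISABLE_NONESSENTIAL_TRAFFIC=1     # avoids login prompts / extra traffic
```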