
@janek26/aimux v1.0.2

One local gateway for AI models and MCP servers across coding agents.

Downloads: 332

aimux


AI tools all want their own provider config, model names, MCP servers, and local files. aimux keeps that moving target in one config and exposes it as one local gateway: OpenAI-compatible /v1 for models, plus one muxed MCP endpoint for tools.

Use the same local AI stack across Cursor, Zed, Claude Code, Codex, Gemini CLI, OpenCode, and anything else that speaks OpenAI or MCP.


Features

  • One .aimux.yml for providers, model routes, and MCP servers.
  • One OpenAI-compatible gateway at /v1/models and /v1/chat/completions.
  • One MCP gateway that muxes tools, prompts, and resources from many servers.
  • Client config generation for common AI coding tools.
  • User service management for macOS LaunchAgent and Linux systemd.

Quick Start

bunx @janek26/aimux@latest help

bun install -g @janek26/aimux

aimux init
aimux llm add fallback --name openai --preset openai --token "$OPENAI_API_KEY"
aimux serve --port 8787

After installing globally, use aimux directly. bunx aimux help also works once the global binary is on your PATH.

Then use aimux like an OpenAI-compatible API:

curl http://localhost:8787/v1/models

curl http://localhost:8787/v1/chat/completions \
  -H 'content-type: application/json' \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]}'
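Streaming uses the standard OpenAI request shape; a sketch of the request body, sent with the same curl command as above, assuming the upstream provider supports streaming:

```json
{
  "model": "gpt-4o-mini",
  "stream": true,
  "messages": [{ "role": "user", "content": "Hello" }]
}
```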

init creates .aimux.yml. Config lookup walks up from the current directory, then falls back to ~/.aimux.yml.

MCP

aimux mcp add github \
  --transport stdio \
  --command npx \
  --args "-y,@modelcontextprotocol/server-github" \
  --env "GITHUB_PERSONAL_ACCESS_TOKEN=$GITHUB_PERSONAL_ACCESS_TOKEN"

aimux serve mcp

Remote MCP servers can use OAuth, bearer tokens, custom headers, method filters, and method renames.
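A sketch of what a remote entry might look like in .aimux.yml. Only the capabilities above come from the docs; every field name here except transport is an illustrative assumption, not a documented key:

```yaml
# Sketch only: "url" and "headers" are assumed field names.
mcp:
  remote-search:
    transport: http
    url: https://mcp.example.com/mcp
    headers:
      Authorization: Bearer <TOKEN>
```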

Client Config

aimux generate all

Targets: opencode, cursor, zed, claude-code, codex, and gemini-cli.

Generated client config is local machine state and is ignored by git. Zed currently reads language_models only from ~/.config/zed/settings.json; aimux writes project-local Zed MCP config and asks before updating global Zed model settings.

Service

aimux service enable
aimux service logs
aimux service load ./path/to/.aimux.yml
aimux service restart
aimux service disable
aimux service uninstall

The service runs aimux serve against ~/.aimux.yml. Logs go to ~/Library/Logs/aimux/aimux.log on macOS and ~/.local/state/aimux/aimux.log on Linux.

Config

Do not commit real .aimux.yml files. They can contain API keys, MCP headers, OAuth access tokens, and refresh tokens. Use .aimux.example.yml as a safe starting point.

providers:
  openai-prod:
    preset: openai
    token: <OPENAI_API_KEY>

llm:
  custom/prod:
    provider: openai-prod
    model: gpt-4o
  fallback:
    - provider: openai-prod

mcp:
  local-files:
    transport: stdio
    command: mcp-server-filesystem
    args: ["."]
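The fallback route above lists a single provider; as a route list, it suggests entries are tried in order. A hedged sketch with a second provider; the name anthropic-backup and the anthropic preset are illustrative assumptions (Anthropic support is noted under Compatibility, but the preset name is not confirmed here):

```yaml
# Sketch only: "anthropic-backup" and "preset: anthropic" are assumptions.
providers:
  openai-prod:
    preset: openai
    token: <OPENAI_API_KEY>
  anthropic-backup:
    preset: anthropic
    token: <ANTHROPIC_API_KEY>

llm:
  fallback:
    - provider: openai-prod
    - provider: anthropic-backup
```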

Compatibility

OpenAI-compatible providers are proxied without rewriting request or response bodies except for configured model remapping. Streaming, tool calls, structured outputs, image inputs, and provider-specific fields pass through when the upstream supports them.
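For example, a standard OpenAI tools request should pass through unmodified apart from any configured model remapping. A sketch with a hypothetical get_weather function (the tool definition is illustrative, not part of aimux):

```json
{
  "model": "gpt-4o-mini",
  "messages": [{ "role": "user", "content": "What's the weather in Paris?" }],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Look up current weather for a city",
        "parameters": {
          "type": "object",
          "properties": { "city": { "type": "string" } },
          "required": ["city"]
        }
      }
    }
  ]
}
```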

Anthropic providers are adapted through the Messages API for basic chat-completion compatibility. Full Anthropic multimodal and tool-use normalization is future work.

Development

bun install
bun run check
bun run build

Releases are published exclusively by GitHub Actions through semantic-release. Add NPM_TOKEN to the repository secrets before pushing a conventional commit to main.

See docs/PROJECT.md for architecture notes.

License

MIT