
coxy

v0.0.4

Published

[Coxy logo]

Downloads

30

Readme

Why?

  • You have a lot of free GitHub Copilot quota and want to use it through OpenAI-compatible APIs.
  • You want the computing power of GitHub Copilot beyond VS Code.
  • You want to use modern models like gpt-4.1 for free.
  • You have multiple GitHub accounts whose free quota would otherwise go to waste.
  • You want to host an LLM endpoint locally while the computation happens remotely.

Features

  • Proxies requests to https://api.githubcopilot.com
    • Supported endpoints: /chat/completions, /models
  • User-friendly admin UI:
    • Log in with GitHub and generate tokens
    • Add tokens manually
    • Manage multiple tokens with ease
    • View chat message usage statistics
    • Simple chat bot for model evaluation
      • Client-side chat session history

How to use

  • Start the proxy server
    • Option 1: Use Docker
      docker run -p 3000:3000 ghcr.io/coxy-proxy/coxy:latest
    • Option 2: Use pnpx (recommended) or npx
      pnpx coxy
  • Browse http://localhost:3000 and follow the instructions to generate a token.
    • Or add your own token manually.
  • Set a default token.
  • Your OpenAI-compatible API base URL is http://localhost:3000/api
    • You can test it like this (no Authorization header needed, since you've set a default token):
    curl --request POST --url http://localhost:3000/api/chat/completions --header 'content-type: application/json' \
    --data '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hi"}]
    }'
    • You can still set a token in the authorization: Bearer <token> request header; it will override the default token.
  • (Optional) Use the PORT environment variable to listen on a port other than 3000.
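The curl test above can be reproduced from Python with only the standard library. This is a sketch under the same assumptions (default port, default token already set in the admin UI):

```python
import json
import urllib.request

def build_chat_request(base_url: str = "http://localhost:3000/api") -> urllib.request.Request:
    """Build a POST to /chat/completions, mirroring the curl example.
    No Authorization header is attached, relying on the default token."""
    payload = {
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hi"}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"content-type": "application/json"},
        method="POST",
    )

req = build_chat_request()
# To send it (requires the proxy to be running):
#   with urllib.request.urlopen(req) as resp:
#       reply = json.load(resp)
#       print(reply["choices"][0]["message"]["content"])
```

To override the default token for a single request, add an "authorization": "Bearer <token>" entry to the headers dict.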

Available environment variables

  • PORT: Port number to listen on (default: 3000)
  • LOG_LEVEL: Log level of pino (default: info)
  • DATABASE_URL: Database URL for Prisma (currently only supports sqlite). Should start with file:. (default: file:../coxy.db)
    • The relative path will be resolved to the absolute path at runtime.
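Putting the variables above together, a .env file for a typical local setup could look like this (all values shown are the documented defaults, given here only as an illustration):

```
# Sample .env for coxy (values are the defaults)
PORT=3000
LOG_LEVEL=info
DATABASE_URL=file:../coxy.db
```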

Advanced usage

  • Dummy token _ makes coxy use the default token.
    • In most cases the default token just works without an Authorization header. But if your LLM client requires a non-empty API key, you can pass the special dummy token _ to make coxy use the default token.
  • Provisioning: launch the CLI with --provision to force-initialize the database schema via Prisma.
  • Tips for using Docker:
    • Mount the sqlite db file from the host to persist tokens, and use a .env file to set environment variables. Pass --provision the first time to initialize the database schema via Prisma, e.g.
      docker run -p 3000:3000 -v /path/to/sqlite.db:/app/coxy.db -v /path/to/.env:/app/.env ghcr.io/coxy-proxy/coxy:latest --provision

Troubleshooting

  • I got status code 400 when using the GPT-5-mini model.
    • Make sure you have enabled "OpenAI GPT-5 mini" in your GitHub account's Copilot settings. The URL looks like https://github.com/settings/copilot/features.
  • I got the error Fail to load API keys when opening the API Keys page.
    • Make sure the sqlite db file exists and has been provisioned correctly.
  • I'm using podman and cannot access the site via localhost:3000, but 127.0.0.1:3000 works.
    • This appears to be a known podman issue. Use 127.0.0.1:3000 instead, or map another hostname, like 127.0.0.1 coxy, in your /etc/hosts file.

Use cases

Requirements

  • Node.js 22 or higher

References

  • https://www.npmjs.com/package/@github/copilot-language-server
  • https://github.com/B00TK1D/copilot-api
  • https://github.com/ericc-ch/copilot-api
  • https://hub.docker.com/r/mouxan/copilot

Licensed under the MIT License.