
copilot-proxy

v1.3.3


HTTP proxy for GitHub Copilot API


Copilot Proxy

A simple HTTP proxy that exposes your GitHub Copilot free quota as an OpenAI-compatible API.

Why?

  • You have plenty of free GitHub Copilot quota and want to use it through an OpenAI-compatible API.
  • You want the computing power of GitHub Copilot beyond VS Code.
  • You want to use modern models like gpt-4.1 for free.
  • You have multiple GitHub accounts whose free quota would otherwise go to waste.
  • You want to host your LLM client locally while the computation runs remotely.

Features

  • Proxies requests to https://api.githubcopilot.com
    • Supported endpoints: /chat/completions, /models
  • User-friendly admin UI:
    • Log in with GitHub and generate tokens
    • Add tokens manually
    • Manage multiple tokens with ease
    • View chat message and code completion usage statistics
  • Supports Langfuse for LLM observability
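The proxied endpoints above can be exercised like any OpenAI-compatible API. As a quick check, here is a minimal Python-stdlib sketch for listing models (the helper name is mine, not part of the package; it assumes the proxy is running on localhost:3000):

```python
import json
import urllib.request

PROXY_BASE = "http://localhost:3000/api"  # copilot-proxy's OpenAI-compatible base URL

def models_request(token=None):
    """Build a GET /models request. The Bearer token is optional:
    without it, copilot-proxy falls back to the configured default token."""
    headers = {}
    if token:
        headers["authorization"] = f"Bearer {token}"
    return urllib.request.Request(f"{PROXY_BASE}/models", headers=headers)

# With the proxy running, this prints the model ids exposed by Copilot:
# with urllib.request.urlopen(models_request()) as resp:
#     print([m["id"] for m in json.load(resp)["data"]])
```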

How to use

  • Start the proxy server
    • Option 1: Use Docker
      docker run -p 3000:3000 ghcr.io/hankchiutw/copilot-proxy:latest
    • Option 2: Use pnpx (recommended) or npx
      pnpx copilot-proxy
  • Browse to http://localhost:3000 and follow the instructions to generate a token.
    • Or add your own token manually.
  • Set a default token.
  • Your OpenAI-compatible API base URL is http://localhost:3000/api
    • You can test it like this (no Authorization header needed, since a default token is set):
    curl --request POST --url http://localhost:3000/api/chat/completions --header 'content-type: application/json' \
    --data '{
        "model": "gpt-4",
        "messages": [{"role": "user", "content": "Hi"}]
    }'
    • You can still set a token in the authorization: Bearer <token> request header; it overrides the default token.
  • (Optional) Use the PORT environment variable to listen on a port other than 3000.
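The curl call above can be reproduced from any language. A minimal Python-stdlib sketch (the function name is illustrative; the proxy is assumed to be running on localhost:3000 with a default token set):

```python
import json
import urllib.request

PROXY_BASE = "http://localhost:3000/api"

def chat_request(prompt, model="gpt-4", token=None):
    """Build a POST /chat/completions request. The Authorization header
    is optional: omitted, copilot-proxy uses the default token; supplied,
    it overrides the default."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    headers = {"content-type": "application/json"}
    if token:
        headers["authorization"] = f"Bearer {token}"
    return urllib.request.Request(
        f"{PROXY_BASE}/chat/completions", data=body, headers=headers
    )

# With the proxy running:
# with urllib.request.urlopen(chat_request("Hi")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```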

Available environment variables

  • PORT: Port number to listen on (default: 3000)
  • LOG_LEVEL: Log level (default: info)
  • STORAGE_DIR: Directory to store tokens (default: .storage)
    • Be sure to back up this directory if you want to keep your tokens.
    • Note: even if you delete the storage directory, the tokens remain functional on GitHub Copilot's side. (That is how GitHub Copilot works at the moment.)
  • Langfuse is supported for LLM observability; see the official Langfuse documentation for details.
    • LANGFUSE_SECRET_KEY: Langfuse secret key
    • LANGFUSE_PUBLIC_KEY: Langfuse public key
    • LANGFUSE_BASEURL: Langfuse base URL (default: https://cloud.langfuse.com)
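Putting these together, a sample .env file might look like this (all values are illustrative; the Langfuse keys are placeholders):

```
PORT=3000
LOG_LEVEL=debug
STORAGE_DIR=.storage
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_BASEURL=https://cloud.langfuse.com
```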

Advanced usage

  • Dummy token _ makes copilot-proxy use the default token.
    • In most cases, the default token just works without an Authorization header. But if your LLM client requires a non-empty API key, you can pass the special dummy token _ and copilot-proxy will fall back to the default token.
  • Tips for using docker:
    • Mount the storage folder from the host to persist tokens, and use a .env file to set environment variables
      docker run -p 3000:3000 -v /path/to/storage:/app/.storage -v /path/to/.env:/app/.env ghcr.io/hankchiutw/copilot-proxy:latest
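The same setup can be expressed as a Docker Compose service. This is a sketch, not shipped with the project; the host paths are illustrative:

```yaml
services:
  copilot-proxy:
    image: ghcr.io/hankchiutw/copilot-proxy:latest
    ports:
      - "3000:3000"
    env_file: .env
    volumes:
      - ./storage:/app/.storage  # persists tokens across container restarts
```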

Use cases

Requirements

  • Node.js 22 or higher

References

  • https://www.npmjs.com/package/@github/copilot-language-server
  • https://github.com/B00TK1D/copilot-api
  • https://github.com/ericc-ch/copilot-api
  • https://hub.docker.com/r/mouxan/copilot

Licensed under the MIT License.