

OCA Proxy (TypeScript)

OpenAI-compatible proxy server for Oracle Code Assist (OCA).

This proxy handles OCI authentication via web-based OAuth flow and exposes standard OpenAI API endpoints, allowing any OpenAI-compatible tool to use OCA backend models.

Note: Requires Node.js 24 LTS (>=24.0.0 <25).

Quick Start

# Run without installing (recommended)
npx oca-proxy

Or install globally from npm and run:

npm install -g oca-proxy
oca-proxy

Git hooks

This repo uses Husky to run checks locally to keep the codebase consistent and healthy.

- Pre-commit: runs Biome autofix (`npm run check`), re-stages changes, then runs `npm run lint` to ensure no remaining issues.
- Pre-push: runs `npm run typecheck` and `npm run build` to catch type errors and build failures before pushing.

Setup:
- Hooks are installed automatically via the `prepare` script when you run `npm install`.
- If hooks are missing, run: `npx husky install`.

Skip hooks temporarily (use sparingly):
- Commit without hooks: `git commit -m "msg" --no-verify`
- Push without hooks: `git push --no-verify`

From Source

cd oca-proxy
npm install
npm run build
npx ./bin/oca-proxy.js

On first run, the browser will automatically open for OAuth login. After authentication, the proxy is ready to use.

Authentication

The proxy uses web-based OAuth with PKCE on whitelisted ports (8669, 8668, 8667).

  • Login: Opens automatically in your browser on first run, or visit http://localhost:8669/login
  • Logout: Visit http://localhost:8669/logout
  • Status: Visit http://localhost:8669/health

Tokens are stored in ~/.oca/refresh_token.json.
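
The PKCE flow pairs a random code verifier with its SHA-256 challenge so the authorization code can't be replayed by an interceptor. As an illustrative sketch of the RFC 7636 S256 derivation (not the proxy's actual TypeScript implementation):

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: unreserved characters, 43-128 chars long (RFC 7636)
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # code_challenge: BASE64URL(SHA256(verifier)) with padding stripped
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
```

The client sends the challenge when opening the login URL and the verifier when exchanging the authorization code for tokens.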

Usage with OpenAI SDK

from openai import OpenAI

client = OpenAI(
    api_key="dummy",  # Not used, but required by SDK
    base_url="http://localhost:8669/v1"
)

response = client.chat.completions.create(
    model="oca/gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True
)

for chunk in response:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")

Usage with curl

curl http://localhost:8669/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "oca/gpt-4.1",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": true
  }'
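
With "stream": true, responses arrive as Server-Sent Events: a series of data: lines carrying JSON chunks, terminated by data: [DONE]. A minimal client-side parsing sketch, assuming that standard OpenAI framing:

```python
import json

def parse_sse_chunks(lines):
    """Concatenate delta text from OpenAI-style SSE 'data:' lines."""
    out = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip comments, blank keep-alive lines, etc.
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break  # stream terminator
        delta = json.loads(payload)["choices"][0]["delta"]
        out.append(delta.get("content") or "")
    return "".join(out)

sample = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]
```

The OpenAI SDK in the previous section does this framing for you; raw curl output has to be parsed like this by hand.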

Environment Variables

| Variable | Default | Description                                               |
| -------- | ------- | --------------------------------------------------------- |
| PORT     | 8669    | Proxy server port (must be 8669, 8668, or 8667 for OAuth) |
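
Because the OAuth callback only works on the whitelisted ports, a startup-time check of PORT is worth sketching. This is a hypothetical illustration, not the proxy's actual code:

```python
WHITELISTED_PORTS = (8669, 8668, 8667)

def validate_port(raw):
    """Return the configured port, defaulting to 8669 and rejecting non-whitelisted values."""
    port = int(raw) if raw else WHITELISTED_PORTS[0]
    if port not in WHITELISTED_PORTS:
        raise ValueError(f"PORT must be one of {WHITELISTED_PORTS}, got {port}")
    return port
```

Failing fast here is preferable to starting on an arbitrary port and having the OAuth redirect silently rejected later.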

Supported Endpoints

OpenAI Format (/v1/...)

| Endpoint             | Method | Description                            |
| -------------------- | ------ | -------------------------------------- |
| /v1/models           | GET    | List available models                  |
| /v1/chat/completions | POST   | Chat completions (streaming supported) |
| /v1/responses        | POST   | Responses API (streaming supported)    |
| /v1/completions      | POST   | Legacy completions                     |
| /v1/embeddings       | POST   | Text embeddings                        |

Anthropic Format (/v1/messages)

| Endpoint     | Method | Description                                  |
| ------------ | ------ | -------------------------------------------- |
| /v1/messages | POST   | Anthropic Messages API (streaming supported) |
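
Supporting /v1/messages means translating Anthropic-shaped requests into the OpenAI chat format before forwarding. A simplified sketch of that translation (illustrative only; it handles plain string content, not Anthropic content blocks, and is not the proxy's actual logic):

```python
def anthropic_to_openai(body: dict) -> dict:
    """Translate a minimal Anthropic Messages request into OpenAI chat-completions shape."""
    messages = []
    # Anthropic carries the system prompt as a top-level field,
    # OpenAI as the first message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    messages += [{"role": m["role"], "content": m["content"]} for m in body["messages"]]
    return {
        "model": body["model"],
        "messages": messages,
        "stream": body.get("stream", False),
    }
```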

Other

| Endpoint | Method | Description                     |
| -------- | ------ | ------------------------------- |
| /        | GET    | Dashboard with status and links |
| /login   | GET    | Start OAuth login flow          |
| /logout  | GET    | Clear authentication            |
| /health  | GET    | Health check                    |

Model Mapping

Models not starting with oca/ are automatically mapped to oca/gpt-4.1 by default.

Custom mappings can be configured in ~/.config/oca/oca-proxy.config.json:

{
  "model_mapping": {
    "gpt-4": "oca/gpt-4.1",
    "claude-3-opus": "oca/openai-o3"
  }
}
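
The mapping rules above amount to a three-step lookup: pass oca/ names through unchanged, consult the custom mapping, then fall back to the default. A sketch of that resolution order (illustrative, not the proxy's source):

```python
DEFAULT_MODEL = "oca/gpt-4.1"

def resolve_model(requested: str, mapping: dict) -> str:
    """Resolve a requested model name against the custom mapping."""
    if requested.startswith("oca/"):
        return requested          # already an OCA model, pass through
    return mapping.get(requested, DEFAULT_MODEL)  # custom mapping, else default
```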

Files

oca-proxy/
├── bin/
│   └── oca-proxy.js   # Standalone CLI - single build output
├── src/
│   ├── index.ts       # Main proxy server with OAuth endpoints
│   ├── auth.ts        # PKCE auth, token manager, OCA headers
│   ├── config.ts      # Configuration and token storage
│   └── logger.ts      # Logging utility
├── package.json
├── tsconfig.json
└── README.md

Running with PM2

PM2 is a production process manager for Node.js applications. You can run the OCA Proxy via the global binary or npx.

  1. Install PM2 globally:

    npm install -g pm2
  2. Start the proxy (choose one):

    • Global install:

      pm2 start oca-proxy --name oca-proxy
    • Using npx (no global install):

      pm2 start "npx oca-proxy" --name oca-proxy
  3. Monitor and manage:

    • View status: pm2 status
    • View logs: pm2 logs oca-proxy
    • Restart: pm2 restart oca-proxy
    • Stop: pm2 stop oca-proxy
    • Delete: pm2 delete oca-proxy

For advanced configuration, create ecosystem.config.js:

module.exports = {
  apps: [
    {
      name: 'oca-proxy',
      // If installed globally:
      script: 'oca-proxy',
      // Or, if you prefer npx, use:
      // script: 'npx',
      // args: 'oca-proxy',
      env: {
        NODE_ENV: 'production',
        PORT: 8669,
      },
    },
  ],
};

Then start with pm2 start ecosystem.config.js.

Releases (GitHub Actions)

Tagged pushes that match v*.*.* trigger a cross-platform build and GitHub Release with prebuilt binaries using @yao-pkg/pkg.

  • Workflow: .github/workflows/release.yml

  • Builds on: Ubuntu, macOS (Node 20)

  • Output release assets:

  • oca-proxy-macos-x64.tar.gz

  • oca-proxy-macos-arm64.tar.gz

  • oca-proxy-linux-x64.tar.gz

  • oca-proxy-linux-arm64.tar.gz

  • How to test builds (Intel and Apple Silicon):

    1. Manually run the workflow without tagging (GitHub → Actions → build-and-release → Run workflow).

    2. Download artifacts for your platform from the run summary.

    3. macOS:

      • Intel: chmod +x oca-proxy-macos-x64 && ./oca-proxy-macos-x64 --help
      • Apple Silicon: chmod +x oca-proxy-macos-arm64 && ./oca-proxy-macos-arm64 --help
      • Test Intel binary on Apple Silicon via Rosetta: arch -x86_64 ./oca-proxy-macos-x64 --help
    4. Linux:

      • x64: chmod +x oca-proxy-linux-x64 && ./oca-proxy-linux-x64 --help
      • arm64: chmod +x oca-proxy-linux-arm64 && ./oca-proxy-linux-arm64 --help
    5. Optional smoke test: start the server and hit the health endpoint:

      • ./oca-proxy-<platform-arch> &
      • curl -s http://localhost:8669/health

Cut a release:

# 1) Bump your version in package.json (optional but recommended)
# 2) Commit and tag
git commit -am "chore: release v1.0.5"
git tag v1.0.5
git push origin v1.0.5

Or using npm to manage the version and tag:

npm version patch   # or minor/major
git push --follow-tags
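
A tag only triggers the workflow if it matches v*.*.*. The check can be approximated with a strict semver-style regex (note the actual glob is looser and would also match suffixed tags like v1.0.5-rc1):

```python
import re

TAG_PATTERN = re.compile(r"^v\d+\.\d+\.\d+$")

def triggers_release(tag: str) -> bool:
    """True if a pushed tag matches the strict vMAJOR.MINOR.PATCH pattern."""
    return bool(TAG_PATTERN.match(tag))
```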

Homebrew Tap

You can distribute oca-proxy via a personal Homebrew tap.

  1. Create a tap repo: your-user/homebrew-tap
  2. Add a formula at Formula/oca-proxy.rb (a template exists in this repo under Formula/oca-proxy.rb)
  3. After a release publishes, update the sha256 values in the formula for each asset:
  • shasum -a 256 oca-proxy-macos-x64.tar.gz
  • shasum -a 256 oca-proxy-linux-x64.tar.gz
  • shasum -a 256 oca-proxy-macos-arm64.tar.gz (if you publish it)
  4. Commit the formula to your tap

Install from your tap:

brew tap your-user/tap
brew install oca-proxy
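
The shasum step can also be scripted. This helper computes the same hex digest as shasum -a 256 for each release asset:

```python
import hashlib

def sha256_file(path: str) -> str:
    """Hex SHA-256 digest of a file, equivalent to `shasum -a 256 <file>`."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large tarballs don't load fully into memory.
        for block in iter(lambda: f.read(65536), b""):
            h.update(block)
    return h.hexdigest()
```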

Automate tap updates

This repo includes .github/workflows/brew-tap.yml which can automatically bump your tap’s formula on every GitHub Release. Requirements:

  • Create GH_PAT secret (Personal Access Token with repo scope) in this repo
  • Ensure your tap repo is your-user/homebrew-tap (or adjust the workflow’s tap: input)

The action computes new checksums and updates URLs in Formula/oca-proxy.rb within your tap repository.