
@hyperscalesdkdev/sdk

v1.0.0


Hyperscale SDK — client library wrapping OpenRouter inference API


Hyperscale SDK

Client library for the Hyperscale API — a branded interface over OpenRouter inference. Use 400+ AI models (OpenAI, Anthropic, Google, Meta, Mistral, etc.) through a single integration with Hyperscale authentication, billing, and routing.

Install

npm install @hyperscale/sdk

Quick start

import { HyperscaleClient } from "@hyperscale/sdk";

const client = new HyperscaleClient({
  apiKey: process.env.HYPERSCALE_API_KEY!,
  baseURL: "https://api.hyperscale.ai/v1",
});

// Non-streaming
const response = await client.chat.completions.create({
  model: "anthropic/claude-3-5-sonnet",
  messages: [{ role: "user", content: "Hello!" }],
  max_tokens: 1024,
});
console.log(response.choices[0].message.content);

// Streaming
const stream = await client.chat.completions.create({
  model: "openai/gpt-4o",
  messages: [{ role: "user", content: "Tell me a story." }],
  stream: true,
  max_tokens: 1024,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}

Features

  • Chat completions — streaming and non-streaming, OpenAI-compatible request/response
  • Model discovery — models.list() with optional category filter and ~5 min cache
  • Usage & cost — usage on completion responses; generations.get(id) for post-hoc stats; keys.getCredits() for balance
  • Errors — typed exceptions (ValidationError, AuthError, InsufficientCreditsError, RateLimitError, UpstreamError) with retry for 429/5xx
  • OpenRouter model variants — use suffixes in model: :free, :nitro, :floor, :online, :thinking
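The variant suffixes compose by appending to the model slug. A minimal sketch of how that composition works — note that withVariant is a hypothetical helper for illustration, not an SDK export:

```typescript
// Hypothetical helper (not part of the SDK) showing how OpenRouter-style
// variant suffixes compose with a model slug.
type Variant = "free" | "nitro" | "floor" | "online" | "thinking";

function withVariant(model: string, variant: Variant): string {
  // Strip any existing variant suffix, then append the requested one.
  const base = model.split(":")[0];
  return `${base}:${variant}`;
}

console.log(withVariant("meta-llama/llama-3.1-8b-instruct", "free"));
// -> meta-llama/llama-3.1-8b-instruct:free
```

The resulting string is passed as the model parameter to chat.completions.create() as usual.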

API overview

| Method | Description |
|--------|-------------|
| client.chat.completions.create(params) | Chat completion (use stream: true for streaming) |
| client.models.list({ category? }) | List models (cached) |
| client.models.endpoints(author, slug) | Providers for a model |
| client.generations.get(id) | Token/cost stats for a generation |
| client.keys.getKeyInfo() | Key rate limit and usage |
| client.keys.getCredits() | Account credit balance |
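Since responses are OpenAI-compatible, the usage block (when present) carries standard token counts. A minimal sketch, assuming the usual prompt_tokens / completion_tokens / total_tokens field names:

```typescript
// Sketch: summarize the OpenAI-compatible `usage` block on a completion
// response. Field names assume the standard OpenAI response shape.
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}

function summarizeUsage(usage?: Usage): string {
  if (!usage) return "usage not reported";
  return `${usage.prompt_tokens} in + ${usage.completion_tokens} out = ${usage.total_tokens} tokens`;
}

console.log(summarizeUsage({ prompt_tokens: 12, completion_tokens: 88, total_tokens: 100 }));
// -> 12 in + 88 out = 100 tokens
```

For exact cost figures after the fact, generations.get(id) is the documented route.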

Testing

1. Unit tests (no API key needed)

From the repo root:

npm install
npm test

This runs Vitest on tests/ (SSE parsing, error mapping). Use npm run test:watch for watch mode.

2. Live API test

After building, you can hit a real API to verify the SDK end-to-end.

  1. Build the SDK

    npm run build
  2. Set your API key (and optional base URL)

    • Hyperscale gateway:
      HYPERSCALE_API_KEY=your-hyperscale-key
      Optionally HYPERSCALE_BASE_URL=https://api.hyperscale.ai/v1
    • OpenRouter directly (e.g. for dev):
      HYPERSCALE_API_KEY=your-openrouter-key
      HYPERSCALE_BASE_URL=https://openrouter.ai/api/v1

    Windows (PowerShell):

    $env:HYPERSCALE_API_KEY="your-key"
    $env:HYPERSCALE_BASE_URL="https://openrouter.ai/api/v1"   # optional

    Windows (CMD):

    set HYPERSCALE_API_KEY=your-key

    Linux / macOS:

    export HYPERSCALE_API_KEY=your-key
    export HYPERSCALE_BASE_URL=https://openrouter.ai/api/v1   # optional
  3. Run the example

    node examples/quick-test.mjs

    It runs a non-streaming and a streaming chat call (using a free model) and lists a few models. If any step fails, it prints the error and exits with code 1.

Error handling

import {
  HyperscaleClient,
  AuthError,
  InsufficientCreditsError,
  RateLimitError,
} from "@hyperscale/sdk";

const client = new HyperscaleClient({ apiKey: process.env.HYPERSCALE_API_KEY! });

try {
  const r = await client.chat.completions.create({
    model: "anthropic/claude-3-5-sonnet",
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(r.choices[0].message.content);
} catch (e) {
  if (e instanceof AuthError) { /* invalid key */ }
  if (e instanceof InsufficientCreditsError) { /* add credits */ }
  if (e instanceof RateLimitError) { /* back off; SDK retries by default */ }
}
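The SDK already retries 429/5xx, but if you layer your own retry on top, an exponential-backoff delay schedule is the usual fit. A sketch — retryDelayMs is illustrative, not an SDK export:

```typescript
// Illustrative exponential-backoff schedule (not part of the SDK):
// the delay doubles per attempt, capped at maxMs.
function retryDelayMs(attempt: number, baseMs = 500, maxMs = 8000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Attempts 0..4 -> 500, 1000, 2000, 4000, 8000 ms
console.log([0, 1, 2, 3, 4].map((a) => retryDelayMs(a)));
```

Adding random jitter to each delay is a common refinement to avoid synchronized retries across clients.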

License

MIT