
@gpu-bridge/mcp-server

v2.4.4


GPU-Bridge MCP Server

30 GPU-powered AI services as MCP tools — LLMs, image generation, audio, video, embeddings, reranking, PDF parsing, NSFW detection & more. x402 native for autonomous AI agents: pay per request on-chain with USDC on Base L2. No API keys. No accounts.


What is GPU-Bridge?

GPU-Bridge is a unified GPU inference API with native x402 support — the open payment protocol that allows AI agents to autonomously pay for compute with USDC on Base L2. No API keys, no accounts, no human intervention required.

This MCP server exposes all 30 GPU-Bridge services as Model Context Protocol tools, giving Claude (and any MCP-compatible AI) direct access to GPU inference.


Install in Claude Desktop (2 minutes)

1. Get your API key (or use x402 for autonomous agents)

Visit gpubridge.io and grab a free API key, or use the x402 protocol for keyless agent payments.

2. Add to claude_desktop_config.json

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "gpu-bridge": {
      "command": "npx",
      "args": ["-y", "@gpu-bridge/mcp-server"],
      "env": {
        "GPUBRIDGE_API_KEY": "your_api_key_here"
      }
    }
  }
}

3. Restart Claude Desktop

That's it. Claude now has access to 30 GPU-powered AI services.


MCP Tools

gpu_run

Run any GPU-Bridge service. The primary tool for executing AI tasks.

Parameters:
  service  (string)  — Service key (e.g., "llm-4090", "flux-schnell", "whisper-l4")
  input    (object)  — Service-specific input parameters
  priority (string)  — Optional: "fast" (lowest latency) or "cheap" (lowest cost)
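As a sketch, here is what a gpu_run call's arguments might look like for image generation. The service key flux-schnell and the prompt/steps input fields mirror the x402 example later in this README; the exact input schema per service is an assumption — consult gpu_catalog for the real one.

```python
# Hypothetical sketch of the arguments an MCP client would pass to gpu_run.
# The "input" keys shown for flux-schnell (prompt, steps) are taken from the
# x402 example below; other services accept different, service-specific keys.
gpu_run_args = {
    "service": "flux-schnell",       # service key, see gpu_catalog
    "input": {
        "prompt": "A robot painting on a canvas",
        "steps": 4,                  # FLUX.1 Schnell is tuned for 4 steps
    },
    "priority": "cheap",             # optional: "fast" or "cheap"
}

# Minimal client-side validation matching the parameter list above.
assert isinstance(gpu_run_args["service"], str)
assert isinstance(gpu_run_args["input"], dict)
assert gpu_run_args.get("priority") in (None, "fast", "cheap")
```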

gpu_catalog

Get the full catalog of available services with pricing and capabilities.

gpu_estimate

Estimate cost before running a service. No authentication required.

gpu_status

Check the status of a job and retrieve results.

gpu_balance

Check your current balance, daily spend, and volume discount tier.
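The tools above compose into a simple estimate → run → poll loop. The sketch below assumes a generic `call_tool(name, args)` dispatcher standing in for however your MCP client invokes tools, and assumed response keys (`cost_usd`, `job_id`, `state`, `result`) that are illustrative, not the documented wire format.

```python
# Sketch of the intended tool flow: estimate the cost, submit the job with
# gpu_run, then poll gpu_status until the result is ready.
# call_tool is a stand-in for your MCP client's tool dispatcher; the response
# shapes (cost_usd, job_id, state, result) are assumptions for illustration.
def run_and_wait(call_tool, service, payload, poll_limit=30):
    estimate = call_tool("gpu_estimate", {"service": service, "input": payload})
    job = call_tool("gpu_run", {"service": service, "input": payload})
    for _ in range(poll_limit):
        status = call_tool("gpu_status", {"job_id": job["job_id"]})
        if status["state"] == "done":
            return estimate, status["result"]
    raise TimeoutError("job did not finish within the polling limit")
```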


30 Available Services

Language Models (LLMs)

| Service ID | Description | Notes |
|-----------|-------------|-------|
| llm-4090 | General purpose LLM | Sub-second via Groq |
| llm-a100 | Maximum capability LLM | Largest models |
| llm-l4 | Ultra-fast, low cost LLM | Budget option |
| code-4090 | Code generation | Optimized for code |
| llm-stream | Streaming LLM responses | Real-time output |

Image Generation

| Service ID | Description | Notes |
|-----------|-------------|-------|
| flux-schnell | FLUX.1 Schnell | Fast, 4-step generation |
| flux-dev | FLUX.1 Dev | High quality |
| sdxl-4090 | Stable Diffusion XL | Versatile |
| sd35-l4 | Stable Diffusion 3.5 | Latest SD model |
| img2img-4090 | Image-to-image | Style transfer, editing |

Vision & Image Analysis

| Service ID | Description | Notes |
|-----------|-------------|-------|
| llava-4090 | Visual Q&A | Image understanding |
| ocr-l4 | Text extraction (OCR) | Multi-language |
| rembg-l4 | Background removal | Instant |
| caption-4090 | Image captioning | Auto-describe images |
| nsfw-detect | Content moderation | NSFW classification |

Speech-to-Text

| Service ID | Description | Notes |
|-----------|-------------|-------|
| whisper-l4 | Fast transcription | Sub-second |
| whisper-a100 | High accuracy transcription | Large files |
| diarize-l4 | Speaker diarization | Who said what |

Text-to-Speech

| Service ID | Description | Notes |
|-----------|-------------|-------|
| tts-l4 | Voice cloning TTS | 40+ voices |
| tts-fast | Ultra-fast TTS | Lowest latency |
| bark-4090 | Expressive TTS | Emotion, laughter |

Audio Generation

| Service ID | Description | Notes |
|-----------|-------------|-------|
| musicgen-l4 | Music generation | Text-to-music |
| audiogen-l4 | Sound effects | Text-to-SFX |

Embeddings & Search

| Service ID | Description | Notes |
|-----------|-------------|-------|
| embed-l4 | Text embeddings | Multilingual |
| embed-code | Code embeddings | For code search |
| rerank | Document reranking | Jina, sub-second |

Video

| Service ID | Description | Notes |
|-----------|-------------|-------|
| animatediff | Text-to-video | AnimateDiff |
| video-enhance | Video upscaling | Up to 4K |

Utilities

| Service ID | Description | Notes |
|-----------|-------------|-------|
| pdf-parse | Document parsing | PDF/DOCX to text |


x402: For Autonomous AI Agents

GPU-Bridge supports the x402 payment protocol, enabling truly autonomous AI agents to pay for compute without human intervention.

Agent Request → GPU-Bridge returns HTTP 402 Payment Required
      ↓
Agent pays USDC on Base L2 (gas < $0.01, settles in 2s)
      ↓
Agent retries with payment proof → GPU-Bridge executes and returns result

Python Example with x402

from x402.client import PaymentClient

client = PaymentClient(private_key="0x...", chain="base")

response = client.request(
    "POST",
    "https://api.gpubridge.io/v1/run",
    json={
        "service": "flux-schnell",
        "input": {"prompt": "A robot painting on a canvas", "steps": 4}
    }
)
print(response.json())

Pricing

| Category | Starting From |
|----------|--------------|
| LLMs | $0.003/1K tokens |
| Image Generation | $0.01/image |
| Speech-to-Text | $0.005/minute |
| Text-to-Speech | $0.005/1K chars |
| Embeddings | $0.0001/1K tokens |
| Reranking | $0.001/query |
| PDF Parsing | $0.005/document |

All prices in USD. x402 payments in USDC on Base L2.
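For a back-of-envelope sense of scale, the "starting from" prices above can be combined directly. The workload below (2,000 LLM tokens, 3 images, 10 minutes of transcription) is an illustrative example, not a quoted bill:

```python
# Back-of-envelope cost math using the "starting from" prices in the table.
LLM_PER_1K_TOKENS = 0.003   # $/1K tokens
IMAGE_PRICE = 0.01          # $/image
STT_PER_MINUTE = 0.005      # $/minute

# Example workload: a 2,000-token LLM completion, 3 generated images,
# and a 10-minute transcription.
cost = 2 * LLM_PER_1K_TOKENS + 3 * IMAGE_PRICE + 10 * STT_PER_MINUTE
print(f"${cost:.3f}")  # → $0.086
```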



License

MIT © Healthtech Capital LLC