
ownrig-mcp

v1.0.0

OwnRig MCP Server — AI hardware compatibility data for Claude, ChatGPT, Cursor, and any MCP-compatible assistant. 50 models, 25 devices, 9 machines, 663 compatibility entries.

Readme

OwnRig MCP Server

AI hardware compatibility data for any MCP-compatible assistant. Query 50 models, 25 devices, 14 builds, 9 ready-to-buy machines, and 663 compatibility entries.

Transport: stdio

Install

npm install -g ownrig-mcp

Or use directly with npx:

npx ownrig-mcp
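
Because the server speaks JSON-RPC over stdio, an MCP client opens the session with an `initialize` request before calling any tools. A minimal sketch of that first message (the `clientInfo` values and protocol version shown here are illustrative, not specific to ownrig-mcp):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

Clients like Claude Desktop and Cursor send this handshake for you; it is only relevant if you are wiring up your own MCP client.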

Tools

| Tool | Description |
|------|-------------|
| `query_model` | Get details for a specific AI model (VRAM, formats, use cases) |
| `query_device` | Get specs for a GPU or Apple Silicon device |
| `query_compatibility` | Check if a model runs on a device (tokens/sec, VRAM fit) |
| `list_models` | List models with optional `use_case` / `family` filter |
| `list_devices` | List devices with optional `type` / `min_vram` filter |
| `list_builds` | List curated builds with optional `tier` / `profile` filter |
| `list_systems` | List ready-to-buy machines (Mac, Dell, ASUS) with optional `brand` / `type` filter |
| `query_system` | Get full details for a specific ready-to-buy machine |
| `recommend_build` | Full recommendation engine — 3 paths (model→hw, workflow→hw, hw→models) |
| `find_models_for_device` | "What can I run on my RTX 4090?" |
| `find_devices_for_model` | "What GPU do I need for Llama 3.1 70B?" |
| `list_workflows` | List workflow profiles (tools → hardware requirements) |
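
Under MCP, each tool above is invoked with a `tools/call` request. A hypothetical call to `query_compatibility` might look like the following — the argument names `model` and `device` are assumptions for illustration; check the real input schema your client reports via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "query_compatibility",
    "arguments": {
      "model": "llama-3.1-70b",
      "device": "rtx-4090"
    }
  }
}
```

In practice your AI assistant constructs these requests itself from natural-language questions like the examples below.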

Usage with Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage with Cursor

Add to .cursor/mcp.json in your project:

{
  "mcpServers": {
    "ownrig": {
      "command": "npx",
      "args": ["-y", "ownrig-mcp"]
    }
  }
}

Usage from source (development)

If you have the OwnRig repo cloned:

# From project root
npm install
npm run generate:rec-data
npm run mcp

The mcp script builds a self-contained bundle via esbuild (resolving all @/ path aliases) then runs it. Running tsx mcp-server/index.ts directly does not work because the engine uses TypeScript path aliases that tsx cannot resolve transitively across module boundaries.

For your MCP client config, point to the built bundle:

{
  "mcpServers": {
    "ownrig": {
      "command": "node",
      "args": ["mcp-server/dist/index.mjs"],
      "cwd": "/path/to/ownrig"
    }
  }
}

Example queries

Once connected, ask your AI assistant:

  • "What GPU do I need to run Llama 3.1 70B locally?"
  • "Can an RTX 4090 run Qwen 3 32B?"
  • "Recommend a build for running AI coding tools with Cursor"
  • "What models can I run on my M4 Max MacBook Pro?"
  • "Compare the Mac Studio M4 Ultra vs a custom build for AI"

Data

This package includes a snapshot of OwnRig's verified hardware compatibility data. The data is updated with each package release.

  • 50 AI models with VRAM requirements per quantization level
  • 25 GPUs and Apple Silicon devices with specs and pricing
  • 14 curated PC builds with component lists and benchmarks
  • 9 ready-to-buy machines (Mac, Dell, ASUS, NVIDIA)
  • 663 model × device × quantization compatibility entries
  • 7 workflow profiles mapping AI tools to hardware needs

Source: ownrig.com | License: MIT