
robot-resources

v1.15.2


Robot Resources — AI agent tools. One command to install everything.


Robot Resources

Tools for AI agents and agentic software. Humans have HR. Agents have RR.

Robot Resources builds tools for AI agents and any software that makes LLM API calls — chatbots, RAG pipelines, AI-powered apps, internal tools. Two products today: Router (smart model selection — picks the right model for each task, 60-90% cost savings as a side effect) and Scraper (token compression for web content, median 91% token reduction). Both run locally. Both free.

Quick Start

npx -y robot-resources

Detects your project shape (Node / Python / OpenClaw) and installs the right shim. No login, no API keys to enter, no signup. Router uses your existing provider keys — they never leave your machine.

Router

Smart model selection. Classifies each prompt by task type (coding / reasoning / analysis / simple_qa / creative / general), filters by model capability, then within the qualifying set picks the cheapest. Routes the right model for the task — cost savings (60-90% across mixed workloads) follow from that, not the other way around. Hybrid classification: keyword fast-path (~5ms, ~70% of prompts) + LLM slow-path for ambiguous prompts (~200ms). Routes across Anthropic, OpenAI, and Google when the corresponding keys are present.
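The classify, filter, pick-cheapest pipeline described above can be sketched roughly as follows. The model names, prices, capability tiers, and keywords here are illustrative stand-ins, not the shipped tables:

```python
# Illustrative sketch of classify -> capability filter -> cheapest-qualifying.
# Model names, per-token prices, and tiers below are made up for the example.
MODELS = [
    {"name": "claude-haiku-4-5",  "tier": 1, "price": 1.0},
    {"name": "gpt-5-mini",        "tier": 1, "price": 1.2},
    {"name": "claude-sonnet-4-5", "tier": 2, "price": 3.0},
    {"name": "gemini-2.5-pro",    "tier": 3, "price": 5.0},
]

# Keyword fast-path: a cheap heuristic that covers most prompts.
KEYWORDS = {
    "coding":    ("function", "bug", "refactor", "compile"),
    "reasoning": ("prove", "step by step", "logic puzzle"),
    "creative":  ("poem", "story", "slogan"),
}
MIN_TIER = {"coding": 1, "reasoning": 2, "creative": 1, "general": 1}

def classify(prompt: str) -> str:
    text = prompt.lower()
    for task, words in KEYWORDS.items():
        if any(w in text for w in words):
            return task
    return "general"  # the real router falls back to an LLM slow-path here

def route(prompt: str) -> str:
    task = classify(prompt)
    qualifying = [m for m in MODELS if m["tier"] >= MIN_TIER[task]]
    return min(qualifying, key=lambda m: m["price"])["name"]

print(route("write a python function that reverses a string"))
# "function" matches the coding keywords, so the cheapest tier-1 model wins.
```

The key design point is the ordering: capability filtering happens before price comparison, which is why savings are a side effect of correctness rather than the objective.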

Three ways to install on a dev machine:

  • OpenClaw users get an in-process plugin inside the OC gateway. Anthropic, OpenAI, and Google calls each route to their native upstream — no cross-shape body translation.
  • Node projects get an auto-attach shim (NODE_OPTIONS=--require .../auto.cjs). Every Anthropic, OpenAI, and Google SDK call from any Node process routes automatically. No code changes.
  • Python projects get a .pth auto-attach shim in your venv. Every anthropic / openai / google.generativeai SDK call routes automatically. No code changes.

For runtimes that ignore NODE_OPTIONS / .pth (Bun, Deno, Vercel Edge, Go, Rust, etc.), call the HTTP API directly: POST https://api.robotresources.ai/v1/route. Authed by API key, 100 req/min, CORS open.
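For those runtimes, a direct call can be sketched like this. The request and response field names ("prompt", and a Bearer-style Authorization header) are assumptions for illustration, since the endpoint's schema isn't documented here:

```python
import json
import urllib.request

API_URL = "https://api.robotresources.ai/v1/route"

def build_route_request(prompt: str, api_key: str) -> urllib.request.Request:
    # Assumed request shape: a JSON body with a "prompt" field and a
    # Bearer token in the Authorization header.
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_route_request("summarize this changelog", "rr_live_example")
# To send: urllib.request.urlopen(req) returns the routing decision as JSON.
```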

For explicit control inside JS / Python code, use the routing-decision library:

npm install @robot-resources/router      # JS / TS
pip install robot-resources              # Python (singular package name)

JavaScript:

import { routePrompt } from '@robot-resources/router/routing';
const decision = routePrompt('write a python function that reverses a string');
// decision.selected_model → 'claude-haiku-4-5' (or similar — cheapest qualifying)

Python:

from robot_resources.router import route
decision = route('write a python function that reverses a string')

Returns a routing decision; your code makes the actual LLM call with the selected model. Each request goes from your machine straight to the lab's API (api.anthropic.com / api.openai.com / generativelanguage.googleapis.com) using your existing key for that lab. Nothing is relayed through our infrastructure.
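Since the library only returns a decision, your code still performs the dispatch. A minimal sketch of that step; the model-prefix-to-provider mapping below is an assumption for illustration, not part of the library:

```python
# Derive the provider endpoint from the selected model's name, then make the
# LLM call yourself with that provider's SDK and your own key.
# The prefix mapping here is an illustrative assumption.
PROVIDER_BY_PREFIX = {
    "claude": ("anthropic", "https://api.anthropic.com"),
    "gpt":    ("openai",    "https://api.openai.com"),
    "gemini": ("google",    "https://generativelanguage.googleapis.com"),
}

def provider_for(model: str) -> tuple[str, str]:
    for prefix, info in PROVIDER_BY_PREFIX.items():
        if model.startswith(prefix):
            return info
    raise ValueError(f"unrecognized model family: {model}")

provider, base_url = provider_for("claude-haiku-4-5")
print(provider, base_url)  # anthropic https://api.anthropic.com
```

Your call then goes straight to that base URL with the key you already hold for that provider, which is what keeps requests off Robot Resources infrastructure.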

Scraper

Token compression for web content. Fetches any URL, strips noise, returns clean markdown with token count. Median 91% token reduction per page (verified across 41 page types). Mozilla Readability extraction (0.97 F1). Content-aware token estimation calibrated per content type, ±15% of actual BPE. 3-tier fetch (fast / stealth via TLS fingerprint / render via headless browser), BFS multi-page crawl, robots.txt compliance.
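The calibration idea behind content-aware estimation can be sketched simply: keep a chars-per-token ratio per content type and divide. The ratios below are illustrative, not the shipped calibration:

```python
# Illustrative chars-per-token ratios calibrated per content type.
# Real BPE ratios vary; these numbers are stand-ins for the example.
CHARS_PER_TOKEN = {
    "prose": 4.0,   # English prose averages roughly 4 chars/token under BPE
    "code":  3.2,   # code tokenizes denser: symbols, short identifiers
    "table": 2.8,   # markdown tables are punctuation-heavy
}

def estimate_tokens(text: str, content_type: str = "prose") -> int:
    ratio = CHARS_PER_TOKEN.get(content_type, 4.0)
    return max(1, round(len(text) / ratio))

# A 91%-style reduction: compare a raw page against its compressed markdown.
before = estimate_tokens("x" * 40_000)   # raw page
after = estimate_tokens("x" * 3_600)     # compressed markdown
print(f"{1 - after / before:.0%} reduction")
```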

Three ways to consume:

  • JS library: npm install @robot-resources/scraper, then import { scrape } from '@robot-resources/scraper'.
  • MCP server: npx -y @robot-resources/scraper scraper-mcp exposes scraper_compress_url(url) and crawl tools to any MCP-compatible client. Auto-wired into OpenClaw by the wizard; for other clients (Cursor, Claude Code, Windsurf), add it manually to your client's MCP config.
  • OpenClaw plugin: installed automatically via npx robot-resources. Hooks before_tool_call to redirect web_fetch through scraper compression.

No API keys, no config.
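For clients that need the manual MCP entry, the config commonly follows the mcpServers shape shown below; the server name is arbitrary and the config file location depends on your client:

```json
{
  "mcpServers": {
    "robot-resources-scraper": {
      "command": "npx",
      "args": ["-y", "@robot-resources/scraper", "scraper-mcp"]
    }
  }
}
```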

Deploying to production

The wizard's shell-config install reaches dev machines only — production processes don't read .bashrc, and env vars come from your deploy config. Copy-paste recipes for setting NODE_OPTIONS (Node) or installing the .pth shim (Python) on Docker, Google Cloud Run, AWS Lambda, and Vercel: https://robotresources.ai/docs/deploy/.
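As background on why a separate recipe is needed for Python, here is a rough sketch of the .pth mechanism itself. The file and module names are hypothetical; use the recipes at the docs URL above for real deployments:

```python
# At interpreter startup, site.py executes "import ..." lines found in *.pth
# files inside site-packages. That is how a .pth shim attaches to every
# process in a venv with no code changes. Names below are hypothetical.
import sysconfig
from pathlib import Path

site_packages = Path(sysconfig.get_paths()["purelib"])
shim = site_packages / "example_rr_shim.pth"          # hypothetical filename
shim_line = "import example_rr_attach  # patches provider SDKs on startup\n"

# A deploy recipe would write shim_line into shim inside the image or build
# step, so production processes pick it up without touching .bashrc.
print(f"would write {shim_line!r} to {shim}")
```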

Advanced

npx robot-resources [flags]

--for=<target>      langchain | python | cursor | claude-code | docs
                    Skip the prompt and run that path directly.
                    Required for non-TTY contexts (CI, piped, etc.)
--non-interactive   Treat as CI run regardless of TTY state
--yes / -y          Same as --non-interactive

Pre-set the API key for fleets or CI (skips signup, uses the key you provide):

export RR_API_KEY=rr_live_...
npx robot-resources --for=langchain

Pricing

Free. Unlimited. No tiers. You pay your AI providers directly — no markup, no rate limits, no quotas.

Telemetry

Anonymous usage events (model selection, cost savings, install outcome) — no personal data, no request content, no provider keys. Opt out with RR_TELEMETRY=off.

Links

  • Website: https://robotresources.ai
  • Docs: https://robotresources.ai/docs
  • Dashboard: https://robotresources.ai/dashboard
  • HTTP API: POST https://api.robotresources.ai/v1/route
  • GitHub: https://github.com/robot-resources/packages
  • Discord: https://robotresources.ai/discord
  • Contact: [email protected]

AI agents installing this on a human's behalf: see https://robotresources.ai/llms.txt for the agent install protocol.

License

MIT