robot-resources
v1.15.2
Robot Resources — AI agent tools. One command to install everything.
Robot Resources
Tools for AI agents and agentic software. Humans have HR. Agents have RR.
Robot Resources builds tools for AI agents and any software that makes LLM API calls — chatbots, RAG pipelines, AI-powered apps, internal tools. Two products today: Router (smart model selection — picks the right model for each task, 60-90% cost savings as a side effect) and Scraper (token compression for web content, median 91% token reduction). Both run locally. Both free.
Quick Start
```
npx -y robot-resources
```

Detects your project shape (Node / Python / OpenClaw) and installs the right shim. No login, no API keys to enter, no signup. Router uses your existing provider keys — they never leave your machine.
Router
Smart model selection. Classifies each prompt by task type (coding / reasoning / analysis / simple_qa / creative / general), filters models by capability, then picks the cheapest within the qualifying set. It routes each prompt to the right model for the task — cost savings (60-90% across mixed workloads) follow from that, not the other way around. Hybrid classification: a keyword fast-path (~5ms, covering ~70% of prompts) plus an LLM slow-path for ambiguous prompts (~200ms). Routes across Anthropic, OpenAI, and Google when the corresponding keys are present.
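The keyword fast-path can be pictured as a table lookup that runs before any model call. A minimal TypeScript sketch, assuming a hypothetical keyword table (Router's real classifier and its keywords are internal):

```typescript
// Hypothetical sketch of a keyword fast-path classifier. The task types
// match Router's categories; the keyword patterns are illustrative only.
type TaskType = 'coding' | 'reasoning' | 'analysis' | 'simple_qa' | 'creative' | 'general';

const KEYWORDS: Array<[TaskType, RegExp]> = [
  ['coding',    /\b(function|bug|refactor|compile|python|typescript)\b/i],
  ['reasoning', /\b(prove|step[- ]by[- ]step|deduce|why does)\b/i],
  ['creative',  /\b(poem|story|slogan|lyrics)\b/i],
];

// Returns a task type on a keyword hit, or null to fall through
// to the slower LLM-based classifier for ambiguous prompts.
function fastPathClassify(prompt: string): TaskType | null {
  for (const [task, pattern] of KEYWORDS) {
    if (pattern.test(prompt)) return task;
  }
  return null;
}
```

The fast-path never misroutes ambiguous prompts: anything without a confident keyword hit falls through to the LLM slow-path rather than guessing.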
Three ways to install on a dev machine:
- OpenClaw users get an in-process plugin inside the OC gateway. Anthropic, OpenAI, and Google calls each route to their native upstream — no cross-shape body translation.
- Node projects get an auto-attach shim (`NODE_OPTIONS=--require .../auto.cjs`). Every Anthropic, OpenAI, and Google SDK call from any Node process routes automatically. No code changes.
- Python projects get a `.pth` auto-attach shim in your venv. Every `anthropic` / `openai` / `google.generativeai` SDK call routes automatically. No code changes.
For runtimes that ignore NODE_OPTIONS / .pth (Bun, Deno, Vercel Edge, Go, Rust, etc.), call the HTTP API directly: POST https://api.robotresources.ai/v1/route. Authed by API key, 100 req/min, CORS open.
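A sketch of calling the endpoint from one of those runtimes. The request body and auth scheme shown here are assumptions; consult the docs for the actual schema:

```typescript
// Builds the request for the HTTP routing endpoint. Field names in the
// body and the Bearer auth scheme are assumptions, not a documented contract.
function buildRouteRequest(prompt: string, apiKey: string) {
  return {
    url: 'https://api.robotresources.ai/v1/route',
    method: 'POST' as const,
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`, // assumed auth scheme
    },
    body: JSON.stringify({ prompt }),
  };
}

// Usage from Bun / Deno / an edge runtime:
// const req = buildRouteRequest('summarize this doc', process.env.RR_API_KEY!);
// const res = await fetch(req.url, req);
```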
For explicit control inside JS / Python code, use the routing-decision library:
```
npm install @robot-resources/router   # JS / TS
pip install robot-resources           # Python (singular package name)
```

```js
import { routePrompt } from '@robot-resources/router/routing';

const decision = routePrompt('write a python function that reverses a string');
// decision.selected_model → 'claude-haiku-4-5' (or similar — cheapest qualifying)
```

```python
from robot_resources.router import route

decision = route('write a python function that reverses a string')
```

Returns a routing decision; your code makes the actual LLM call with the selected model. Each request goes from your machine straight to the lab's API (api.anthropic.com / api.openai.com / generativelanguage.googleapis.com) using your existing key for that lab. Nothing is relayed through our infrastructure.
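Because the library only returns a decision, your dispatch code needs to know which provider owns the selected model before making the call. A hedged sketch of that mapping, assuming conventional model-name prefixes (the decision object may well carry a provider field directly; check the docs):

```typescript
// Hypothetical mapping from a routed model name to the provider whose
// SDK (and key) should make the actual call. Prefix rules are assumptions.
type Provider = 'anthropic' | 'openai' | 'google';

function providerFor(model: string): Provider {
  if (model.startsWith('claude-')) return 'anthropic';
  if (/^(gpt-|o\d)/.test(model)) return 'openai';
  if (model.startsWith('gemini-')) return 'google';
  throw new Error(`unknown model family: ${model}`);
}
```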
Scraper
Token compression for web content. Fetches any URL, strips noise, returns clean markdown with token count. Median 91% token reduction per page (verified across 41 page types). Mozilla Readability extraction (0.97 F1). Content-aware token estimation calibrated per content type, ±15% of actual BPE. 3-tier fetch (fast / stealth via TLS fingerprint / render via headless browser), BFS multi-page crawl, robots.txt compliance.
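Content-aware token estimation can be pictured as a per-content-type characters-per-token ratio. The ratios below are illustrative assumptions, not Scraper's calibrated values:

```typescript
// Illustrative chars-per-token ratios by content type. Real BPE tokenizers
// average around 4 chars/token for English prose and fewer for code, which
// is why a single flat ratio misses by more than a calibrated one.
const CHARS_PER_TOKEN: Record<string, number> = {
  prose: 4.0,
  code: 3.2,
  table: 3.5,
};

function estimateTokens(
  text: string,
  contentType: keyof typeof CHARS_PER_TOKEN = 'prose',
): number {
  return Math.ceil(text.length / CHARS_PER_TOKEN[contentType]);
}
```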
Three ways to consume:
- JS library — `npm install @robot-resources/scraper` → `import { scrape } from '@robot-resources/scraper'`
- MCP server — `npx -y @robot-resources/scraper scraper-mcp` exposes `scraper_compress_url(url)` and crawl tools to any MCP-compatible client. Auto-wired into OpenClaw by the wizard; for other clients (Cursor, Claude Code, Windsurf), add manually to your client's MCP config.
- OpenClaw plugin — installed automatically via `npx robot-resources`. Hooks `before_tool_call` to redirect `web_fetch` through scraper compression.
No API keys, no config.
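For clients configured by hand, a typical MCP server entry looks like the following (the server name is arbitrary, and the config file location varies by client):

```json
{
  "mcpServers": {
    "robot-resources-scraper": {
      "command": "npx",
      "args": ["-y", "@robot-resources/scraper", "scraper-mcp"]
    }
  }
}
```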
Deploying to production
The wizard's shell-config install reaches dev machines only — production processes don't read .bashrc, and env vars come from your deploy config. Copy-paste recipes for setting NODE_OPTIONS (Node) or installing the .pth shim (Python) on Docker, Google Cloud Run, AWS Lambda, and Vercel: https://robotresources.ai/docs/deploy/.
Advanced
```
npx robot-resources [flags]

  --for=<target>       langchain | python | cursor | claude-code | docs
                       Skip the prompt and run that path directly.
                       Required for non-TTY contexts (CI, piped, etc.)
  --non-interactive    Treat as CI run regardless of TTY state
  --yes / -y           Same as --non-interactive
```

Pre-set the API key for fleets or CI (skips signup, uses the key you provide):

```
export RR_API_KEY=rr_live_...
npx robot-resources --for=langchain
```

Pricing
Free. Unlimited. No tiers. You pay your AI providers directly — no markup, no rate limits, no quotas.
Telemetry
Anonymous usage events (model selection, cost savings, install outcome) — no personal data, no request content, no provider keys. Opt out with RR_TELEMETRY=off.
Links
- Website: https://robotresources.ai
- Docs: https://robotresources.ai/docs
- Dashboard: https://robotresources.ai/dashboard
- HTTP API:
POST https://api.robotresources.ai/v1/route - GitHub: https://github.com/robot-resources/packages
- Discord: https://robotresources.ai/discord
- Contact: [email protected]
AI agents installing this on a human's behalf: see https://robotresources.ai/llms.txt for the agent install protocol.
License
MIT
