
@crawlbase/mcp

v1.2.0

Published

MCP server for Crawlbase API - enables web scraping through Model Context Protocol

Downloads

103

Readme

What is Crawlbase MCP?

Crawlbase MCP is a Model Context Protocol (MCP) server that bridges AI agents and the live web. Instead of relying on outdated training data, your LLMs can now fetch fresh, structured, real-time content — powered by Crawlbase’s proven crawling infrastructure trusted by 70,000+ developers worldwide.

It handles the complexity of scraping for you:

  • JavaScript rendering for modern web apps
  • Proxy rotation & anti-bot evasion
  • Structured outputs (HTML, Markdown, screenshots)

How It Works

  • Get Free Crawlbase Tokens → Sign up at Crawlbase ↗️ and get free Normal and JavaScript tokens.
  • Set Up MCP Configuration → Configure the MCP server in your preferred client (Claude, Cursor, or Windsurf) by updating the MCP Servers settings.
  • Start Crawling → Use commands like crawl, crawl_markdown, or crawl_screenshot to bring live web data into your AI agent.

Setup & Integration

Claude Desktop

  1. Open Claude Desktop → Settings → Developer → Edit Config
  2. Add to claude_desktop_config.json:

{
  "mcpServers": {
    "crawlbase": {
      "type": "stdio",
      "command": "npx",
      "args": ["@crawlbase/mcp@latest"],
      "env": {
        "CRAWLBASE_TOKEN": "your_token_here",
        "CRAWLBASE_JS_TOKEN": "your_js_token_here"
      }
    }
  }
}

  3. Replace your_token_here and your_js_token_here with the tokens from your dashboard.

Claude Code

Add to your claude.json configuration:

{
  "mcpServers": {
    "crawlbase": {
      "type": "stdio",
      "command": "npx",
      "args": ["@crawlbase/mcp@latest"],
      "env": {
        "CRAWLBASE_TOKEN": "your_token_here",
        "CRAWLBASE_JS_TOKEN": "your_js_token_here"
      }
    }
  }
}

Cursor IDE

  1. Open Cursor IDE → File → Preferences → Cursor Settings → Tools and Integrations → Add Custom MCP
  2. Add to mcp.json:

{
  "mcpServers": {
    "crawlbase": {
      "type": "stdio",
      "command": "npx",
      "args": ["@crawlbase/mcp@latest"],
      "env": {
        "CRAWLBASE_TOKEN": "your_token_here",
        "CRAWLBASE_JS_TOKEN": "your_js_token_here"
      }
    }
  }
}

  3. Replace your_token_here and your_js_token_here with the tokens from your dashboard.

Windsurf IDE

  1. Open Windsurf IDE → File → Preferences → Windsurf Settings → General → MCP Servers → Manage MCPs → View raw config
  2. Add to mcp_config.json:

{
  "mcpServers": {
    "crawlbase": {
      "type": "stdio",
      "command": "npx",
      "args": ["@crawlbase/mcp@latest"],
      "env": {
        "CRAWLBASE_TOKEN": "your_token_here",
        "CRAWLBASE_JS_TOKEN": "your_js_token_here"
      }
    }
  }
}

  3. Replace your_token_here and your_js_token_here with the tokens from your dashboard.

HTTP Transport Mode

For scenarios where you need a shared MCP server accessible over HTTP (e.g., multi-user environments, custom integrations), you can run the server in HTTP mode:

# Clone and install
git clone https://github.com/crawlbase/crawlbase-mcp.git
cd crawlbase-mcp
npm install

# Start HTTP server with tokens (default port: 3000)
CRAWLBASE_TOKEN=your_token CRAWLBASE_JS_TOKEN=your_js_token npm run start:http

# Or with custom port
CRAWLBASE_TOKEN=your_token CRAWLBASE_JS_TOKEN=your_js_token MCP_PORT=8080 npm run start:http

The server exposes:

  • POST /mcp - MCP Streamable HTTP endpoint
  • GET /health - Health check endpoint
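The health endpoint can be polled to confirm the server is up before wiring in a client. A minimal sketch in Python, assuming only what is listed above (the `/health` path); the response body's shape is not documented here, so this simply returns whatever JSON the server sends:

```python
import json
import urllib.request

# Minimal liveness probe for the HTTP transport. GET /health is the
# endpoint listed above; the response schema is an assumption, so we
# just parse and return whatever JSON comes back (non-2xx raises).
def check_health(base_url: str) -> dict:
    with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
        return json.loads(resp.read().decode("utf-8"))

# check_health("http://localhost:3000")  # once the server is running
```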

Per-Request Token Authentication

HTTP mode supports per-request tokens via headers, allowing multiple users to share a single server:

curl -X POST http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "X-Crawlbase-Token: your_token" \
  -H "X-Crawlbase-JS-Token: your_js_token" \
  -d '{"jsonrpc": "2.0", "method": "tools/list", "id": 1}'

Headers:

  • X-Crawlbase-Token - Normal token for HTML requests
  • X-Crawlbase-JS-Token - JavaScript token for JS-rendered pages/screenshots

Headers override environment variables when provided, enabling multi-tenant deployments.
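The same per-request-token call can be built from any HTTP client. A sketch in Python using only the standard library, mirroring the curl example above; the header names come from this README, and the host/port are whatever you started the server with:

```python
import json
import urllib.request

# Build the same JSON-RPC tools/list request as the curl example,
# with per-request tokens supplied via headers rather than env vars.
def build_mcp_request(base_url: str, token: str, js_token: str) -> urllib.request.Request:
    payload = {"jsonrpc": "2.0", "method": "tools/list", "id": 1}
    return urllib.request.Request(
        url=f"{base_url}/mcp",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
            "X-Crawlbase-Token": token,
            "X-Crawlbase-JS-Token": js_token,
        },
        method="POST",
    )

req = build_mcp_request("http://localhost:3000", "your_token", "your_js_token")
# urllib.request.urlopen(req) sends it once the server is running.
```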

🔑 Get your free tokens at Crawlbase ↗️.

Usage

Once configured, use these commands inside Claude, Cursor, or Windsurf:

  • crawl → Fetch raw HTML
  • crawl_markdown → Extract clean Markdown
  • crawl_screenshot → Capture screenshots

Example prompts:

  • “Crawl Hacker News and return top stories in markdown.”
  • “Take a screenshot of TechCrunch homepage.”
  • “Fetch Tesla investor relations page as HTML.”
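In HTTP mode the same tools can be invoked directly with a JSON-RPC `tools/call` request. A sketch of the payload: the tool name comes from this README and the `tools/call` method and `name`/`arguments` shape follow the MCP convention, but the `url` argument name is an assumption here — check the server's `tools/list` response for each tool's actual input schema:

```python
import json

# Hypothetical tools/call payload for the crawl_markdown tool.
# The "url" argument name is assumed; verify it against tools/list.
payload = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "crawl_markdown",
        "arguments": {"url": "https://news.ycombinator.com"},
    },
}
body = json.dumps(payload)  # POST this to the /mcp endpoint
```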

Use Cases

  • Market research → Pull live data from competitors, news, and reports
  • E-commerce monitoring → Track products, reviews, and prices in real time
  • News & finance feeds → Keep AI agents up-to-date with live events
  • Autonomous AI agents → Give them vision to act on fresh web data

Resources & Next Steps

Looking to supercharge your AI agents with live web data? Sign up at Crawlbase ↗️ for free tokens and add the server to your MCP client to get started.



Copyright 2025 Crawlbase