scrapfly-mcp

v1.0.9

Published

Scrape any website, extract structured data, and collect web content at scale with AI agents


Scrapfly MCP Server


What is Scrapfly MCP?

The Scrapfly MCP Server connects your AI assistants to live web data through the Model Context Protocol. Transform your AI from being limited by training data to having real-time access to any website.

✨ What Your AI Can Do

| Capability | Description |
|------------|-------------|
| 🌐 Scrape Live Data | Pull current prices, listings, news, or any webpage content in real-time |
| 🛡️ Bypass Anti-Bot Systems | Automatically handle CAPTCHAs, proxies, JavaScript rendering, and rate limits |
| ⚡ Extract Structured Data | Parse complex websites into clean JSON using AI-powered extraction |
| 📸 Capture Screenshots | Take visual snapshots of pages or specific elements for analysis |

🏆 Why Scrapfly?

Built on battle-tested infrastructure used by thousands of developers.

📖 Learn more: Why Scrapfly MCP?


🚀 Quick Install

Click one of the buttons below to install the MCP server in your preferred IDE:

  • Install in VS Code
  • Install in VS Code Insiders
  • Install in Visual Studio
  • Install in Cursor


📦 Manual Installation

Standard Configuration

Works with most MCP-compatible tools:

{
  "servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}

Cloud Configuration (NPX)

For tools that require a local process:

{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://mcp.scrapfly.io/mcp"
      ]
    }
  }
}
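The two snippets differ only in shape: the standard form points an HTTP-capable client directly at the server, while the NPX form launches mcp-remote as a local proxy. As a sanity check, here is a small Python sketch that generates both variants (the key names and URL are taken from the snippets above; nothing else is assumed):

```python
import json

# Generate the two configuration shapes shown above.
MCP_URL = "https://mcp.scrapfly.io/mcp"

def standard_config() -> dict:
    """HTTP configuration for tools that speak MCP over HTTP directly."""
    return {"servers": {"scrapfly-cloud-mcp": {"type": "http", "url": MCP_URL}}}

def npx_config() -> dict:
    """Local-process configuration that proxies to the remote server via mcp-remote."""
    return {"mcpServers": {"scrapfly": {"command": "npx",
                                        "args": ["mcp-remote", MCP_URL]}}}

if __name__ == "__main__":
    print(json.dumps(standard_config(), indent=2))
    print(json.dumps(npx_config(), indent=2))
```
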

🔧 IDE-Specific Setup

VS Code

One-Click Install

Install in VS Code

Manual Install

Follow the VS Code MCP guide or use the CLI:

code --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'

After installation, Scrapfly tools will be available in GitHub Copilot Chat.

📖 Full guide: VS Code Integration

VS Code Insiders

One-Click Install

Install in VS Code Insiders

Manual Install

code-insiders --add-mcp '{"name":"scrapfly-cloud-mcp","type":"http","url":"https://mcp.scrapfly.io/mcp"}'

📖 Full guide: VS Code Integration

Visual Studio

One-Click Install

Install in Visual Studio

Manual Install

  1. Open Visual Studio
  2. Navigate to GitHub Copilot Chat window
  3. Click the tools icon (🛠️) in the chat toolbar
  4. Click + Add Server to open the configuration dialog
  5. Configure:
    • Server ID: scrapfly-cloud-mcp
    • Type: http/sse
    • URL: https://mcp.scrapfly.io/mcp
  6. Click Save

📖 Full guide: Visual Studio MCP documentation

Cursor

One-Click Install

Install in Cursor

Manual Install

  1. Go to Cursor Settings → MCP → Add new MCP Server
  2. Use the standard configuration above
  3. Click Edit to verify or add arguments

📖 Full guide: Cursor Integration

Claude Code

Use the Claude Code CLI:

claude mcp add scrapfly-cloud-mcp --url https://mcp.scrapfly.io/mcp

📖 Full guide: Claude Code Integration

Claude Desktop

Add to your Claude Desktop configuration file:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json

{
  "mcpServers": {
    "scrapfly": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.scrapfly.io/mcp"]
    }
  }
}
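If the config file already lists other MCP servers, pasting the snippet wholesale would overwrite them. A minimal sketch of a safer merge, assuming only the file locations and entry shape shown above (the demo writes to a temporary file, never to the real config):

```python
import json
import os
import platform
import tempfile

# The scrapfly entry from the snippet above.
SCRAPFLY_ENTRY = {"command": "npx", "args": ["mcp-remote", "https://mcp.scrapfly.io/mcp"]}

def default_config_path() -> str:
    """Resolve the platform-specific config path listed above."""
    if platform.system() == "Darwin":
        return os.path.expanduser(
            "~/Library/Application Support/Claude/claude_desktop_config.json")
    return os.path.join(os.environ.get("APPDATA", ""), "Claude",
                        "claude_desktop_config.json")

def add_scrapfly(config: dict) -> dict:
    """Return a copy of `config` with the scrapfly server added, keeping existing servers."""
    merged = dict(config)
    servers = dict(merged.get("mcpServers", {}))
    servers["scrapfly"] = SCRAPFLY_ENTRY
    merged["mcpServers"] = servers
    return merged

if __name__ == "__main__":
    existing = {"mcpServers": {"other": {"command": "other-server"}}}
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump(add_scrapfly(existing), f, indent=2)
    print("wrote", f.name)
```
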

📖 Full guide: Claude Desktop Integration

Cline

Add to your Cline MCP settings:

{
  "scrapfly-cloud-mcp": {
    "type": "http",
    "url": "https://mcp.scrapfly.io/mcp"
  }
}

📖 Full guide: Cline Integration

Windsurf

Follow the Windsurf MCP documentation using the standard configuration.

📖 Full guide: Windsurf Integration

Zed

Add to your Zed settings:

{
  "context_servers": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp"
    }
  }
}

📖 Full guide: Zed Integration

Codex

Create or edit ~/.codex/config.toml:

[mcp_servers.scrapfly-cloud-mcp]
url = "https://mcp.scrapfly.io/mcp"

📖 More info: Codex MCP documentation

Gemini CLI

Follow the Gemini CLI MCP guide using the standard configuration.

OpenCode

Add to ~/.config/opencode/opencode.json:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "scrapfly-cloud-mcp": {
      "type": "http",
      "url": "https://mcp.scrapfly.io/mcp",
      "enabled": true
    }
  }
}

📖 More info: OpenCode MCP documentation


🛠️ Available Tools

The Scrapfly MCP Server provides 5 powerful tools covering 99% of web scraping use cases:

| Tool | Description | Use Case |
|------|-------------|----------|
| scraping_instruction_enhanced | Get best practices & POW token | Always call first! |
| web_get_page | Quick page fetch with smart defaults | Simple scraping tasks |
| web_scrape | Full control with browser automation | Complex scraping, login flows |
| screenshot | Capture page screenshots | Visual analysis, monitoring |
| info_account | Check usage & quota | Account management |
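Under the hood, an MCP client invokes these tools with JSON-RPC 2.0 messages using the standard tools/call method. A sketch of what such a request looks like; note that the argument name "url" for web_get_page is an assumption for illustration, not taken from the Scrapfly tool schema:

```python
import json

def tool_call(request_id: int, tool: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 tools/call request as defined by the MCP spec."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

if __name__ == "__main__":
    # Hypothetical call corresponding to the Hacker News example below.
    msg = tool_call(1, "web_get_page", {"url": "https://news.ycombinator.com"})
    print(json.dumps(msg, indent=2))
```

Your IDE or AI assistant constructs these messages for you; the sketch only shows what crosses the wire to https://mcp.scrapfly.io/mcp.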

📖 Full reference: Tools & API Specification

Example: Scrape a Page

User: "What are the top posts on Hacker News right now?"

AI: Uses web_get_page to fetch https://news.ycombinator.com and returns current top stories

Example: Extract Structured Data

User: "Get all product prices from this Amazon page"

AI: Uses web_scrape with extraction_model="product_listing" to return structured JSON

📖 More examples: Real-World Examples


🔐 Authentication

Scrapfly MCP supports multiple authentication methods:

| Method | Best For | Documentation |
|--------|----------|---------------|
| OAuth2 | Production, multi-user apps | OAuth2 Setup |
| API Key | Personal use, development | API Key Setup |
| Header Auth | Custom integrations | Header Auth |

🔑 Get your API key: Scrapfly Dashboard


📊 Configuration Reference

| Setting | Value |
|---------|-------|
| Server Name | scrapfly-cloud-mcp |
| Type | Remote HTTP Server |
| URL | https://mcp.scrapfly.io/mcp |
| Protocol | MCP over HTTP/SSE |


🖥️ Self-Hosted / Local Deployment

You can run the Scrapfly MCP server locally or self-host it.

CLI Arguments

| Flag | Description |
|------|-------------|
| -http <address> | Start HTTP server at the specified address (e.g., :8080). Takes precedence over the PORT env var. |
| -apikey <key> | Use this API key instead of the SCRAPFLY_API_KEY environment variable. |

Environment Variables

| Variable | Description |
|----------|-------------|
| PORT | HTTP port to listen on. Used if the -http flag is not set. |
| SCRAPFLY_API_KEY | Default Scrapfly API key. Can also be passed via the query parameter ?apiKey=xxx at runtime. |

Examples

# Start HTTP server on port 8080
./scrapfly-mcp -http :8080

# Start HTTP server using PORT env var
PORT=8080 ./scrapfly-mcp

# Start with API key
./scrapfly-mcp -http :8080 -apikey scp-live-xxxx

# Start in stdio mode (for local MCP clients)
./scrapfly-mcp

Docker

# Build
docker build -t scrapfly-mcp .

# Run (Smithery compatible - uses PORT env var)
docker run -p 8080:8080 scrapfly-mcp

# Run with custom port
docker run -e PORT=9000 -p 9000:9000 scrapfly-mcp
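For longer-running deployments, the same container can be driven from a compose file. This is a hypothetical sketch based on the build and PORT conventions above, not a file shipped with the project:

```yaml
# docker-compose.yml (illustrative): builds the image from this repo and
# exposes it on port 8080, passing the API key through from the host env.
services:
  scrapfly-mcp:
    build: .
    environment:
      PORT: "8080"
      SCRAPFLY_API_KEY: "${SCRAPFLY_API_KEY}"
    ports:
      - "8080:8080"
```
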

🤝 Framework Integrations

Scrapfly MCP also works with AI frameworks and automation tools:

| Framework | Documentation |
|-----------|---------------|
| LangChain | LangChain Integration |
| LlamaIndex | LlamaIndex Integration |
| CrewAI | CrewAI Integration |
| OpenAI | OpenAI Integration |
| n8n | n8n Integration |
| Make | Make Integration |
| Zapier | Zapier Integration |

📖 All integrations: Integration Index


📚 Resources


💬 Need Help?