searxng-crawl4ai-mcp

v1.0.0

Self-hosted MCP server for web search (SearXNG) and scraping (Crawl4AI)

SearXNG + Crawl4AI MCP Server

A self-hosted Model Context Protocol (MCP) server that provides powerful web search and scraping capabilities through SearXNG (metasearch engine) and Crawl4AI (advanced web scraper).

Features

  • 🔍 Web Search: Aggregate results from 70+ search engines via SearXNG
  • 🕷️ Web Scraping: Advanced scraping with Playwright browser automation
  • 🚀 Fast Performance: 3x faster than native Claude Code tools
  • 🏠 Fully Self-Hosted: No API limits, rate limits, or external dependencies
  • 🔄 Flexible Deployment: Run locally or connect to remote server
  • 📦 Easy Installation: NPM package with simple configuration

Quick Start

Prerequisites

  • Node.js 18+
  • Docker and Docker Compose
  • Claude Code or any MCP-compatible client

Installation

Option 1: Local Development

# Clone repository
git clone https://github.com/yourusername/searxng-crawl4ai-mcp.git
cd searxng-crawl4ai-mcp

# Install dependencies
npm install

# Start Docker services
docker compose up -d

# Services will be available at:
# - SearXNG: http://localhost:8081
# - Crawl4AI: http://localhost:8001

Option 2: NPM Package (Coming Soon)

# Install globally
npm install -g searxng-crawl4ai-mcp

# Or use with npx
npx searxng-crawl4ai-mcp

Configure Claude Code

Add to your .mcp.json:

{
  "mcpServers": {
    "searxng-crawl4ai": {
      "command": "node",
      "args": ["mcp-server.js"],
      "cwd": "/path/to/searxng-crawl4ai-mcp",
      "env": {
        "SEARXNG_BASE_URL": "http://localhost"
      }
    }
  }
}

For remote server setup, change SEARXNG_BASE_URL:

"env": {
  "SEARXNG_BASE_URL": "http://192.168.1.100"
}

Available Tools

search_web

Search the web using SearXNG metasearch engine.

{
  "query": "latest AI developments",
  "maxResults": 10
}
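Under the hood, a request like this maps onto SearXNG's JSON search API (the same `/search?q=...&format=json` endpoint used by the curl test in the Testing section). A minimal sketch of that mapping, assuming the default local setup; `buildSearchUrl` is a hypothetical helper:

```javascript
// Sketch: mapping search_web parameters onto SearXNG's JSON search API.
// buildSearchUrl is a hypothetical helper shown for illustration only.
function buildSearchUrl(params, searxngBase = "http://localhost:8081") {
  const url = new URL("/search", searxngBase);
  url.searchParams.set("q", params.query);      // free-text query
  url.searchParams.set("format", "json");       // ask for JSON, not HTML
  return url.toString();
}

buildSearchUrl({ query: "latest AI developments", maxResults: 10 });
// "http://localhost:8081/search?q=latest+AI+developments&format=json"
```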

crawl4ai_scrape

Scrape content from a single URL.

{
  "url": "https://example.com",
  "formats": ["markdown", "html"]
}

search_and_scrape

Search the web and automatically scrape the top results in a single call (the most powerful of the three tools).

{
  "query": "Bitcoin price analysis",
  "maxResults": 3
}
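Conceptually, this tool runs a search first and then scrapes the top hits. The step in between — choosing which result URLs to scrape — can be sketched as a small pure function; `pickUrlsToScrape` is a hypothetical helper, and it assumes SearXNG-style result objects with a `url` field:

```javascript
// Sketch: selecting which search results to feed into the scraper.
// pickUrlsToScrape is a hypothetical helper; it assumes result objects
// with a `url` field and caps the list at maxResults.
function pickUrlsToScrape(searchResults, maxResults = 3) {
  return searchResults
    .filter((r) => r.url)        // drop results without a usable URL
    .slice(0, maxResults)        // keep only the top N
    .map((r) => r.url);
}

const results = [
  { title: "A", url: "https://a.example" },
  { title: "B", url: "https://b.example" },
  { title: "C" }, // no URL, skipped
  { title: "D", url: "https://d.example" },
];
pickUrlsToScrape(results, 3);
// ["https://a.example", "https://b.example", "https://d.example"]
```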

Environment Variables

Copy .env.example to .env:

# Base URL for services (default: http://localhost)
# For remote server: http://YOUR_SERVER_IP
SEARXNG_BASE_URL=http://localhost

# Optional: Proxy for scraping
PROXY_URL=http://username:[email protected]:port

Docker Services

# Start services
npm run docker:up

# Stop services
npm run docker:down

# View logs
npm run docker:logs

Performance Comparison

| Metric | Native Claude Tools | SearXNG+Crawl4AI MCP |
|--------|---------------------|----------------------|
| Search Speed | 2500-3000ms | 935ms |
| Scraping Success | 0% (WebFetch hangs) | 100% |
| Content Extracted | 0 words | 29,807 words |
| API Limits | Yes | None |

Architecture

┌─────────────┐     ┌─────────────┐     ┌──────────────┐
│ Claude Code │────▶│ MCP Server  │────▶│ Docker Stack │
│   (Client)  │     │ (Node.js)   │     │  - SearXNG   │
└─────────────┘     └─────────────┘     │  - Crawl4AI  │
                                        │  - Redis     │
                                        └──────────────┘

Deployment Options

  1. Local: Everything runs on your machine
  2. Remote Server: Docker services on server, MCP client connects remotely
  3. Shared Infrastructure: One server supports multiple users

Testing

# Test MCP functionality
npm test

# Test search
node test-mcp-simple.js

# Test services directly
curl "http://localhost:8081/search?q=test&format=json"

Troubleshooting

MCP server not responding

  • Verify SEARXNG_BASE_URL is correct
  • Check services: docker compose ps
  • For remote: ensure ports 8081 and 8001 are accessible

Crawl4AI browser errors

  • Check logs: docker compose logs crawl4ai
  • Playwright should be auto-installed in container

Search returning no results

  • Verify SearXNG: curl "http://localhost:8081/search?q=test&format=json" (quote the URL so the shell doesn't treat & as a background operator)
  • Check searxng-settings.yml configuration

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT License - See LICENSE file for details.

Support

For issues or questions:

  • Open an issue on GitHub
  • Check Docker logs: docker compose logs [service]
  • Review CLAUDE.md for detailed documentation