
@nickwatson/n8n-nodes-eddie-surf

v0.1.2

n8n community nodes for Eddie Surf web crawling and search

n8n-nodes-eddie-surf

Transform websites into structured JSON data with AI-powered crawling - no brittle selectors, no complex setup.

Why Eddie.surf?

Eddie.surf uses Claude Sonnet 4 to intelligently extract data from websites, handling dynamic content, pagination, and complex layouts that break traditional scrapers. Perfect for n8n workflows that need reliable web data.

What makes it different:

  • Smart Crawling: AI understands page content and structure, not just HTML
  • Structured Output: Get clean JSON data matching your exact schema
  • Workflow-Ready: Built-in webhooks and polling for seamless n8n integration
  • Cost-Effective: At $0.04-0.06 per page, often cheaper than building your own solution

Installation

npm install @nickwatson/n8n-nodes-eddie-surf

Quick Start

  1. Get API Key: Sign up at eddie.surf and grab your API key
  2. Add Credentials: In n8n, create new "Eddie Surf API" credentials
  3. Start Crawling: Drop the Eddie.surf node into your workflow

What You Can Build

Lead Research Workflows

  • Crawl prospect websites → Extract contact info → Add to CRM
  • Monitor competitor pricing → Alert on changes → Update strategy
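
For instance, the pricing-monitoring idea above could start from a crawl request like the one below. This is only a sketch: the schema fields are illustrative, and the parameter format follows the examples further down.

{
  "urls": "https://competitor.example.com/pricing",
  "context": {"purpose": "Competitor price monitoring"},
  "json": {
    "plan_name": {
      "type": "string",
      "description": "Name of the pricing plan"
    },
    "monthly_price": {
      "type": "string",
      "description": "Monthly price as shown on the page"
    }
  }
}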

Market Intelligence

  • Batch crawl industry sites → Analyze trends → Generate reports
  • Track product mentions → Sentiment analysis → Dashboard updates

AI Agent Data Sources

  • Web research → Structured data → Feed to language models
  • Real-time search → Context gathering → Enhanced AI responses

Available Operations

  • Crawl: Extract data from 1-199 URLs with custom JSON schemas
  • Batch Crawl: Process 200+ URLs efficiently with reduced costs
  • Smart Search: AI-powered web search with structured results
  • Get Status: Monitor job progress and retrieve results
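
Get Status is the smallest of these. As a rough sketch (the parameter name here is an assumption, not taken from the Eddie.surf docs), it takes the job identifier returned when a crawl is started and reports whether the job is still running or has results ready:

{
  "job_id": "job_abc123"
}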

Example Workflows

Basic Web Crawling

{
  "urls": "https://example.com, https://another-site.com",
  "context": {"purpose": "Company research"},
  "json": {
    "company_name": {
      "type": "string", 
      "description": "Company name"
    },
    "email": {
      "type": "string",
      "description": "Contact email"
    }
  }
}
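
For each URL, the extracted data then mirrors the schema you described. Something along these lines is what you could expect back per page; this is a sketch only, and the real response may wrap it in additional job and status metadata:

{
  "company_name": "Example Inc.",
  "email": "hello@example.com"
}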

Smart Search

{
  "query": "project management software",
  "context": {"intent": "find_businesses"},
  "max_results": 25,
  "website_only": true,
  "skip_duplicate_domains": true
}

Async Workflow Integration

Eddie.surf is built for long-running workflows. Start a crawl job with a webhook callback, then continue your n8n workflow when data is ready:

  1. Start Crawl → Set callback_url to your n8n webhook
  2. Process in Background → Eddie.surf crawls and extracts data
  3. Webhook Triggers → Receive structured results, continue workflow

Perfect for large-scale data extraction that shouldn't block your workflow execution.
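
Concretely, that means adding callback_url to the crawl request and pointing it at an n8n Webhook node. A sketch, with a placeholder webhook URL:

{
  "urls": "https://example.com",
  "callback_url": "https://your-n8n-instance.example.com/webhook/eddie-surf-results",
  "context": {"purpose": "Company research"},
  "json": {
    "company_name": {
      "type": "string",
      "description": "Company name"
    }
  }
}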

Why This Beats DIY Scraping

| Traditional Scraping | Eddie.surf |
|----------------------|------------|
| Write brittle CSS selectors | Describe what you want in plain English |
| Handle pagination manually | AI automatically follows links |
| Break on site updates | Adapts to layout changes |
| Complex proxy management | Built-in IP handling |

License

MIT - see LICENSE.md for details.


Ready to surf the web for data? Install the node and start building smarter workflows with reliable web extraction. 🏄‍♂️