
n8n-nodes-nextrows v0.1.2

n8n node to run NextRows web crawling apps and retrieve structured data

Downloads: 205

n8n-nodes-nextrows

This is an n8n community node. It lets you use NextRows in your n8n workflows.

NextRows is a web crawling service that runs pre-configured crawling apps and returns structured data. Create or discover crawling apps on the NextRows platform, then execute them via this node to retrieve scraped data in JSON format.

n8n is a fair-code licensed workflow automation platform.

Installation
Operations
Credentials
Compatibility
Usage
Resources
Version history

Installation

Follow the installation guide in the n8n community nodes documentation.

Quick Install

In your n8n instance:

  1. Go to Settings > Community Nodes
  2. Select Install
  3. Enter n8n-nodes-nextrows
  4. Select Install

Operations

Run App (JSON)

Executes a published NextRows crawling app and returns the results as structured JSON data.

Parameters:

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| App ID | String | Yes | The ID of the NextRows app to run |
| Inputs | Collection | No | Key-value pairs for app input parameters |

Output:

Each row from the crawled data is returned as a separate n8n item, allowing you to process each result individually in subsequent nodes.
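For illustration, the fan-out described above can be sketched as follows. This is a hypothetical snippet, not the node's actual source: it assumes the NextRows app returns a JSON array of row objects, and the field names are made up.

```javascript
// Hypothetical sketch: turn a crawl response (an array of rows)
// into the per-row item shape n8n nodes emit ({ json: ... } per item).
function rowsToItems(rows) {
  return rows.map((row) => ({ json: row }));
}

// Example: a two-row crawl result becomes two separate n8n items,
// so each row can be processed individually downstream.
const response = [
  { title: 'Post A', url: 'https://example.com/a' },
  { title: 'Post B', url: 'https://example.com/b' },
];
const items = rowsToItems(response);
console.log(items.length); // 2
```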

Credentials

To use this node, you need a NextRows API key.

Getting Your API Key

  1. Sign up or log in at NextRows
  2. Navigate to Dashboard
  3. Copy your API key

Setting Up Credentials in n8n

  1. In n8n, go to Credentials
  2. Select Add Credential
  3. Search for NextRows API
  4. Paste your API key
  5. Save

Compatibility

  • Minimum n8n version: 1.0.0
  • Tested with: Latest n8n version

Usage

Basic Example

  1. Add the NextRows node to your workflow
  2. Configure your NextRows API credentials
  3. Enter the App ID of the crawling app you want to run
  4. (Optional) Add input parameters if the app requires them
  5. Execute the node

Finding Apps

Browse available crawling apps at the NextRows Marketplace.

Input Parameters

Many NextRows apps accept input parameters to customize the crawl. Common inputs include:

  • max-items - Maximum number of items to return
  • url - Target URL to crawl
  • Custom parameters defined by the specific app

Input values support:

  • Strings: "hello"
  • Numbers: 10 (enter without quotes)
  • Booleans: true or false
  • n8n expressions: {{ $json.myValue }}
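As a rough sketch of the typing rules listed above, the following shows one way string values entered in the UI could be coerced to numbers and booleans. This is an assumption for illustration only, not how the node is actually implemented.

```javascript
// Hypothetical coercion of UI-entered strings to the input types
// listed above: booleans, numbers without quotes, else plain strings.
function coerceInput(value) {
  if (value === 'true') return true;
  if (value === 'false') return false;
  const n = Number(value);
  if (value.trim() !== '' && !Number.isNaN(n)) return n;
  return value; // anything else stays a plain string
}

console.log(coerceInput('10'));    // 10 (number)
console.log(coerceInput('true'));  // true (boolean)
console.log(coerceInput('hello')); // "hello" (string)
```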

Handling Long-Running Crawls

Web crawling can take anywhere from 10 seconds to 3 minutes depending on the app and target website. The node is configured with a 10-minute timeout to accommodate long-running operations.

Example Workflow

[Trigger] → [NextRows] → [Process Items] → [Output]
  1. Trigger: Start your workflow (manual, schedule, webhook, etc.)
  2. NextRows: Run a crawling app to extract data
  3. Process Items: Transform or filter the extracted data
  4. Output: Save to database, send to API, etc.
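The "Process Items" step above might look like this in an n8n Code node, where `$input.all()` returns the items emitted by the NextRows node. The field names (`title`, `price`) and the filter condition are hypothetical, standing in for whatever your crawling app returns.

```javascript
// Hypothetical "Process Items" step: filter the rows returned by the
// NextRows node. In an n8n Code node, `items` would come from
// $input.all(); here it is inlined so the sketch is self-contained.
const items = [
  { json: { title: 'Post A', price: 12 } },
  { json: { title: 'Post B', price: 55 } },
  { json: { title: 'Post C', price: 30 } },
];

// Keep only rows under a price threshold before the Output step.
const cheap = items.filter((item) => item.json.price < 40);
console.log(cheap.map((item) => item.json.title)); // [ 'Post A', 'Post C' ]
```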

Resources

Version history

0.1.0

  • Initial release
  • Support for Run App (JSON) endpoint
  • Dynamic input parameters