
rcrawler

v1.0.6


RCrawler

A Node.js Minecraft server crawler that connects to offline-mode servers, captures raw chunk data packets, stores them as .bin files, and converts them into Anvil (.mca) region files. Supports both CLI usage and a programmatic, library-first API.



Quick Example

Get started in seconds with a minimal crawler:

```js
const createCrawler = require('rcrawler');

const bot = createCrawler({
  host: '127.0.0.1',
  username: 'CrawlerBot',
  protocol: 774 // Minecraft 1.21.11
});

bot.on('chunk', (chunk) => {
  console.log(`Captured chunk at [${chunk.x}, ${chunk.z}]`);
});

bot.start();
```

Installation

Install globally for CLI usage or locally for your project:

```bash
# Global installation for CLI
npm install -g rcrawler

# Local installation for library use
npm install rcrawler
```

Programmatic API

The library provides an EventEmitter-based API similar to mineflayer's, allowing granular control over the crawling process.

```js
const createCrawler = require('rcrawler');

const bot = createCrawler({
  host: '127.0.0.1',
  port: 25565,
  username: 'ChunkGetter',
  protocol: 774,

  // Storage settings
  outputDir: 'output', // Where raw .bin chunks go
  worldDir: 'world',   // Where converted .mca files go

  // Behavior settings
  viewDistance: 64,
  connectTimeout: 10,
  readTimeout: 180,
  maxCaptureSeconds: 300,

  logLevel: 'INFO'
});

// Event listeners
bot.on('connect', () => console.log('[+] Connected to server'));
bot.on('chunk', (chunk) => console.log(`[+] Received: X=${chunk.x}, Z=${chunk.z}`));
bot.on('error', (err) => console.error('[!] Error:', err.message));

bot.on('end', async () => {
  console.log('[+] Crawl finished. Starting conversion...');

  // Optional: Convert captured data to Anvil format immediately
  await bot.convertChunks();

  console.log('[+] World export complete.');
});

// Execute
bot.start();
```

Options

The createCrawler(options) function accepts the following configuration:

| Option | Type | Default | Description |
| :--- | :--- | :--- | :--- |
| host | string | '127.0.0.1' | Server IP address |
| port | number | 25565 | Server port |
| username | string | 'Crawler' | Bot username (offline mode) |
| protocol | number | 774 | Minecraft protocol version |
| outputDir | string | 'output' | Path to store raw chunk binary files |
| worldDir | string | 'world' | Path to store exported Anvil region files |
| viewDistance | number | 127 | Simulated render distance |
| connectTimeout | number | 10.0 | Connection timeout in seconds |
| readTimeout | number | 180.0 | Socket read timeout in seconds |
| maxCaptureSeconds | number | 180.0 | Auto-stop after this duration |
| logLevel | string | 'INFO' | Console verbosity (DEBUG, INFO, WARN, ERROR) |


Events

The Crawler instance emits the following events:

  • connect: Emitted when the TCP connection is established.
  • chunk: Emitted every time a chunk packet is successfully parsed and saved. Receives a chunk object with x and z coordinates.
  • error: Emitted on connection failures or protocol errors.
  • end: Emitted when the crawler finishes (due to timeout, server disconnect, or manual stop).

CLI Usage

rcrawler comes with a powerful command-line interface for manual operations.

Capture Chunks

Connect to a server and start capturing data:

```bash
rcrawler --crawl --ip 127.0.0.1 --username Getter --protocol 772
```

Convert Chunks

Convert your binary captures into Anvil (.mca) region files:

```bash
rcrawler --convert-chunks --chunks-dir output/chunks --world-dir world
```

Generate Protocol Assets

If you need to support a new Minecraft version, you can generate the required protocol assets:

```bash
rcrawler --generate-protocol-assets --minecraft-version 1.20.1 --protocol 763
```

Technical Reference

Supported Protocols

rcrawler uses dynamic protocol mapping. Currently tested and supported:

  • 774: Minecraft 1.21.11
  • 773: Minecraft 1.21.9 / 1.21.10
  • 772: Minecraft 1.21.7 / 1.21.8
  • 771: Minecraft 1.21.6
  • 770: Minecraft 1.21.5
  • 763: Minecraft 1.20.1
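
The mapping above can be captured in a small helper for picking the protocol option by Minecraft version. This lookup table is not part of the rcrawler API, just a sketch mirroring the list:

```javascript
// Hypothetical convenience map, mirroring the supported-protocols list.
// Not part of rcrawler itself.
const PROTOCOLS = {
  '1.21.11': 774,
  '1.21.10': 773,
  '1.21.9': 773,
  '1.21.8': 772,
  '1.21.7': 772,
  '1.21.6': 771,
  '1.21.5': 770,
  '1.20.1': 763,
};

function protocolFor(version) {
  const protocol = PROTOCOLS[version];
  if (protocol === undefined) {
    throw new Error(`Unsupported Minecraft version: ${version}`);
  }
  return protocol;
}
```

For example, `protocolFor('1.20.1')` returns `763`, which could then be passed as the `protocol` option to `createCrawler`.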

Project Structure

  • src/index.js: API entry point
  • src/crawler.js: Main lifecycle and event management
  • src/client.js: Low-level protocol state machine
  • src/chunk_handler.js: Packet parsing and storage logic
  • src/convert.js: World export implementation
  • src/anvil.js: NBT rebuilding and MCA writing

Capture File Format (.bin)

Each chunk is saved in a custom container optimized for speed:

  1. Magic Header: MCCAP001
  2. Length Prefixes: Three 32-bit BE integers for metadata and payload sizes.
  3. JSON Metadata: Coordinates, timestamp, version info, and heightmaps.
  4. Payload: Raw level_chunk_with_light packet data.
  5. Section Data: Sliced block-state and biome containers.

Limitations

  • Offline Mode Only: The client does not currently support Mojang/Microsoft authentication.
  • Biomes: Biome IDs are captured, but exported as minecraft:plains by default unless a custom biome registry mapping is provided.

Contributing

This is a fully open-source project and anyone is welcome to contribute! Whether it's fixing a bug, adding a new protocol version, or improving documentation, your help is appreciated.

How to Contribute

  1. Fork the repository.
  2. Clone your fork:
    git clone https://github.com/rikgdi/RCrawler-JS.git
  3. Create a branch for your changes:
    git checkout -b feature/AmazingFeature
  4. Commit your changes:
    git commit -m 'Add some AmazingFeature'
  5. Push to the branch:
    git push origin feature/AmazingFeature
  6. Open a Pull Request and describe your changes.

License

This project is licensed under the GPL-3.0 License. See the LICENSE file for details.