
@openrol/maps-scrap

v0.1.0

Published

Powerful Maps scraping library and CLI built with Playwright

@openrol/maps-scrap

Maps scraping library and CLI built on Playwright.

It is designed to be usable in two ways:

  • as a small library with a clean scrapeMaps() API
  • as a CLI for quick dataset export to json, jsonl, or csv

Installation

npm install @openrol/maps-scrap playwright
npx playwright install chromium

Library Usage

import { scrapeMaps } from "@openrol/maps-scrap";

const result = await scrapeMaps({
  query: "coffee shops in Jakarta",
  limit: 25,
  headless: true,
});

console.log(result.collected);
console.log(result.places[0]);

Reusable scraper instance

import { createMapsScraper } from "@openrol/maps-scrap";

const scraper = createMapsScraper({
  headless: true,
  language: "en",
  maxScrolls: 80,
});

const restaurants = await scraper.scrape({
  query: "turkish restaurants in toronto",
  limit: 20,
});

Export scraped places

import { scrapeMaps, writePlaces } from "@openrol/maps-scrap";

const result = await scrapeMaps({
  query: "barber shops in Surabaya",
  limit: 50,
});

await writePlaces(result.places, {
  filePath: "./data/barbers.csv",
});

CLI Usage

npx maps-scrap --query "coffee shops in Jakarta" --limit 25 --output ./coffee.json

Common examples

npx maps-scrap --query "gyms in Surabaya" --limit 100 --output ./gyms.csv
npx maps-scrap --query "nail salons in Bali" --limit 50 --output ./nails.jsonl --append
npx maps-scrap --query "restaurants in Bandung" --limit 20 --headed --json

API

scrapeMaps(options)

Main function for scraping map listings.

Important options:

  • query: search phrase to run in the maps search UI
  • limit: number of places to collect, default 20
  • headless: run the browser without a visible window, default true
  • language: locale hint for Maps UI, default en
  • country: optional country hint
  • maxScrolls: maximum listing feed scroll passes, default 50
  • scrollDelayMs: delay between feed scrolls
  • listingDelayMs: delay after opening a place detail panel
  • navigationTimeoutMs: initial page/search timeout
  • detailTimeoutMs: place detail timeout
  • dedupe: remove duplicate listings by name + address, default true
  • logger: pass your own logger object
  • onProgress: receive progress callbacks while scraping
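
The dedupe option described above (remove duplicates by name + address) could be sketched roughly as follows. This is an illustration of the idea, not the package's actual implementation; the PlaceLike type and dedupePlaces helper are made up for this example.

```typescript
// Sketch of name + address deduplication. Listings whose normalized
// name/address pair has already been seen are dropped; the first
// occurrence wins and input order is preserved.
type PlaceLike = { name: string; address: string };

function dedupePlaces<T extends PlaceLike>(places: T[]): T[] {
  const seen = new Set<string>();
  return places.filter((place) => {
    const key = `${place.name.trim().toLowerCase()}|${place.address.trim().toLowerCase()}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```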

Returned shape:

type MapsScrapeResult = {
  query: string;
  requested: number;
  found: number;
  collected: number;
  durationMs: number;
  places: Place[];
};

writePlaces(places, options)

Writes scraped places to disk.

  • filePath: output location
  • format: optional json | jsonl | csv
  • append: supported for json and jsonl

If format is omitted, it is inferred from the file extension.
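
Extension-based inference like that could look roughly like this. A sketch only; inferFormat is a hypothetical name and the real writePlaces may handle edge cases differently.

```typescript
// Infer the output format from the file extension, as described above.
type OutputFormat = "json" | "jsonl" | "csv";

function inferFormat(filePath: string): OutputFormat {
  const ext = filePath.split(".").pop()?.toLowerCase();
  switch (ext) {
    case "json":
      return "json";
    case "jsonl":
      return "jsonl";
    case "csv":
      return "csv";
    default:
      throw new Error(`Cannot infer format from extension: ${filePath}`);
  }
}
```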

Extracted fields

Each Place contains:

  • name
  • address
  • website
  • phone_number
  • reviews_count
  • reviews_average
  • store_shopping
  • in_store_pickup
  • store_delivery
  • place_type
  • introduction
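
The field list above corresponds to a record shape along these lines. The types and optionality here are assumptions for illustration; consult the package's own typings for the authoritative definition.

```typescript
// Assumed shape of a scraped Place, based on the field list above.
// Fields other than name are treated as nullable since not every
// listing exposes every detail.
type Place = {
  name: string;
  address: string | null;
  website: string | null;
  phone_number: string | null;
  reviews_count: number | null;
  reviews_average: number | null;
  store_shopping: boolean | null;
  in_store_pickup: boolean | null;
  store_delivery: boolean | null;
  place_type: string | null;
  introduction: string | null;
};
```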

Publishing

Build the package before publishing:

bun run build
npm publish --access public

Notes

  • Target site markup changes over time. Selector maintenance is part of owning this package.
  • Use reasonable scrape volume and pacing to reduce blocking or rate limiting.
  • csv export writes only the current run's output; append mode is supported only for json and jsonl.

License

MIT