
🗺️ GMR-Scraper


Modern, production-ready Google Maps review scraper

Built with TypeScript • Powered by Bun • Feature-rich CLI

Features • Installation • Quick Start • Documentation • Examples


✨ Features

Core Capabilities

  • 🎯 Scrape Google Maps Reviews - Extract reviews from any Google Maps location
  • 🔄 Smart Pagination - Automatic pagination with configurable limits
  • 🌐 Multi-Language Support - Scrape reviews in any language
  • 🔍 Advanced Filtering - Filter by rating, text, images, keywords
  • 📊 Built-in Analytics - Calculate ratings, distributions, and insights

Production-Ready

  • Retry Logic - Exponential backoff with configurable attempts (see the sketch after this list)
  • 💾 Caching Layer - Optional in-memory cache with TTL
  • 🚦 Rate Limiting - Token bucket algorithm to prevent API blocks
  • ⏱️ Timeout Control - Configurable request timeouts
  • 🎨 Progress Callbacks - Real-time scraping progress updates
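
The retry bullet above refers to the standard exponential backoff pattern. As a rough illustration only (not the package's internal code), a backoff loop with configurable attempts and an initial delay looks roughly like this:

// Illustrative only: generic exponential backoff, not this package's actual implementation.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 5,     // mirrors retry.maxAttempts
  initialDelay = 2000, // mirrors retry.initialDelay (ms)
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // The wait doubles after each failed attempt: 2s, 4s, 8s, ...
      const delay = initialDelay * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}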

Advanced Features

  • 📦 Batch Processing - Scrape multiple locations concurrently
  • 🌊 Streaming API - Memory-efficient processing for large datasets
  • 🖥️ CLI Tool - Command-line interface with multiple output formats
  • 🔧 Full TypeScript - Complete type safety and IntelliSense support
  • 🎯 Custom Errors - Detailed error classes for better debugging

📦 Installation

Using npm

npm install @galihvsx/gmr-scraper

Using Bun

bun add @galihvsx/gmr-scraper

Using Yarn

yarn add @galihvsx/gmr-scraper

CLI (Global)

npm install -g @galihvsx/gmr-scraper

🚀 Quick Start

Basic Usage

import { scraper } from "@galihvsx/gmr-scraper";

const reviews = await scraper("https://www.google.com/maps/place/...", {
  sort_type: "newest",
  pages: 5,
  clean: true,
  lang: "en",
});

console.log(`Found ${reviews.length} reviews`);

With Advanced Features

import { scraper } from "@galihvsx/gmr-scraper";

const reviews = await scraper(url, {
  sort_type: "newest",
  pages: 10,
  clean: true,

  cache: {
    enabled: true,
    ttl: 300000,
  },

  retry: {
    maxAttempts: 5,
    initialDelay: 2000,
  },

  rateLimit: {
    requestsPerSecond: 2,
  },

  onProgress: (current, total) => {
    console.log(`Scraping page ${current}/${total}`);
  },
});
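
The library advertises custom error classes (see Features), but their names aren't listed in this README. The sketch below therefore catches failures generically; any custom class still extends Error, so its name and message are available for debugging:

import { scraper } from "@galihvsx/gmr-scraper";

const url = "https://www.google.com/maps/place/...";

try {
  const reviews = await scraper(url, { pages: 5, clean: true, timeout: 30000 });
  console.log(`Found ${reviews.length} reviews`);
} catch (err) {
  // Custom error classes extend Error, so name/message identify the failure type.
  if (err instanceof Error) {
    console.error(`${err.name}: ${err.message}`);
  }
}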

CLI Usage

gmr-scraper scrape "https://www.google.com/maps/place/..." \
  --sort newest \
  --pages 5 \
  --clean \
  --output table

📖 Documentation

API Options

| Option         | Type                                                            | Default      | Description                            |
| -------------- | --------------------------------------------------------------- | ------------ | -------------------------------------- |
| sort_type      | 'relevant' \| 'newest' \| 'highest_rating' \| 'lowest_rating'   | 'relevant'   | Sort order for reviews                 |
| pages          | number \| 'max'                                                  | 'max'        | Number of pages to scrape              |
| search_query   | string                                                           | ''           | Filter reviews by text                 |
| clean          | boolean                                                          | false        | Return parsed objects vs raw data      |
| lang           | string                                                           | 'en'         | Language code (e.g., 'en', 'id', 'es') |
| cache          | CacheOptions                                                     | undefined    | Enable caching with TTL                |
| retry          | RetryOptions                                                     | undefined    | Configure retry logic                  |
| rateLimit      | RateLimitOptions                                                 | undefined    | Configure rate limiting                |
| timeout        | number                                                           | 30000        | Request timeout in ms                  |
| onProgress     | function                                                         | undefined    | Progress callback                      |
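
For orientation, the options in the table map onto a shape roughly like the one below. This is an illustrative sketch inferred from the table and the earlier examples, not the package's published type declarations:

// Illustrative shape only; the nested cache/retry/rateLimit fields come from the
// Quick Start examples above and may not match the real CacheOptions/RetryOptions/
// RateLimitOptions definitions exactly.
interface ScraperOptions {
  sort_type?: "relevant" | "newest" | "highest_rating" | "lowest_rating";
  pages?: number | "max";
  search_query?: string;
  clean?: boolean;
  lang?: string;
  cache?: { enabled: boolean; ttl: number };
  retry?: { maxAttempts: number; initialDelay: number };
  rateLimit?: { requestsPerSecond: number };
  timeout?: number;
  onProgress?: (current: number, total: number) => void;
}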

Complete Guides


💡 Examples

Batch Processing

import { batchScraper } from "@galihvsx/gmr-scraper";

const results = await batchScraper([url1, url2, url3], {
  concurrency: 3,
  includeAnalytics: true,
  onProgress: (completed, total, url) => {
    console.log(`${completed}/${total}: ${url}`);
  },
});

Streaming API

import { scrapeStream } from "@galihvsx/gmr-scraper";

for await (const review of scrapeStream(url, { clean: true })) {
  console.log(`${review.author.name}: ${review.review.rating}⭐`);
}
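
Because scrapeStream yields reviews one at a time, you can stop early without holding the whole result set in memory. A usage sketch based on the call above:

import { scrapeStream } from "@galihvsx/gmr-scraper";

const url = "https://www.google.com/maps/place/...";

let count = 0;
for await (const review of scrapeStream(url, { clean: true })) {
  console.log(`${review.author.name}: ${review.review.rating}⭐`);
  // Stop after the first 100 reviews; nothing beyond what has been yielded is buffered.
  if (++count >= 100) break;
}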

Analytics & Filtering

import { scraper, calculateAnalytics, filterReviews } from "@galihvsx/gmr-scraper";

const reviews = await scraper(url, { clean: true });

const analytics = calculateAnalytics(reviews);
console.log(`Average rating: ${analytics.averageRating}`);

const highRated = filterReviews(reviews, {
  minRating: 4,
  hasText: true,
  keywords: ["excellent", "great"],
});

More Examples


🆚 Comparison with google-maps-review-scraper

| Feature            | google-maps-review-scraper | @galihvsx/gmr-scraper   |
| ------------------ | -------------------------- | ----------------------- |
| Language           | JavaScript                 | ✅ TypeScript           |
| Runtime            | Node.js only               | ✅ Bun + Node.js        |
| Type Safety        | ❌ No types                | ✅ Full TypeScript      |
| Retry Logic        | ❌ No                      | ✅ Exponential backoff  |
| Rate Limiting      | ❌ No                      | ✅ Token bucket         |
| Caching            | ❌ No                      | ✅ In-memory cache      |
| Progress Callbacks | ❌ No                      | ✅ Real-time updates    |
| CLI Tool           | ❌ No                      | ✅ Full-featured CLI    |
| Batch Processing   | ❌ No                      | ✅ Concurrent scraping  |
| Streaming API      | ❌ No                      | ✅ Memory-efficient     |
| Analytics          | ❌ No                      | ✅ Built-in insights    |
| Filtering          | ❌ No                      | ✅ Advanced filters     |
| Error Handling     | Basic                      | ✅ Custom error classes |
| Documentation      | Basic                      | ✅ Comprehensive        |
| Examples           | Limited                    | ✅ Extensive            |
| Build System       | ❌ No                      | ✅ tsup bundling        |
| Dependencies       | 1                          | ✅ 0 (runtime)          |


🎯 Why Choose GMR-Scraper?

1. Production-Ready

Built with enterprise features like retry logic, rate limiting, and caching out of the box.

2. Developer Experience

Full TypeScript support, comprehensive documentation, and extensive examples.

3. Performance

Optimized for Bun runtime with streaming API for memory-efficient processing.

4. Flexibility

Use as a library or CLI tool. Batch processing or streaming. Your choice.

5. Modern Stack

Latest TypeScript, modern async/await patterns, and best practices.


🛠️ Development

Setup

git clone https://github.com/galihvsx/gmr-scraper.git
cd gmr-scraper
bun install

Build

bun run build

Test

bun test
bun run test:coverage

Lint

bun run lint

🤝 Contributing

Contributions are welcome! Please read our Contributing Guide for details.

Contributors

Thanks to all contributors who have helped make this project better!


📄 License

MIT License - see LICENSE for details.




⚖️ Legal Disclaimer

This project is not affiliated with, endorsed by, or associated with Google LLC. All product and company names are trademarks of their respective holders.

Educational Purpose: This project was created for educational purposes and as a proof of concept. It demonstrates technical approaches for API integration and data processing.

Non-Commercial: This is a non-commercial, open-source project shared with the community to foster learning and collaboration.

Responsible Use: Users are responsible for complying with Google's Terms of Service and applicable laws when using this tool.


Made with ❤️ by Galih Putro Aji

⭐ Star this repo if you find it useful!

Report Bug • Request Feature • Documentation