
exportmapify

v0.1.3


CLI tool to identify deep imports from internal packages in monorepos and generate package.json exports


Exportmapify

A CLI tool to identify deep imports from internal packages in monorepos and automatically generate package.json exports maps.

Installation

npm install -g exportmapify

Development

This project uses Yarn Berry, managed via Corepack:

corepack enable
yarn install
yarn build

Usage

Exportmapify uses a two-step workflow:

Step 1: Evaluate (Scan for imports)

Scan your repository to build a usage dictionary of package imports. Scanning is additive: you can run it multiple times in different locations, and each run adds new findings to the existing usage.json file without overwriting previous data:

# Scan current directory and create usage.json
exportmapify evaluate usage.json

# Scan specific directory (adds to existing usage.json)
exportmapify evaluate usage.json --cwd /path/to/repo

# Run multiple scans to build comprehensive usage data
exportmapify evaluate usage.json --cwd /path/to/frontend
exportmapify evaluate usage.json --cwd /path/to/backend
exportmapify evaluate usage.json --cwd /path/to/mobile
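The additive behavior of repeated scans can be sketched roughly like this (mergeUsage, PackageUsage, and UsageFile are hypothetical names for illustration; the actual implementation may differ):

```typescript
// Sketch of the additive merge the evaluate step performs when writing
// usage.json. Names here are illustrative, not from the tool's source.

interface PackageUsage {
  package: string;
  versionRequirement: string;
  importPaths: string[];
}

type UsageFile = Record<string, PackageUsage>;

// Merge one scan's findings into an existing usage dictionary without
// discarding previously recorded import paths.
function mergeUsage(existing: UsageFile, found: PackageUsage[]): UsageFile {
  const merged: UsageFile = { ...existing };
  for (const usage of found) {
    const prev = merged[usage.package];
    if (!prev) {
      merged[usage.package] = { ...usage, importPaths: [...usage.importPaths] };
    } else {
      // Union of import paths: keep everything seen in earlier scans.
      prev.importPaths = [...new Set([...prev.importPaths, ...usage.importPaths])];
      // Prefer a concrete version requirement over the wildcard.
      if (prev.versionRequirement === "*") {
        prev.versionRequirement = usage.versionRequirement;
      }
    }
  }
  return merged;
}
```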

Step 2: Fix (Generate exports)

Generate exports maps for packages based on the usage data. The fix command only processes packages within the specified CWD tree; if you have scanned usage from multiple repositories, only packages that exist in the target directory will have their exports updated:

# Generate exports for packages in current directory (updates package.json files)
exportmapify fix usage.json

# Preview what would be generated without making changes
exportmapify fix usage.json --dry-run

# Generate exports for packages in specific directory only
# (packages from other scanned repos won't be affected)
exportmapify fix usage.json --cwd /path/to/packages

Single Repository Workflow

For a single monorepo, the workflow is straightforward. By default, both commands operate on all packages within the current working directory tree:

# 1. Scan the monorepo for all imports (scans all packages in CWD tree)
exportmapify evaluate usage.json --cwd /path/to/monorepo

# 2. Preview the exports that would be generated
exportmapify fix usage.json --dry-run --cwd /path/to/monorepo

# 3. Apply the exports to all package.json files (only packages in CWD tree)
exportmapify fix usage.json --cwd /path/to/monorepo

Multi-Repository Workflow

When your packages' consumers live in separate repositories, use the --main-repo flag to specify which packages to track:

# 1. Scan the main repository (contains the packages you want to generate exports for)
exportmapify evaluate usage.json --cwd /path/to/main-repo

# 2. Scan consumer repositories (only tracks imports to main repo packages)
exportmapify evaluate usage.json --cwd /path/to/consumer-repo1 --main-repo /path/to/main-repo
exportmapify evaluate usage.json --cwd /path/to/consumer-repo2 --main-repo /path/to/main-repo

# 3. Generate exports for packages in the main repo
exportmapify fix usage.json --cwd /path/to/main-repo

Example with real paths:

# 1. Scan main-repo (main repo with @company/* packages)
exportmapify evaluate usage.json --cwd /Users/dev/projects/main-repo

# 2. Scan app-frontend (consumer repo that imports @company/* packages)
exportmapify evaluate usage.json --cwd /Users/dev/projects/app-frontend --main-repo /Users/dev/projects/main-repo

# 3. Scan mobile-app (another consumer repo)
exportmapify evaluate usage.json --cwd /Users/dev/projects/mobile-app --main-repo /Users/dev/projects/main-repo

# 4. Generate exports only for main-repo packages
exportmapify fix usage.json --cwd /Users/dev/projects/main-repo

Usage Data Format

The evaluate command creates a usage dictionary tracking which packages are imported and how:

{
  "@company/ui-components": {
    "package": "@company/ui-components",
    "versionRequirement": "*",
    "importPaths": [
      ".",
      "./lib/Button",
      "./lib/Modal",
      "./lib/TextField"
    ]
  },
  "@company/utils": {
    "package": "@company/utils",
    "versionRequirement": "^2.1.0",
    "importPaths": [
      ".",
      "./lib/formatDate",
      "./lib/validateEmail"
    ]
  }
}
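A TypeScript shape matching this format, inferred from the JSON example above rather than from the tool's source, might look like this:

```typescript
// Type describing one entry in usage.json, inferred from the example above.
interface PackageUsage {
  package: string;            // the package specifier as imported
  versionRequirement: string; // from a consumer's package.json, or "*"
  importPaths: string[];      // "." for root imports, "./..." for deep imports
}

type UsageFile = Record<string, PackageUsage>;

// Example consumer of the format: list only the deep (non-root) import
// paths for each package.
function deepImports(usage: UsageFile): Record<string, string[]> {
  const out: Record<string, string[]> = {};
  for (const [name, entry] of Object.entries(usage)) {
    const deep = entry.importPaths.filter((p) => p !== ".");
    if (deep.length > 0) out[name] = deep;
  }
  return out;
}
```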

Generated Exports Format

The fix command generates proper exports maps with conditional exports:

{
  "name": "@company/ui-components",
  "exports": {
    ".": {
      "types": "./lib/index.d.ts",
      "require": "./lib/index.js",
      "default": "./lib/index.js"
    },
    "./lib/Button": {
      "types": "./lib/lib/Button.d.ts",
      "default": "./lib/lib/Button.js"
    },
    "./lib/Modal": {
      "types": "./lib/lib/Modal.d.ts",
      "default": "./lib/lib/Modal.js"
    }
  }
}
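A rough sketch of how one recorded import path could be turned into a conditional exports entry. The outDir prefix is an assumption based on the example above, where compiled output lives under ./lib (which is why "./lib/Button" maps to "./lib/lib/Button.js"):

```typescript
// Hypothetical sketch: map one recorded import path to a conditional
// exports entry, assuming compiled output lives under an outDir such
// as "./lib". Not necessarily what exportmapify does internally.
interface ExportsEntry {
  types: string;
  default: string;
}

function exportsEntryFor(importPath: string, outDir = "./lib"): ExportsEntry {
  // The root import "." maps to the package's index files; deep import
  // paths are re-rooted under the output directory.
  const rel = importPath === "." ? "index" : importPath.slice(2); // strip "./"
  return {
    types: `${outDir}/${rel}.d.ts`,
    default: `${outDir}/${rel}.js`,
  };
}
```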

How Multi-Repository Analysis Works

Main Repository Scanning

When you scan the main repository (without --main-repo), the tool:

  • Discovers all packages in the repository automatically
  • Tracks ALL imports (internal and external dependencies)
  • Creates comprehensive usage data for the entire monorepo

Consumer Repository Scanning

When you scan a consumer repository (with --main-repo), the tool:

  • Discovers all packages in the main repository
  • Only tracks imports that target packages from the main repository
  • Ignores imports to external dependencies and other packages
  • Merges findings with existing usage data
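The filtering described above amounts to an allow-list check against the packages discovered in the main repository. A sketch (the function name and set construction are assumptions):

```typescript
// Sketch of consumer-repo filtering: only import specifiers that resolve
// to a package discovered in the main repository are recorded.
function filterToMainRepo(
  importedSpecifiers: string[],
  mainRepoPackages: Set<string>,
): string[] {
  return importedSpecifiers.filter((spec) => {
    // "@scope/name/deep/path" -> "@scope/name"; "name/deep" -> "name"
    const parts = spec.split("/");
    const pkg = spec.startsWith("@") ? parts.slice(0, 2).join("/") : parts[0];
    return mainRepoPackages.has(pkg);
  });
}
```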

Export Generation

The fix command:

  • Only processes packages that exist in the target directory
  • Generates exports maps for all packages with usage data (including root-only imports)

Features

  • Two-step workflow: Separate analysis and generation phases for flexibility
  • Cross-repository analysis: Scan multiple repos while filtering for specific package imports
  • Smart exports generation: Automatically infers exports from package.json fields (main, module, types, browser)
  • File detection: Finds actual files in lib/, dist/, src/ directories with proper TypeScript → JavaScript mapping
  • Version tracking: Captures version requirements from package.json dependencies
  • High performance: Optimized parallel processing for large monorepos (100k+ files)
  • Comprehensive file support: TypeScript (.ts, .tsx, .mts, .cts), JavaScript (.js, .jsx, .cjs, .mjs)
  • Import detection: ES6 imports, CommonJS requires, and dynamic imports
  • Conditional exports: Generates proper types, import, require, default, and browser fields
  • Selective tracking: Only tracks packages you care about when scanning consumer repositories
  • Data merging: Multiple scans contribute to the same usage file for comprehensive analysis
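As an illustration of the "smart exports generation" point, inferring the root export from conventional package.json fields might look roughly like this. The field precedence shown is an assumption, not necessarily what exportmapify does internally:

```typescript
// Sketch: infer the "." export entry from conventional package.json
// fields (main, module, types, browser). Illustrative only.
interface PkgJson {
  main?: string;
  module?: string;
  types?: string;
  browser?: string;
}

function inferRootExport(pkg: PkgJson): Record<string, string> {
  const entry: Record<string, string> = {};
  if (pkg.types) entry.types = pkg.types;     // TypeScript declarations
  if (pkg.module) entry.import = pkg.module;  // ESM entry point
  if (pkg.browser) entry.browser = pkg.browser;
  if (pkg.main) {
    entry.require = pkg.main;                 // CJS entry point
    entry.default = pkg.main;                 // fallback, checked last by Node
  }
  return entry;
}
```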

Key Behaviors

Additive Scanning

The evaluate command is additive: running it multiple times will:

  • Merge new import findings with existing usage data
  • Preserve previously discovered imports
  • Update version requirements if new ones are found
  • Allow building comprehensive usage data across multiple scans

CWD Tree Scoping

Commands operate within the specified working directory tree:

  • evaluate: Scans all source files within the CWD tree (ignoring node_modules, dist, etc.)
  • fix: Only updates package.json files for packages that exist within the CWD tree
  • Together, these rules ensure that scanning usage in one repository never accidentally modifies packages in another repository
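The scoping rule can be expressed as a simple path-containment check, sketched here with Node's path module:

```typescript
import * as path from "node:path";

// Sketch: a package is in scope only if its directory sits inside the
// working-directory tree.
function isInsideTree(cwd: string, packageDir: string): boolean {
  const rel = path.relative(path.resolve(cwd), path.resolve(packageDir));
  // An in-tree path never escapes upward ("..") and is never absolute.
  return rel === "" || (!rel.startsWith("..") && !path.isAbsolute(rel));
}
```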

Performance

Exportmapify is optimized for large monorepos with efficient parallel processing and smart file filtering. The tool can handle repositories with hundreds of thousands of files while maintaining fast scan times.

License

MIT