
webgl-forensics

v3.1.0

The definitive 24-phase reverse-engineering engine for WebGL, Three.js, R3F, GSAP, and animated websites.

Analyze any creative site and extract everything — 3D scenes, shader programs, GSAP timelines, scroll systems, design tokens, fonts, and performance metrics. Then auto-generate a working Next.js starter from the results, or ask an AI to write the reconstruction guide for you.

npx webgl-forensics https://cappen.com --scaffold --ai-report

Quick Start

No install needed:

# Analyze any site instantly
npx webgl-forensics https://site.com

# Interactive menu (recommended for first use)
npx webgl-forensics --interactive

# Full cloning pipeline — forensics + scaffold + AI guide
npx webgl-forensics https://site.com --scaffold --ai-report --screenshots

# Analyze WebGL shaders (needs visible browser)
npx webgl-forensics https://site.com --annotate-shaders --no-headless

Installation

Run without installing (recommended)

npx webgl-forensics [URL] [options]

Install globally

npm install -g webgl-forensics
webgl-forensics [URL] [options]

Install locally in a project

npm install webgl-forensics
npx webgl-forensics [URL] [options]

Prerequisites

| Requirement | Version | Notes |
|-------------|---------|-------|
| Node.js | ≥ 18.0.0 | Required |
| Chromium | Auto-installed | Bundled with Puppeteer |
| ANTHROPIC_API_KEY | Optional | For --ai-report and --annotate-shaders |
| GEMINI_API_KEY | Optional | Alternative to Anthropic |


Usage

Interactive Mode (TUI)

The easiest way to run the tool. It presents a full terminal menu for configuring your forensics run:

npx webgl-forensics --interactive

You'll be prompted for:

  • Target URL
  • Analysis scope (all, 3D, animations, scroll, layout)
  • Feature toggles (scaffold, AI report, screenshots, etc.)
  • Browser mode (headless vs. visible)
  • Optional compare URL
  • Output directory

Real-time phase progress is shown with spinners and completion status.


CLI Mode

Minimal run

npx webgl-forensics https://cappen.com

Full cloning pipeline

npx webgl-forensics https://cappen.com \
  --scaffold \
  --ai-report \
  --sitemap \
  --screenshots \
  --no-headless \
  --format html \
  --history

WebGL shader analysis

# Visible browser is required for some WebGL sites to actually render
npx webgl-forensics https://site.com \
  --annotate-shaders \
  --focus 3d \
  --no-headless

Fidelity check against your clone

npx webgl-forensics https://site.com \
  --compare http://localhost:3000 \
  --screenshots

CI/CD dry run

npx webgl-forensics https://site.com --dry-run

All Flags Reference

v3.1 (new)

| Flag | Description |
|------|-------------|
| --interactive | Launch interactive TUI to configure the run |
| --scaffold | Generate a Next.js 15 + TypeScript starter from forensics output |
| --ai-report | Pipe forensics JSON to Claude/Gemini → Markdown reconstruction guide |
| --annotate-shaders | AI-annotate each extracted GLSL shader (math pattern, uniforms, recipe) |
| --sitemap | Parse /sitemap.xml for page discovery instead of DOM link crawl |

v3.0

| Flag | Description |
|------|-------------|
| --compare [URL] | Run forensics on two URLs and produce a fidelity diff |
| --record | Record an MP4 video of the full scroll session |
| --download-assets | Download detected .glb, .gltf, .woff2, .hdr, .ttf files |
| --format html | Output timeline as standalone HTML visualizer |
| --history | Append run complexity score and tech stack to run-history.json |

v2.0

| Flag | Description |
|------|-------------|
| --focus [scope] | all (default), 3d, animations, scroll, layout |
| --output [path] | Output directory (default: ./forensics-output) |
| --headless | Run in headless mode (default) |
| --no-headless | Launch visible browser — some WebGL sites require this |
| --screenshots | Capture 11 scroll-position screenshots |
| --multipage | Crawl all internal pages detected via sitemap or DOM |
| --timeout [ms] | Navigation timeout in ms (default: 30000) |
| --dry-run | Preview phases and flags without fetching anything |


Feature Deep Dives

--scaffold: Next.js Generator

Generates a complete, runnable Next.js 15 App Router project, pre-wired according to what the forensics run found:

npx webgl-forensics https://site.com --scaffold

What gets generated in ./forensics-output/[domain]/scaffold/:

scaffold/
├── package.json          ← Auto-detected deps (gsap, r3f, lenis, etc.)
├── tsconfig.json
├── next.config.ts
├── src/
│   ├── app/
│   │   ├── globals.css   ← CSS variables extracted from site's design tokens
│   │   ├── layout.tsx
│   │   ├── page.tsx      ← Home page stub
│   │   ├── work/page.tsx ← Pages from sitemap (if available)
│   │   └── about/page.tsx
│   ├── lib/
│   │   ├── gsap-setup.ts ← GSAP + auto-detected plugins pre-configured
│   │   └── lenis.ts      ← Smooth scroll wired to GSAP ticker
│   └── components/
│       ├── canvas.tsx    ← R3F Canvas with detected scene setup
│       └── preloader.tsx ← Preloader (if site had one)
└── README.md

Then just:

cd forensics-output/[domain]/scaffold
npm install
npm run dev

--ai-report: Reconstruction Guide

Pipes the forensics JSON to Claude (preferred) or Gemini and gets back a detailed Markdown guide for rebuilding the site:

ANTHROPIC_API_KEY=sk-ant-xxx npx webgl-forensics https://site.com --ai-report

Saved as: reconstruction-guide.md

The guide includes:

  1. Site Overview — stack summary, complexity rating, what drives it
  2. Project Setup — exact install commands, file structure
  3. Design System — actual hex colors from tokens, typography config, Tailwind snippet
  4. Animation System — GSAP plugin setup, ScrollTrigger patterns, sample tween reconstructions
  5. 3D / WebGL Layer — scene setup, detected geometries, camera config
  6. Loading & Transitions — preloader and page transition implementation
  7. Critical Notes — gotchas, performance tips, order of operations

Supports both:

  • ANTHROPIC_API_KEY → Claude Opus (highest quality)
  • GEMINI_API_KEY → Gemini 1.5 Pro (alternative)

--annotate-shaders: GLSL Annotator

Extracts GLSL shader programs from the WebGL context and annotates them:

ANTHROPIC_API_KEY=sk-ant-xxx npx webgl-forensics https://site.com \
  --annotate-shaders --no-headless

Saved as: shader-annotations.json

Each shader gets:

{
  "type": "fragment",
  "lineCount": 124,
  "uniforms": [
    { "type": "float", "name": "uTime" },
    { "type": "vec2", "name": "uResolution" }
  ],
  "localPatterns": [
    { "name": "Fractional Brownian Motion (fBm)", "desc": "Layered noise for organic textures" }
  ],
  "aiAnnotation": {
    "visualEffect": "Animated fluid noise distortion on a screen-space plane",
    "mathPattern": "Fractional Brownian Motion with 6 octaves",
    "complexity": "moderate",
    "uniforms": [
      { "name": "uTime", "role": "Drives the animation — increment each frame" },
      { "name": "uResolution", "role": "Normalizes UV coordinates to aspect ratio" }
    ],
    "recreationRecipe": "1. Set up a fullscreen plane ..."
  }
}

Offline fallback: Even without an API key, the tool detects math patterns from GLSL source (FBM, Voronoi, SDF, ray marching, chromatic aberration, etc.) and extracts uniforms and varyings.
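As a rough illustration, the offline detection idea can be sketched with plain regex matching over the GLSL source. The pattern list and function name below are invented for the example, not the tool's actual internals:

```javascript
// Hypothetical sketch of offline GLSL pattern detection. Scans shader source
// for well-known math signatures and pulls out uniform declarations.
const PATTERNS = [
  { name: "Fractional Brownian Motion (fBm)", regex: /\bfbm\s*\(|fractal.*noise/i },
  { name: "Voronoi", regex: /\bvoronoi\b/i },
  { name: "Signed Distance Field (SDF)", regex: /\bsd(Sphere|Box|Circle)\b|signed\s*distance/i },
  { name: "Ray marching", regex: /ray\s*march|\bmarch\s*\(/i },
];

function detectPatterns(glslSource) {
  const localPatterns = PATTERNS
    .filter((p) => p.regex.test(glslSource))
    .map((p) => ({ name: p.name }));
  // Uniform declarations look like: `uniform float uTime;`
  const uniforms = [...glslSource.matchAll(/uniform\s+(\w+)\s+(\w+)\s*;/g)]
    .map(([, type, name]) => ({ type, name }));
  return { localPatterns, uniforms };
}
```

Running it on a fragment shader that calls fbm() would report the fBm pattern plus every declared uniform, which mirrors the shape of shader-annotations.json above.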


--interactive: TUI Dashboard

npx webgl-forensics --interactive

Full terminal UI with:

  • Prompt-based configuration — URL, scope, feature toggles, browser mode
  • Real-time spinner per phase — shows success/fail + details as each phase runs
  • Completion summary — URL, output dir, complexity score, framework detected

No config file needed. Great for one-off runs when you want control over exactly what executes.


--sitemap: Sitemap Crawler

Replaces DOM anchor-crawling with a smarter sitemap-first approach:

npx webgl-forensics https://site.com --sitemap --multipage

Discovery hierarchy:

  1. Tries /sitemap.xml
  2. Falls back to /sitemap_index.xml → parses child sitemaps
  3. Falls back to /page-sitemap.xml
  4. Falls back to DOM anchor-crawl if none found

Categorizes URLs by type:

  • homepage — /
  • work — /work, /projects, /portfolio, /case-*
  • about — /about, /team, /story
  • contact — /contact, /hire
  • blog — /blog, /posts, /articles
  • other — everything else

URLs are sorted by their <priority> tag. The phase output includes topPriority[] for the most important pages.
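To make the discovery and categorization concrete, here is a minimal sketch of sitemap parsing, URL bucketing, and priority sorting. The function names and regexes are illustrative assumptions, not the real phase code:

```javascript
// Illustrative sitemap parser: extracts <loc> and <priority> from each <url>
// entry, buckets the path into a category, and sorts by priority descending.
function categorize(url) {
  const path = new URL(url).pathname;
  if (path === "/") return "homepage";
  if (/^\/(work|projects|portfolio|case-)/.test(path)) return "work";
  if (/^\/(about|team|story)/.test(path)) return "about";
  if (/^\/(contact|hire)/.test(path)) return "contact";
  if (/^\/(blog|posts|articles)/.test(path)) return "blog";
  return "other";
}

function parseSitemap(xml) {
  return [...xml.matchAll(/<url>([\s\S]*?)<\/url>/g)]
    .map(([, entry]) => {
      const loc = (entry.match(/<loc>(.*?)<\/loc>/) || [])[1] || "";
      const priority = parseFloat(
        (entry.match(/<priority>(.*?)<\/priority>/) || [])[1] || "0.5"
      );
      return { loc, priority, category: categorize(loc) };
    })
    .sort((a, b) => b.priority - a.priority);
}
```

In a real browser context you would fetch("/sitemap.xml") first; a dedicated XML parser would also be more robust than regexes for unusual sitemaps.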


--compare: Fidelity Diffing

Runs forensics on two URLs and produces a structured comparison:

npx webgl-forensics https://original.com --compare http://localhost:3000

Output: comparison-diff.json

{
  "fidelityScore": 87.5,
  "techOverlap": ["gsap", "three", "r3f", "lenis"],
  "missingInClone": ["framer-motion"]
}

fidelityScore is 0–100. 100 = identical complexity signature.
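For intuition, the tech-overlap portion of the diff can be sketched like this. The real fidelityScore compares complexity signatures, so this simplified overlap-only score is an assumption, not the actual formula:

```javascript
// Hypothetical sketch of the tech-overlap half of comparison-diff.json.
// Scores only library overlap between the original site and the clone.
function diffTech(originalTech, cloneTech) {
  const clone = new Set(cloneTech);
  const techOverlap = originalTech.filter((t) => clone.has(t));
  const missingInClone = originalTech.filter((t) => !clone.has(t));
  const fidelityScore = originalTech.length
    ? Math.round((techOverlap.length / originalTech.length) * 1000) / 10
    : 100;
  return { fidelityScore, techOverlap, missingInClone };
}
```

With originalTech = ["gsap", "three", "lenis", "framer-motion"] and a clone missing framer-motion, this yields a 75.0 score and flags the gap, matching the shape of the JSON above.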


--record: Video Capture

Records a scrolling video of the site:

npx webgl-forensics https://site.com --record --no-headless

Saves to scroll-recording.mp4 in the output directory. The runner scrolls through 10 positions across the full page height, pausing 1s at each, then returns to the top.
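The scroll plan boils down to evenly spaced offsets across the scrollable height. A sketch (the helper name is an assumption; the 1s pause and the scroll-back-to-top happen around it in the real runner):

```javascript
// Sketch: compute 10 evenly spaced scroll offsets from 0 to the maximum
// scrollable position (page height minus one viewport).
function scrollPositions(pageHeight, viewportHeight, steps = 10) {
  const maxScroll = Math.max(0, pageHeight - viewportHeight);
  return Array.from({ length: steps }, (_, i) =>
    Math.round((i / (steps - 1)) * maxScroll)
  );
}
```

For a 5000px page in a 1000px viewport this gives offsets from 0 to 4000; inside Puppeteer each offset would be visited with window.scrollTo before the pause.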


The 24 Forensics Phases

| # | Phase | Script | What It Extracts |
|---|-------|--------|------------------|
| 0 | Tech Stack | 00-tech-stack-detect.js | Framework, libraries, animation/scroll systems |
| 1 | Source Maps | 01-source-map-extractor.js | Webpack chunks, original source references |
| 2 | Interaction Model | 02-interaction-model.js | Event listeners, cursor logic, hover behavior |
| 3 | Responsive Analysis | 03-responsive-analysis.js | Breakpoints, media queries, viewport behaviors |
| 4 | Page Transitions | 04-page-transitions.js | Barba.js, custom router transitions, exit/enter hooks |
| 5 | Loading Sequence | 05-loading-sequence.js | Preloader presence, selector, estimated duration |
| 6 | Audio Extraction | 06-audio-extraction.js | AudioContext, <audio> elements, Howler.js |
| 7 | Accessibility | 07-accessibility-reduced-motion.js | prefers-reduced-motion handling, ARIA patterns |
| 8 | Complexity Scorer | 08-complexity-scorer.js | Weighted composite score (0–100) |
| 9 | Visual Diff | 09-visual-diff-validator.js | Pixel-level screenshot diff (used by --compare) |
| 10 | WebGPU Extractor | 10-webgpu-extractor.js | Adapter info, WGSL pipelines, bind groups |
| 12 | Network Waterfall | 12-network-waterfall.js | Request timing, resource types, CDN origins |
| 13 | React Fiber Walker | 13-react-fiber-walker.js | Full React component tree from __reactFiber |
| 14 | Shader Hot-Patch | 14-shader-hotpatch.js | Intercepts compileShader → captures all GLSL |
| 15 | MultiPage Crawler | 15-multipage-crawler.js | BFS crawl of all internal links |
| 16 | GSAP Timeline Recorder | 16-gsap-timeline-recorder.js | All tweens, ScrollTriggers, timelines, durations |
| 17 | R3F Fiber Serializer | 17-r3f-fiber-serializer.js | Three.js scene graph, materials, geometries |
| 18 | Font Extractor | 18-font-extractor.js | Web fonts, CSS @font-face, Google Fonts |
| 19 | Design Token Export | 19-design-token-export.js | CSS variables, sampled colors, Tailwind scaffold |
| 20 | Timeline Visualizer | 20-timeline-visualizer.js | HTML visualization of GSAP timeline data |
| 21 | Lighthouse Audit | 21-lighthouse-audit.js | Performance, accessibility, SEO, best practices |
| 22 | Sitemap Crawler | 22-sitemap-crawler.js | /sitemap.xml → prioritized URL list |
| 23 | Scaffold Generator | 23-scaffold-generator.js | Next.js 15 starter from forensics output |
| 24 | AI Reconstruction | 24-ai-reconstruction.js | Claude/Gemini rebuild guide |
| 25 | Shader Annotator | 25-shader-annotator.js | AI GLSL annotation per shader program |
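As an example of what a weighted composite score (phase 8) might look like: the weights and signal names below are invented for illustration, since the real scorer's inputs aren't documented in this README:

```javascript
// Hypothetical weighted composite complexity score. Each signal is a value
// normalized to 0..1; the weights sum to 100 so the result lands in 0-100.
function complexityScore(signals) {
  const weights = { webgl: 30, gsapTweens: 25, scrollSystems: 20, transitions: 15, audio: 10 };
  let score = 0;
  for (const [key, weight] of Object.entries(weights)) {
    score += weight * Math.min(1, signals[key] ?? 0);
  }
  return Math.round(score);
}
```

A site maxing out every signal scores 100; a static page with none of them scores 0.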


Output Structure

Each run creates a timestamped directory:

forensics-output/
└── site-com-2026-04-03-01-00/
    ├── forensics-report.json      ← Master report — all phases combined
    ├── tech-stack.json
    ├── gsap-timeline.json
    ├── react-fiber-walker.json
    ├── design-token-export.json
    ├── sitemap-crawler.json
    ├── shader-annotations.json    ← (--annotate-shaders)
    ├── lighthouse.json            ← (Lighthouse)
    ├── timeline.html              ← (--format html)
    ├── reconstruction-guide.md    ← (--ai-report)
    ├── comparison-diff.json       ← (--compare)
    ├── scroll-recording.mp4       ← (--record)
    ├── screenshots/               ← (--screenshots)
    │   ├── scroll-0pct.png
    │   ├── scroll-10pct.png
    │   └── ...
    ├── assets/                    ← (--download-assets)
    │   ├── model.glb
    │   └── font.woff2
    └── scaffold/                  ← (--scaffold)
        ├── package.json
        ├── src/app/globals.css
        ├── src/app/layout.tsx
        ├── src/lib/gsap-setup.ts
        └── src/components/canvas.tsx

Architecture

puppeteer-runner.js          ← CLI entry point + orchestrator
│
├── tui.js                   ← Interactive TUI (chalk + ora + inquirer)
│
└── scripts/
    ├── 00–21               ← Browser-evaluated extraction scripts
    │   (All scripts run inside Puppeteer's page context via page.evaluate())
    │
    ├── 22-sitemap-crawler.js       ← Browser: async fetch + XML parse
    ├── 23-scaffold-generator.js    ← Node: file system scaffold writer
    ├── 24-ai-reconstruction.js     ← Node: HTTPS → Claude/Gemini API
    └── 25-shader-annotator.js      ← Node: offline pattern detect + AI

Key design decisions:

  • Scripts 00–22 are browser-evaluated (run inside Puppeteer's page context)
  • Scripts 23–25 are Node.js modules (run server-side from the runner)
  • All new phase results are attached to the master forensics-report.json
  • Everything degrades gracefully — missing API keys → offline fallback, missing packages → skip phase
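The graceful-degradation rule can be illustrated with a guard like this sketch (runPhase and the report shape are hypothetical, not the runner's actual code):

```javascript
// Sketch: every phase runs inside a guard so one failure (missing package,
// missing API key) records a skip instead of aborting the whole run.
async function runPhase(name, fn, report) {
  try {
    report[name] = { ok: true, data: await fn() };
  } catch (err) {
    report[name] = { ok: false, skipped: true, reason: err.message };
  }
  return report;
}
```

A failed AI phase would then show up in forensics-report.json as skipped with a reason, while every other phase's data survives.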

Environment Variables

| Variable | Used By | Purpose |
|----------|---------|---------|
| ANTHROPIC_API_KEY | --ai-report, --annotate-shaders | Claude API access (preferred) |
| GEMINI_API_KEY | --ai-report, --annotate-shaders | Google Gemini API (alternative) |

Set before running:

export ANTHROPIC_API_KEY=sk-ant-your-key-here
npx webgl-forensics https://site.com --ai-report --annotate-shaders

Or inline:

ANTHROPIC_API_KEY=sk-ant-xxx npx webgl-forensics https://site.com --ai-report

CI/CD Integration

A GitHub Action is included at .github/workflows/forensics.yml. It runs automatically on pull requests to main or develop:

# .github/workflows/forensics.yml
name: WebGL Forensics CI

on:
  pull_request:
    branches: [main, develop]

jobs:
  forensics:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '20' }
      - run: npm install
      - run: |
          npx webgl-forensics $SOURCE_URL \
            --compare $COMPARE_URL \
            --output ./forensics-output \
            --headless
        env:
          SOURCE_URL: ${{ vars.FORENSICS_SOURCE_URL }}
          COMPARE_URL: ${{ vars.FORENSICS_COMPARE_URL }}
      - uses: actions/upload-artifact@v4
        with:
          name: forensics-report
          path: forensics-output/

Set FORENSICS_SOURCE_URL (original site) and FORENSICS_COMPARE_URL (your local or staging URL) as GitHub repository variables.


Contributing: Writing Custom Phases

Any .js file starting with custom- in the scripts/ folder is auto-discovered and run:

scripts/custom-my-extractor.js

Your script must be a self-invoking function that returns a serializable value:

// scripts/custom-my-extractor.js
(() => {
  return {
    myData: document.title,
    windowWidth: window.innerWidth,
  };
})()

The runner will:

  1. Auto-discover it at startup
  2. Execute it via page.evaluate() in the browser context
  3. Save output as custom-my-extractor.json
  4. Include it in forensics-report.json

No registration needed. Drop the file and run.
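The discovery step can be pictured as a simple filename filter. This sketch is illustrative only (the real runner also executes each file via page.evaluate(), which is omitted here):

```javascript
// Sketch: find custom phase scripts and derive each one's JSON output name.
function discoverCustomPhases(filenames) {
  return filenames
    .filter((f) => f.startsWith("custom-") && f.endsWith(".js"))
    .map((f) => ({
      script: f,
      output: f.replace(/\.js$/, ".json"),
    }));
}
```

In practice the filenames would come from fs.readdirSync("scripts/"), and each script's return value would be saved under the derived output name.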


Requirements

  • Node.js ≥ 18.0.0
  • Chromium — automatically installed with Puppeteer
  • macOS / Linux / Windows — all supported
  • GPU — optional but improves WebGL rendering fidelity with --no-headless

Optional dependencies (auto-installed with npm)

| Package | Used For |
|---------|----------|
| puppeteer | Browser automation |
| puppeteer-screen-recorder | --record MP4 output |
| lighthouse | Lighthouse audit phase |
| chalk | Colored output in TUI |
| ora | Spinner progress in TUI |
| inquirer | Interactive prompts in --interactive |


License

MIT © 404kidwiz

