
supatool

v0.4.2

CLI for Supabase: extract schemas (tables, views, RLS, RPC) to per-object files plus an llms.txt catalog for LLMs, deploy a local schema, and export seeds. CRUD code generation is deprecated.

Supatool

The AI-Native Schema Management CLI for Supabase. Extract database schemas into LLM-friendly structures, generate llms.txt catalogs, and manage seeds without drowning your AI's context.

License: MIT

Why Supatool?

Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database schemas. Typical issues include:

  • Token Waste: Reading the entire schema at once consumes 10k+ tokens.
  • Lost Context: Frequent API calls to fetch table details via MCP lead to fragmented reasoning.
  • Inaccuracy: AI misses RLS policies or complex FK relations split across multiple files.

Supatool solves this by reorganizing your Supabase schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.


Key Features

  • Extract (AI-Optimized) – DDL, RLS, and Triggers are bundled into one file per table. AI gets the full picture of a table by opening just one file.
  • llms.txt Catalog – Automatically generates a standard llms.txt listing all OBJECTS, RELATIONS (FKs), and RPC dependencies. This serves as the "Map" for AI agents.
  • Multi-Schema Support – Group objects by schema (e.g., public, agent, auth) with proper schema-qualification in SQL.
  • Seed for AI – Export table data as JSON. Includes a dedicated llms.txt for seeds so AI can see real data structures.
  • Safe Deploy – Push local schema changes with --dry-run to preview DDL before execution.
  • CRUD (Deprecated) – Legacy code generation is still available but discouraged in favor of LLM-native development.

Quick Start

npm install -g supatool
# Set your connection string
export SUPABASE_CONNECTION_STRING="postgresql://postgres:[password]@db.[ref].supabase.co:5432/postgres"

# Extract schema and generate AI-ready docs
supatool extract --schema public,auth -o supabase/schemas

Output Structure

supabase/schemas/
├── llms.txt          # 🗺️ THE ENTRY POINT: Read this first to understand the DB map
├── schema_index.json # 🤖 For JSON-parsing agents
├── schema_summary.md # 📄 Single-file overview for quick human/AI scanning
├── README.md         # Navigation guide
└── [schema_name]/
    ├── tables/       # table_name.sql (DDL + RLS + Triggers)
    ├── views/
    └── rpc/

Best Practices for AI Agents (Cursor / Claude / MCP)

To get the best results from your AI coding assistant, follow these steps:

  1. Start with the Map: Always ask the AI to read supabase/schemas/llms.txt first.
  2. Targeted Reading: Once the AI identifies the relevant tables from the catalog, instruct it to open only those specific .sql files.
  3. Understand Relations: Use the RELATIONS section in llms.txt to help the AI write accurate JOINs without reading every file.
  4. RPC Context: If using functions, refer to RPC_TABLES in llms.txt to know which tables are affected.
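Steps 1–3 can also be scripted: rather than pasting the whole catalog into context, grep only the lines that mention the table in question. The llms.txt excerpt below is a hypothetical shape for illustration, not the file's literal format:

```shell
# Hypothetical llms.txt excerpt; the real file's exact layout may differ
cat > /tmp/llms.txt <<'EOF'
OBJECTS
public.users (table)
public.orders (table)
RELATIONS
public.orders.user_id -> public.users.id
EOF

# Targeted reading: only the lines touching "orders" enter the context
grep 'orders' /tmp/llms.txt
```

The same filter works for an agent's tool calls: one small grep result replaces a full-file read.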

Commands

Extract

supatool extract --all -o supabase/schemas
# Options:
# --schema public,agent   Specify schemas
# -t "user_*"             Filter tables by pattern
# --force                 Clear output dir before writing (prevents orphan files)

Seed

Export specific tables for AI reference or testing:

supatool seed --tables tables.yaml

Outputs JSON files and an llms.txt index in supabase/seeds/.
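The exported JSON gives an agent concrete row shapes to reason about without a database connection. A sketch of skimming such an export, assuming a hypothetical array-of-objects layout (the real supatool layout may differ):

```shell
# Hypothetical seed export; the real supatool layout may differ
cat > /tmp/users_seed.json <<'EOF'
[
  {"id": 1, "email": "a@example.com"},
  {"id": 2, "email": "b@example.com"}
]
EOF

# Count exported rows without loading the whole file into an LLM context
grep -c '"id"' /tmp/users_seed.json
```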

Deploy

Preview the pending DDL before pushing local schema changes:

supatool deploy --dry-run

Repository

GitHub · npm


Developed with ❤️ for the Supabase community. Use at your own risk, and always back up your database before deploying.