supatool v0.4.2
Supatool
The AI-Native Schema Management CLI for Supabase. Extract database schemas into LLM-friendly structures, generate llms.txt catalogs, and manage seeds without drowning your AI's context.
Why Supatool?
Modern AI coding tools (Cursor, Claude, MCP) often struggle with large database schemas. Typical issues include:
- Token Waste: Reading the entire schema at once consumes 10k+ tokens.
- Lost Context: Frequent API calls to fetch table details via MCP lead to fragmented reasoning.
- Inaccuracy: AI misses RLS policies or complex FK relations split across multiple files.
Supatool solves this by reorganizing your Supabase schema into a highly searchable, indexed, and modular structure that helps AI "understand" your DB with minimal tokens.
Key Features
- Extract (AI-Optimized) – DDL, RLS, and Triggers are bundled into one file per table. AI gets the full picture of a table by opening just one file.
- llms.txt Catalog – Automatically generates a standard `llms.txt` listing all OBJECTS, RELATIONS (FKs), and RPC dependencies. This serves as the "Map" for AI agents.
- Multi-Schema Support – Group objects by schema (e.g., `public`, `agent`, `auth`) with proper schema qualification in SQL.
- Seed for AI – Export table data as JSON. Includes a dedicated `llms.txt` for seeds so AI can see real data structures.
- Safe Deploy – Push local schema changes with `--dry-run` to preview DDL before execution.
- CRUD (Deprecated) – Legacy code generation is still available but discouraged in favor of LLM-native development.
Quick Start
npm install -g supatool
# Set your connection string
export SUPABASE_CONNECTION_STRING="postgresql://postgres:[password]@db.[ref].supabase.co:5432/postgres"
# Extract schema and generate AI-ready docs
supatool extract --schema public,auth -o supabase/schemas
Output Structure
supabase/schemas/
├── llms.txt # 🗺️ THE ENTRY POINT: Read this first to understand the DB map
├── schema_index.json # 🤖 For JSON-parsing agents
├── schema_summary.md # 📄 Single-file overview for quick human/AI scanning
├── README.md # Navigation guide
└── [schema_name]/
├── tables/ # table_name.sql (DDL + RLS + Triggers)
├── views/
└── rpc/
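Because DDL, RLS, and triggers are bundled into one file per table, an agent (or a script feeding one) only needs a single read per table. The sketch below assumes the `<schemas_dir>/<schema>/tables/<table>.sql` layout shown in the tree above; the file contents and helper name are illustrative, not part of supatool itself.

```python
from pathlib import Path
import tempfile

def read_table_bundle(schemas_dir: str, schema: str, table: str) -> str:
    """Return the full DDL + RLS + trigger bundle for one table.

    Layout assumption (from the output tree above):
    <schemas_dir>/<schema>/tables/<table>.sql
    """
    path = Path(schemas_dir) / schema / "tables" / f"{table}.sql"
    return path.read_text(encoding="utf-8")

# Demo with a toy tree standing in for `supatool extract` output.
with tempfile.TemporaryDirectory() as root:
    table_file = Path(root) / "public" / "tables" / "profiles.sql"
    table_file.parent.mkdir(parents=True)
    table_file.write_text(
        "CREATE TABLE public.profiles (id uuid PRIMARY KEY);\n"
        "ALTER TABLE public.profiles ENABLE ROW LEVEL SECURITY;\n"
    )
    bundle = read_table_bundle(root, "public", "profiles")
```

One read returns the table definition and its RLS state together, which is exactly what keeps the agent's context small.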
Best Practices for AI Agents (Cursor / Claude / MCP)
To get the best results from your AI coding assistant, follow these steps:
- Start with the Map: Always ask the AI to read `supabase/schemas/llms.txt` first.
- Targeted Reading: Once the AI identifies the relevant tables from the catalog, instruct it to open only those specific `.sql` files.
- Understand Relations: Use the `RELATIONS` section in `llms.txt` to help the AI write accurate JOINs without reading every file.
- RPC Context: If using functions, refer to `RPC_TABLES` in `llms.txt` to know which tables are affected.
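The "use RELATIONS instead of reading every file" step can be sketched in a few lines. The catalog line format assumed here (`child -> parent` under a `RELATIONS` heading) is an illustration only, not supatool's exact `llms.txt` syntax:

```python
def related_tables(llms_txt: str, table: str) -> set[str]:
    """Collect tables linked to `table` via FK lines in the RELATIONS
    section. Assumed line format (illustrative): "orders -> users"."""
    related: set[str] = set()
    in_relations = False
    for line in llms_txt.splitlines():
        if line.strip() == "RELATIONS":
            in_relations = True
            continue
        if in_relations and "->" in line:
            src, dst = (part.strip() for part in line.split("->"))
            if table in (src, dst):
                related.update({src, dst} - {table})
    return related

# Toy catalog standing in for a generated llms.txt.
catalog = """OBJECTS
users
orders
order_items

RELATIONS
orders -> users
order_items -> orders
"""
print(sorted(related_tables(catalog, "orders")))  # → ['order_items', 'users']
```

From the map alone, the agent now knows which two extra `.sql` files it needs before writing a JOIN against `orders`.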
Commands
Extract
supatool extract --all -o supabase/schemas
# Options:
# --schema public,agent Specify schemas
# -t "user_*" Filter tables by pattern
# --force Clear output dir before writing (prevents orphan files)
Seed
Export specific tables for AI reference or testing:
supatool seed --tables tables.yaml
Outputs JSON files and an llms.txt index in supabase/seeds/.
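Exported seeds can be consumed directly by tests or fixtures. The sketch below assumes each table is written as `<seeds_dir>/<table>.json` containing a JSON array of row objects; the exact file shape supatool produces may differ:

```python
import json
import tempfile
from pathlib import Path

def load_seed_rows(seeds_dir: str, table: str) -> list:
    """Load exported rows for one table, assuming <seeds_dir>/<table>.json
    holds a JSON array of row objects (illustrative layout)."""
    return json.loads((Path(seeds_dir) / f"{table}.json").read_text())

# Demo with a toy export standing in for `supatool seed` output.
with tempfile.TemporaryDirectory() as seeds:
    (Path(seeds) / "users.json").write_text(
        json.dumps([{"id": 1, "email": "a@example.com"}])
    )
    rows = load_seed_rows(seeds, "users")
    print(len(rows), rows[0]["email"])  # → 1 a@example.com
```

Because the rows are plain JSON, the same files double as AI reference data and test fixtures.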
Deploy
supatool deploy --dry-run
Repository
Developed with ❤️ for the Supabase community. Use at your own risk. Always back up your DB before deployment.
