
@bhagat-surya-dev/dashchat-database-manager

v1.0.12

AI-powered database schema analysis and management library with support for multiple database types.

Installation

npm install @bhagat-surya-dev/dashchat-database-manager

Peer Dependencies

Install the database driver for your specific database:

# For PostgreSQL
npm install pg @types/pg

# For MySQL/MariaDB
npm install mysql2

# For SQLite
npm install sqlite3

# For MongoDB
npm install mongodb

Quick Start

import DatabaseManager from '@bhagat-surya-dev/dashchat-database-manager';

// Initialize the manager
const dbManager = new DatabaseManager({
  cerebrasApiKey: 'your-cerebras-api-key',
  databaseUrl: 'postgresql://user:pass@localhost:5432/mydb' // Optional
});

// Test connection
const isConnected = await dbManager.testConnection();
console.log('Connected:', isConnected);

// Get schema information
const schema = await dbManager.getSchemaInfo();
console.log('Tables:', schema.tables.length);

// AI-powered schema analysis
const analyzedSchema = await dbManager.analyzeAndCacheSchema(
  'postgresql://user:pass@localhost:5432/mydb',
  'my-database-id'
);

Constructor Options

DbManagerOptions

| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| cerebrasApiKey | string | Yes | Your Cerebras Cloud API key for AI analysis |
| databaseUrl | string | No | Default database connection string (can be provided later to methods) |
| supabaseConfig | SupabaseConnectionConfig | No | Secure Supabase authentication (alternative to databaseUrl) |
| model | string | No | AI model to use (defaults to "llama-3.3-70b") |

Example

// Traditional database URL approach
const dbManager = new DatabaseManager({
  cerebrasApiKey: process.env.CEREBRAS_API_KEY!,
  databaseUrl: process.env.DATABASE_URL // Optional default
});

// Secure Supabase configuration approach (recommended)
const secureDbManager = new DatabaseManager({
  cerebrasApiKey: process.env.CEREBRAS_API_KEY!,
  supabaseConfig: {
    supabaseUrl: 'https://your-project.supabase.co',
    serviceKey: process.env.SUPABASE_SERVICE_KEY!, // Read-only service key
    schema: 'public' // Optional, defaults to 'public'
  }
});

Methods

testConnection(databaseUrl?: string, supabaseConfig?: SupabaseConnectionConfig): Promise<boolean>

Tests database connectivity.

Parameters:

  • databaseUrl (optional): Database connection string. Uses constructor URL if not provided.
  • supabaseConfig (optional): Supabase configuration for secure authentication.

Returns: Promise resolving to true if connection successful.

// Using constructor URL
const isConnected = await dbManager.testConnection();

// Using specific URL
const isConnected = await dbManager.testConnection('postgresql://user:pass@host:5432/db');

// Using Supabase config (secure)
const isConnected = await dbManager.testConnection(undefined, {
  supabaseUrl: 'https://your-project.supabase.co',
  serviceKey: 'your-service-key'
});

getSchemaInfo(databaseUrl?: string, supabaseConfig?: SupabaseConnectionConfig): Promise<FormattedSchema>

Extracts database schema information.

Parameters:

  • databaseUrl (optional): Database connection string. Uses constructor URL if not provided.
  • supabaseConfig (optional): Supabase configuration for secure authentication.

Returns: Promise resolving to formatted schema object.

// Using constructor URL
const schema = await dbManager.getSchemaInfo();
console.log(`Found ${schema.tables.length} tables`);

// Using Supabase config (secure)
const schema = await dbManager.getSchemaInfo(undefined, {
  supabaseUrl: 'https://your-project.supabase.co',
  serviceKey: 'your-service-key',
  schema: 'public' // optional
});

schema.tables.forEach(table => {
  console.log(`Table: ${table.name}`);
  console.log(`Columns: ${table.columns.map(c => c.name).join(', ')}`);
});

analyzeAndCacheSchema(databaseUrl: string, databaseId: string, forceAnalysis?: boolean): Promise<AnalyzedSchema | null>

Performs AI-powered schema analysis with detailed descriptions. Now includes automatic chunking to handle large schemas without token overload.

Parameters:

  • databaseUrl (required): Database connection string
  • databaseId (required): Unique identifier for the database
  • forceAnalysis (optional): Force new analysis, skip cache (default: false)

Returns: Promise resolving to analyzed schema with AI-generated descriptions.

🚀 Chunking Features:

  • Automatic detection: Schemas with more than 8 tables or over 6000 estimated tokens are automatically chunked
  • Smart batching: Processes 8 tables per API call to stay within token limits
  • Progress tracking: Shows chunk progress in console logs
  • Graceful fallbacks: If a chunk fails, continues with remaining chunks
  • Token estimation: Calculates approximate tokens before API calls

const analyzed = await dbManager.analyzeAndCacheSchema(
  'postgresql://user:pass@localhost:5432/mydb',
  'production-db',
  false // Use cache if available
);

if (analyzed) {
  // Display overall schema summary (automatically generated)
  console.log('Overall Summary:', analyzed.overallSummary);
  
  analyzed.tables.forEach(table => {
    console.log(`Table: ${table.table_name}`);
    console.log(`Description: ${table.description}`);
    
    table.columns.forEach(column => {
      console.log(`  ${column.column_name}: ${column.description}`);
    });
  });
}

// For large databases (50+ tables), you'll see chunking in action:
// 📦 Chunking 52 tables into batches of 8...
// 📊 Processing 7 chunks...
// ⏳ Processing chunk 1/7 (8 tables)...
// ✅ Chunk 1 completed: 8 tables analyzed
// ... (continues for all chunks)

getOverallSchemaSummary(analyzedSchema: AnalyzedSchema): Promise<string>

Generates a concise, high-level summary of the entire database schema using AI.

Parameters:

  • analyzedSchema (required): The analyzed schema object from analyzeAndCacheSchema()

Returns: Promise resolving to a one-paragraph summary describing the database's overall purpose, table count, and core tables.

// First, get the analyzed schema
const analyzed = await dbManager.analyzeAndCacheSchema(
  'postgresql://user:pass@localhost:5432/ecommerce',
  'ecommerce-db'
);

// Generate a standalone summary (if you need it separately);
// analyzeAndCacheSchema can return null, so guard first
if (analyzed) {
  const summary = await dbManager.getOverallSchemaSummary(analyzed);
  console.log('Database Summary:', summary);
}

// Note: The summary is also automatically included in the analyzed schema
// as analyzed.overallSummary when you call analyzeAndCacheSchema()

validateConnectionString(connectionString: string)

Validates connection string format and determines database type.

Parameters:

  • connectionString (required): Connection string to validate

Returns: Validation result object with isValid, type, and errors fields.

const validation = dbManager.validateConnectionString('postgresql://user:pass@host:5432/db');
console.log('Valid:', validation.isValid);
console.log('Type:', validation.type);
console.log('Errors:', validation.errors);

getSupportedDatabaseTypes(): string[]

Returns array of supported database types.

const supported = dbManager.getSupportedDatabaseTypes();
console.log('Supported:', supported); // ['postgres', 'postgresql', 'mysql', 'mariadb', 'mongodb', 'sqlite']

getDatabaseUrlType(databaseUrl?: string): string | null

Automatically detects and returns the database type from a connection URL.

Parameters:

  • databaseUrl (optional): Database connection string. Uses constructor URL if not provided.

Returns: Database type string ('postgres', 'mysql', 'mongodb', 'sqlite', etc.) or null if unsupported.

// Using constructor URL
const dbManager = new DatabaseManager({
  cerebrasApiKey: 'your-key',
  databaseUrl: 'postgresql://user:pass@localhost:5432/mydb'
});

const type = dbManager.getDatabaseUrlType();
console.log('Database type:', type); // 'postgres'

// Using specific URL
const mysqlType = dbManager.getDatabaseUrlType('mysql://user:pass@localhost:3306/mydb');
console.log('Database type:', mysqlType); // 'mysql'

const mongoType = dbManager.getDatabaseUrlType('mongodb://localhost:27017/mydb');
console.log('Database type:', mongoType); // 'mongodb'

const sqliteType = dbManager.getDatabaseUrlType('./database.db');
console.log('Database type:', sqliteType); // 'sqlite'

Supabase Secure Authentication

For enhanced security with Supabase databases, use config-based authentication instead of a raw database URL:

Why Use Supabase Config?

Security Benefits:

  • ✅ No database credentials exposed
  • ✅ Users control permissions via Supabase dashboard
  • ✅ Easy to revoke/rotate service keys
  • ✅ No need to share database passwords
  • ✅ Granular access control through API keys

Setup Steps

  1. Get your Supabase URL:

    https://your-project.supabase.co
  2. Create a service role key:

    • Go to Supabase Dashboard → Settings → API
    • Copy the service_role key (starts with eyJ...)
    • Or create a custom key with read-only permissions
  3. Use config instead of URL:

// ❌ Old way (less secure)
const dbManager = new DatabaseManager({
  cerebrasApiKey: 'your-key',
  databaseUrl: 'postgresql://postgres:your-password@db.your-project.supabase.co:5432/postgres'
});

// ✅ New way (secure)
const secureDbManager = new DatabaseManager({
  cerebrasApiKey: 'your-key',
  supabaseConfig: {
    supabaseUrl: 'https://your-project.supabase.co',
    serviceKey: process.env.SUPABASE_SERVICE_KEY!, // Much safer!
    schema: 'public' // optional
  }
});

Supabase Methods

configureSupabase(config: SupabaseConnectionConfig): void

Configure Supabase after initialization:

const manager = new DatabaseManager({ cerebrasApiKey: 'your-key' });

manager.configureSupabase({
  supabaseUrl: 'https://your-project.supabase.co',
  serviceKey: 'your-service-key'
});

validateSupabaseConfig(config: SupabaseConnectionConfig)

Validate Supabase configuration without connecting:

const validation = manager.validateSupabaseConfig({
  supabaseUrl: 'https://your-project.supabase.co',
  serviceKey: 'your-service-key'
});

console.log('Valid:', validation.isValid);
console.log('Errors:', validation.errors);

isSupabaseConfigured(): boolean

Check if Supabase is configured:

if (manager.isSupabaseConfigured()) {
  console.log('✅ Supabase is ready to use');
  const schema = await manager.getSchemaInfo(); // Uses configured Supabase
}

Environment Variables for Supabase

CEREBRAS_API_KEY=your_cerebras_api_key_here
SUPABASE_URL=https://your-project.supabase.co
SUPABASE_SERVICE_KEY=eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9...
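
A minimal sketch of wiring these variables into the constructor (assuming they are already present in process.env, e.g. loaded via dotenv):

import DatabaseManager from '@bhagat-surya-dev/dashchat-database-manager';

// Non-null assertions assume the three variables above are set.
const manager = new DatabaseManager({
  cerebrasApiKey: process.env.CEREBRAS_API_KEY!,
  supabaseConfig: {
    supabaseUrl: process.env.SUPABASE_URL!,
    serviceKey: process.env.SUPABASE_SERVICE_KEY!
  }
});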

Connection Strings

PostgreSQL

postgresql://username:password@hostname:port/database
postgres://username:password@hostname:port/database

MySQL/MariaDB

mysql://username:password@hostname:port/database
mariadb://username:password@hostname:port/database

SQLite

sqlite:///path/to/database.db
./database.db
/absolute/path/to/database.db

MongoDB

mongodb://username:password@hostname:port/database
mongodb+srv://username:password@cluster.mongodb.net/database
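
As a quick sanity check, any of these formats can be run through the documented validation helpers (a sketch, assuming a dbManager instance from the Quick Start):

const examples = [
  'postgresql://username:password@hostname:5432/database',
  'mysql://username:password@hostname:3306/database',
  'mongodb://username:password@hostname:27017/database',
  './database.db'
];

for (const url of examples) {
  const { isValid } = dbManager.validateConnectionString(url);
  console.log(url, '→', dbManager.getDatabaseUrlType(url), isValid ? 'valid' : 'invalid');
}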

Environment Variables

Create a .env file in your project root:

CEREBRAS_API_KEY=your_cerebras_api_key_here
DATABASE_URL=postgresql://user:pass@localhost:5432/mydb
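
One way to load the file at startup (a sketch assuming the dotenv package is installed):

import 'dotenv/config'; // populates process.env from .env
import DatabaseManager from '@bhagat-surya-dev/dashchat-database-manager';

const dbManager = new DatabaseManager({
  cerebrasApiKey: process.env.CEREBRAS_API_KEY!, // assumed to be set
  databaseUrl: process.env.DATABASE_URL
});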

Error Handling

The library provides enhanced error messages for common issues:

try {
  await dbManager.testConnection('invalid://connection/string');
} catch (error) {
  console.error('Connection failed:', error.message);
  // Enhanced error messages like:
  // - "Database host not found. Please verify the hostname..."
  // - "Authentication failed. Please verify your username and password."
  // - "Connection refused. Please check if the database server is running..."
}

TypeScript Support

Full TypeScript support with exported interfaces:

import DatabaseManager, { DbManagerOptions } from '@bhagat-surya-dev/dashchat-database-manager';

interface AnalyzedSchema {
  tables: AnalyzedTable[];
  overallSummary?: string; // AI-generated database summary
}

interface AnalyzedTable {
  table_name: string;
  description: string;
  columns: AnalyzedColumn[];
}

interface AnalyzedColumn {
  column_name: string;
  description: string;
  data_type: string;
  nullable: boolean;
}
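
For example, declaring options with the exported DbManagerOptions type catches misspelled fields at compile time (a minimal sketch):

const options: DbManagerOptions = {
  cerebrasApiKey: process.env.CEREBRAS_API_KEY!,
  databaseUrl: process.env.DATABASE_URL,
  model: 'llama-3.3-70b' // optional; this is the documented default
};

const dbManager = new DatabaseManager(options);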

Smart Chunking System (NEW)

Prevents LLM token overload for large database schemas with automatic chunking and optimization.

How Chunking Works

  1. Token Estimation: Calculates approximate tokens before sending to AI
  2. Automatic Detection: Triggers chunking when:
    • Schema has > 8 tables, OR
    • Estimated tokens > 6000
  3. Smart Batching: Splits tables into chunks of 8 tables each
  4. Sequential Processing: Processes chunks one by one with small delays
  5. Result Combination: Merges all chunk results into final schema
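
The flow above can be pictured with a short illustrative sketch. estimateTokens and analyzeChunk here are simplified stand-ins, not the library's actual internals:

// Rough token estimate: ~4 characters per token is a common heuristic.
const estimateTokens = (tables: object[]) => JSON.stringify(tables).length / 4;

// Steps 1-2: chunk only when the schema crosses the documented thresholds.
const needsChunking = (tables: object[]) =>
  tables.length > 8 || estimateTokens(tables) > 6000;

// Steps 3-5: batch, process sequentially, and merge the results.
async function analyzeInChunks<T, R>(
  tables: T[],
  analyzeChunk: (chunk: T[]) => Promise<R[]>
): Promise<R[]> {
  const CHUNK_SIZE = 8;
  const results: R[] = [];

  for (let i = 0; i < tables.length; i += CHUNK_SIZE) {
    const chunk = tables.slice(i, i + CHUNK_SIZE);
    try {
      results.push(...(await analyzeChunk(chunk))); // one AI call per chunk
    } catch (err) {
      console.warn(`Chunk ${i / CHUNK_SIZE + 1} failed, continuing`, err);
    }
    await new Promise((resolve) => setTimeout(resolve, 500)); // delay between chunks
  }

  return results; // merged into the final schema
}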

Chunking in Action

// For a large e-commerce database with 52 tables:
const analyzed = await dbManager.analyzeAndCacheSchema(
  'postgresql://user:pass@localhost:5432/large_ecommerce',
  'large-ecommerce-db'
);

// Console output shows chunking process:
// 📊 Estimated tokens for full schema: 15,240
// 📦 Schema is large, using chunking strategy  
// 📦 Chunking 52 tables into batches of 8...
// 📊 Processing 7 chunks...
// ⏳ Processing chunk 1/7 (8 tables)...
// ✅ Chunk 1 completed: 8 tables analyzed
// ⏳ Processing chunk 2/7 (8 tables)...
// ✅ Chunk 2 completed: 8 tables analyzed
// ... continues for all 7 chunks
// 🎉 Chunked analysis completed: 52 total tables analyzed

Benefits

  • No more token limit errors 📈
  • Handles any schema size (tested with 100+ tables)
  • Maintains analysis quality per chunk
  • Graceful error recovery if individual chunks fail
  • Performance optimized with 500ms delays between chunks

Schema Summary Feature

The library now automatically generates an overall database summary when performing AI analysis:

Automatic Summary Generation

When you call analyzeAndCacheSchema(), the system automatically:

  1. Analyzes each table and column with detailed AI-generated descriptions
  2. Generates an overall summary describing the database's purpose and key tables
  3. Includes the summary in the returned schema object as overallSummary

Example Usage

const analyzed = await dbManager.analyzeAndCacheSchema(
  'mongodb://admin:password123@localhost:27017/ecommerce_startup',
  'ecommerce-startup-db'
);

// Guard against a failed analysis (the method can return null)
if (!analyzed) throw new Error('Schema analysis failed');

// The summary is automatically available
console.log('📋 Database Summary:');
console.log(analyzed.overallSummary);
// Output: "This database supports a comprehensive e-commerce platform with 8 tables..."

// Save analysis and summary to files
import * as fs from 'fs/promises';

// Save full analysis
await fs.writeFile('schema_analysis.json', JSON.stringify(analyzed, null, 2));

// Save summary report
const summaryReport = `Database Schema Summary Report
Generated: ${new Date().toISOString()}
Total Tables: ${analyzed.tables.length}

SUMMARY:
${analyzed.overallSummary}

TABLES:
${analyzed.tables.map(table => 
  `- ${table.table_name}: ${table.description}`
).join('\n')}`;

await fs.writeFile('schema_summary.txt', summaryReport);

Summary Content

The AI-generated summary includes:

  • Database purpose (e.g., "e-commerce platform", "CRM system")
  • Total number of tables
  • Core/important tables identified
  • Overall architecture description

Recent Updates

Supabase Schema Extraction Fix (v1.0.11)

Issue Resolved: Schema extraction was failing with a "public.information_schema.tables does not exist" error.

🔧 Core Fixes Implemented

  1. Fixed Connection Testing

    • Before: Used supabase.rpc('version') which doesn't exist
    • After: Uses supabase.from('pg_tables') for reliable connection validation
    • Result: ✅ Connection testing now works consistently
  2. Implemented PostgREST OpenAPI Discovery

    • Before: Tried information_schema.tables (not available via PostgREST)
    • After: Uses GET /rest/v1/ OpenAPI endpoint to discover all accessible tables
    • Result: ✅ Now finds all tables dynamically (no hardcoding needed)
  3. Enhanced Schema Extraction with Multi-Tier Approach

    • Tier 1: Try auto-created RPC function for optimal performance
    • Tier 2: Fall back to PostgREST OpenAPI discovery
    • Tier 3: Extract column info from sample data
    • Result: ✅ Robust extraction that works in various permission scenarios
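
For the curious, Tier 2 corresponds roughly to the sketch below (illustrative only, not the library's exact code; uses the global fetch available in Node 18+). PostgREST serves an OpenAPI document at the API root whose paths keys name every accessible table and view:

async function discoverTables(supabaseUrl: string, serviceKey: string): Promise<string[]> {
  const res = await fetch(`${supabaseUrl}/rest/v1/`, {
    headers: { apikey: serviceKey, Authorization: `Bearer ${serviceKey}` }
  });
  const openApi = await res.json();
  return Object.keys(openApi.paths ?? {})
    .filter((p) => p !== '/')  // skip the root path itself
    .map((p) => p.slice(1));   // '/users' -> 'users'
}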

📊 Test Results

✅ Connection Testing: Works consistently
✅ Table Discovery: 14 tables found via OpenAPI
✅ AI Analysis: Full descriptions generated
✅ Schema Extraction: Complete end-to-end functionality

Sample AI Output:
"This database appears to be designed for a multi-tenant 
SaaS platform with user management and organization 
configuration capabilities..."

Tables with AI Descriptions:
- users: "Stores information about registered users..."
- feedback: "Stores user-submitted feedback..."
- admin_databases: "Stores information about administrative databases..."

Before: Schema extraction failed completely
After: Full schema discovery + AI analysis working end-to-end

Features

  • ✅ Multi-database support (PostgreSQL, MySQL/MariaDB, SQLite, MongoDB)
  • ✅ Secure Supabase authentication with URL + service key (no database credentials needed)
  • ✅ Supabase schema extraction via PostgREST OpenAPI discovery
  • ✅ AI-powered schema analysis with Cerebras
  • ✅ Automatic schema summary generation
  • ✅ Smart chunking system to prevent LLM token overload on large schemas
  • ✅ Token estimation and automatic optimization
  • ✅ Batch processing for databases with 50+ tables
  • ✅ Comprehensive error handling
  • ✅ Connection validation
  • ✅ TypeScript support
  • ✅ Performance monitoring
  • ✅ Detailed logging
  • ✅ Schema optimization for AI accuracy
  • ✅ Isolated Supabase manager for easier debugging

Requirements

  • Node.js >= 16.0.0
  • Valid Cerebras API key for AI analysis
  • Appropriate database driver for your database type

License

MIT

Support

For issues and feature requests, please visit the GitHub repository.