
masterrecord

v0.3.39

An object-relational mapper (ORM) for the Master framework. MasterRecord connects classes to relational database tables, establishing a database layer with almost zero configuration.

MasterRecord

License: MIT

MasterRecord is a lightweight, code-first ORM for Node.js with a fluent query API, comprehensive migrations, and multi-database support. Build type-safe queries with lambda expressions, manage schema changes with CLI-driven migrations, and work seamlessly across MySQL, PostgreSQL, and SQLite.

Key Features

🔹 Multi-Database Support - MySQL, PostgreSQL, SQLite with consistent API
🔹 Code-First Design - Define entities in JavaScript, generate schema automatically
🔹 Fluent Query API - Lambda-based queries with parameterized placeholders
🔹 Active Record Pattern - Entities with .save(), .delete(), .reload() methods
🔹 Entity Serialization - .toObject() and .toJSON() with circular reference protection
🔹 Lifecycle Hooks - beforeSave, afterSave, beforeDelete, afterDelete hooks
🔹 Business Validation - Built-in validators (required, email, length, pattern, custom)
🔹 Bulk Operations - Efficient bulkCreate, bulkUpdate, bulkDelete APIs
🔹 Query Result Caching - Production-grade in-memory and Redis caching with automatic invalidation
🔹 Migration System - CLI-driven migrations with rollback support
🔹 SQL Injection Protection - Automatic parameterized queries throughout
🔹 Field Transformers - Custom serialization/deserialization for complex types
🔹 Type Validation - Runtime type checking and coercion
🔹 Relationship Mapping - One-to-many, many-to-one, many-to-many support
🔹 Seed Data - Built-in seeding with idempotent operations

Database Support

| Database   | Version     | Features                                              |
|------------|-------------|-------------------------------------------------------|
| PostgreSQL | 9.6+ (12+)  | JSONB, UUID, async/await, connection pooling          |
| MySQL      | 5.7+ (8.0+) | JSON, async/await, connection pooling, AUTO_INCREMENT |
| SQLite     | 3.x         | Embedded, zero-config, file-based, async API wrapper  |


⚠️ Best Practices (CRITICAL)

1. Creating Entity Instances

ALWAYS use context.Entity.new() to create new entity instances:

// ✅ CORRECT - Creates proper data instance with getters/setters
const task = this._qaContext.QaTask.new();
const annotation = this._qaContext.QaAnnotation.new();
const project = this._qaContext.QaProject.new();

task.name = "My Task";
task.status = "active";
await db.saveChanges();  // ✅ Saves correctly

// ❌ WRONG - Creates schema definition object with function properties
const task = new QaTask();  // task.name is a FUNCTION, not a property!

task.name = "My Task";  // ❌ Doesn't work - name is a function
await db.saveChanges();  // ❌ Error: "Type mismatch: Expected string, got function"

Why?

  • new Entity() creates a schema definition object where properties are methods that define the schema
  • context.Entity.new() creates a data instance with proper getters/setters for storing values
  • Using new Entity() causes runtime errors: "Type mismatch for Entity.field: Expected integer, got function with value undefined"
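The difference is mechanical and easy to verify. In the entity-definition style shown later in this README, the constructor assigns schema descriptors to each property, so `new Entity()` yields descriptors rather than value slots (in generated entities those descriptors may be functions, which is why the error message mentions a function). A minimal sketch:

```javascript
// Minimal sketch using this README's entity-definition style:
// the constructor assigns schema DESCRIPTORS, not values.
class User {
    constructor() {
        this.name = { type: 'string', nullable: false };
    }
}

const u = new User();
console.log(typeof u.name);  // 'object' - the schema descriptor, not a string slot
u.name = "Alice";            // silently overwrites the descriptor; the schema is now broken
```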

Error Example:

Error: INSERT failed: Type mismatch for QaTask.name: Expected string, got function with value undefined
    at SQLLiteEngine._buildSQLInsertObjectParameterized

This error means: You used new Entity() instead of context.Entity.new()

2. Saving Changes - ALWAYS use await

ALWAYS use await when calling saveChanges():

// ✅ CORRECT - Waits for database write to complete
await this._qaContext.saveChanges();

// ❌ WRONG - Returns immediately without waiting for database write
this._qaContext.saveChanges();  // Promise is ignored - write may never complete!

Why?

  • saveChanges() is async and returns a Promise
  • Without await, code continues before database write completes
  • Causes data loss - appears successful but nothing saves to database
  • Results in "phantom saves" - data in memory but not persisted
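The race above can be reproduced with a stand-in async function (fakeSaveChanges below is illustrative, not the real API):

```javascript
// Demonstrates the fire-and-forget race: without await, the caller
// continues before the simulated database write commits.
const order = [];

async function fakeSaveChanges() {
    await Promise.resolve();          // simulate the async database write
    order.push('write committed');
}

fakeSaveChanges();                    // ❌ fire and forget - no await
order.push('continuing');             // runs BEFORE the write commits

console.log(order[0]);                // 'continuing'
// 'write committed' only lands on a later microtask tick
```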

Symptoms of missing await:

  • API returns success but data not in database
  • Queries after save return old/missing data
  • Intermittent save failures
  • Race conditions

Repository Pattern - Make Methods Async:

// ✅ CORRECT - Async method with await
async create(entity) {
    this._qaContext.Entity.add(entity);
    await this._qaContext.saveChanges();
    return entity;
}

// ❌ WRONG - Synchronous method calling async saveChanges
create(entity) {
    this._qaContext.Entity.add(entity);
    this._qaContext.saveChanges();  // No await - returns before save completes!
    return entity;  // Returns entity with undefined ID
}

3. Quick Reference Card

// Entity Creation
✅ const user = db.User.new();           // CORRECT
❌ const user = new User();              // WRONG - creates schema object

// Saving Data
✅ await db.saveChanges();               // CORRECT - waits for completion
❌ db.saveChanges();                     // WRONG - fire and forget

// Repository Methods
✅ async create(entity) {                // CORRECT - async method
      await db.saveChanges();
   }
❌ create(entity) {                      // WRONG - sync method
      db.saveChanges();                  // No await!
   }

// Querying (all require await)
✅ const users = await db.User.toList();  // CORRECT
✅ const user = await db.User.findById(1); // CORRECT
❌ const users = db.User.toList();         // WRONG - returns Promise

Installation

# Global installation (recommended for CLI)
npm install -g masterrecord

# Local installation
npm install masterrecord

# With specific database drivers
npm install masterrecord pg           # PostgreSQL
npm install masterrecord mysql2       # MySQL
npm install masterrecord better-sqlite3  # SQLite

Dependencies

MasterRecord includes the following database drivers by default:

  • pg@^8.17.2 - PostgreSQL (async)
  • mysql2@^3.11.5 - MySQL (async with connection pooling)
  • better-sqlite3@^12.6.2 - SQLite (async API wrapper for consistency)

Two Patterns: Entity Framework & Active Record

MasterRecord supports both ORM patterns - choose what feels natural:

Active Record Style (Recommended for beginners)

// Entity saves itself
const user = await db.User.findById(1);
user.name = 'Updated';
await user.save();  // ✅ Entity knows how to save

Entity Framework Style (Efficient for batch operations)

// Context saves all tracked entities
const user = await db.User.findById(1);
user.name = 'Updated';
await db.saveChanges();  // ✅ Batch save

Read more: Active Record Pattern Guide | Detached Entities Guide


Quick Start

1. Create a Context

// app/models/context.js
const context = require('masterrecord/context');
const User = require('./User');
const Post = require('./Post');

class AppContext extends context {
    constructor() {
        super();

        // Configure database connection
        this.env({
            type: 'postgres',  // or 'mysql', 'sqlite'
            host: 'localhost',
            port: 5432,
            database: 'myapp',
            user: 'postgres',
            password: 'password'
        });

        // Register entities
        this.dbset(User);
        this.dbset(Post);
    }
}

module.exports = AppContext;

2. Define Entities

// app/models/User.js
class User {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };
        this.name = { type: 'string', nullable: false };
        this.email = { type: 'string', nullable: false, unique: true };
        this.age = { type: 'integer', nullable: true };
        this.created_at = { type: 'timestamp', default: 'CURRENT_TIMESTAMP' };
    }
}

module.exports = User;

3. Run Migrations

# Enable migrations (one-time setup)
masterrecord enable-migrations AppContext

# Create initial migration
masterrecord add-migration InitialCreate AppContext

# Apply migrations
masterrecord update-database AppContext

4. Query Your Data

const AppContext = require('./app/models/context');
const db = new AppContext();

// Create (Active Record style)
const user = db.User.new();
user.name = 'Alice';
user.email = '[email protected]';
user.age = 28;
await user.save();  // Entity saves itself!

// Read with parameterized query
const alice = await db.User
    .where(u => u.email == $$, '[email protected]')
    .single();

// Update (Active Record style)
alice.age = 29;
await alice.save();  // Entity saves itself!

// Delete
db.remove(alice);
await db.saveChanges();

Database Configuration

PostgreSQL (Async)

class AppContext extends context {
    constructor() {
        super();

        this.env({
            type: 'postgres',
            host: 'localhost',
            port: 5432,
            database: 'myapp',
            user: 'postgres',
            password: 'password',
            max: 20,  // Connection pool size
            idleTimeoutMillis: 30000,
            connectionTimeoutMillis: 2000
        });

        this.dbset(User);
    }
}

// Usage requires await
const db = new AppContext();
await db.saveChanges();  // PostgreSQL is async

MySQL (Async with Connection Pooling)

class AppContext extends context {
    constructor() {
        super();

        this.env({
            type: 'mysql',
            host: 'localhost',
            port: 3306,
            database: 'myapp',
            user: 'root',
            password: 'password',
            connectionLimit: 10  // Connection pool size (optional)
        });

        this.dbset(User);
    }
}

// Usage requires await (async like PostgreSQL)
const db = new AppContext();
await db.saveChanges();  // MySQL now uses async/await

SQLite (Async API)

class AppContext extends context {
    constructor() {
        super();

        this.env({
            type: 'sqlite',
            connection: './data/myapp.db'  // File path
        });

        this.dbset(User);
    }
}

// Usage requires await for consistency across databases
const db = new AppContext();
await db.saveChanges();  // SQLite now has async API wrapper

Environment Files

Store configurations in JSON files:

// config/environments/env.development.json
{
    "type": "postgres",
    "host": "localhost",
    "port": 5432,
    "database": "myapp_dev",
    "user": "postgres",
    "password": "dev_password"
}

// Load environment file
class AppContext extends context {
    constructor() {
        super();
        this.env('config/environments');  // Loads env.<NODE_ENV>.json
        this.dbset(User);
    }
}

# Set environment
export NODE_ENV=development
node app.js

Entity Definitions

Basic Entity

class User {
    constructor() {
        // Primary key with auto-increment
        this.id = {
            type: 'integer',
            primary: true,
            auto: true
        };

        // Required string field
        this.name = {
            type: 'string',
            nullable: false
        };

        // Optional field with default
        this.status = {
            type: 'string',
            nullable: true,
            default: 'active'
        };

        // Unique constraint
        this.email = {
            type: 'string',
            unique: true
        };

        // Timestamp
        this.created_at = {
            type: 'timestamp',
            default: 'CURRENT_TIMESTAMP'
        };
    }
}

Field Types

| MasterRecord Type | PostgreSQL   | MySQL        | SQLite  |
|-------------------|--------------|--------------|---------|
| integer           | INTEGER      | INT          | INTEGER |
| bigint            | BIGINT       | BIGINT       | INTEGER |
| string            | VARCHAR(255) | VARCHAR(255) | TEXT    |
| text              | TEXT         | TEXT         | TEXT    |
| float             | REAL         | FLOAT        | REAL    |
| decimal           | DECIMAL      | DECIMAL      | REAL    |
| boolean           | BOOLEAN      | TINYINT      | INTEGER |
| date              | DATE         | DATE         | TEXT    |
| time              | TIME         | TIME         | TEXT    |
| datetime          | TIMESTAMP    | DATETIME     | TEXT    |
| timestamp         | TIMESTAMP    | TIMESTAMP    | TEXT    |
| json              | JSON         | JSON         | TEXT    |
| jsonb             | JSONB        | JSON         | TEXT    |
| uuid              | UUID         | VARCHAR(36)  | TEXT    |
| binary            | BYTEA        | BLOB         | BLOB    |

Relationships

class User {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };
        this.name = { type: 'string' };

        // One-to-many: User has many Posts
        this.Posts = {
            type: 'hasMany',
            model: 'Post',
            foreignKey: 'user_id'
        };
    }
}

class Post {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };
        this.title = { type: 'string' };
        this.user_id = { type: 'integer' };

        // Many-to-one: Post belongs to User
        this.User = {
            type: 'belongsTo',
            model: 'User',
            foreignKey: 'user_id'
        };
    }
}

Field Transformers

Store complex JavaScript types in simple database columns:

class User {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };

        // Store arrays as JSON strings
        this.tags = {
            type: 'string',
            transform: {
                toDatabase: (value) => {
                    return Array.isArray(value) ? JSON.stringify(value) : value;
                },
                fromDatabase: (value) => {
                    return value ? JSON.parse(value) : [];
                }
            }
        };
    }
}

// Usage is natural
const user = db.User.new();
user.tags = ['admin', 'moderator'];  // Assign array
await db.saveChanges();  // Stored as '["admin","moderator"]'

const loaded = await db.User.findById(user.id);
console.log(loaded.tags);  // ['admin', 'moderator'] - JavaScript array!

Querying

Basic Queries

// Find all (requires await)
const users = await db.User.toList();

// Find by primary key (requires await)
const user = await db.User.findById(123);

// Find single with where clause (requires await)
const alice = await db.User
    .where(u => u.email == $$, '[email protected]')
    .single();

// Find multiple with conditions (requires await)
const adults = await db.User
    .where(u => u.age >= $$, 18)
    .toList();

Parameterized Queries

Always use $$ placeholders for SQL injection protection:

// Single parameter (requires await)
const user = await db.User.where(u => u.id == $$, 123).single();

// Multiple parameters (requires await)
const results = await db.User
    .where(u => u.age > $$ && u.status == $$, 25, 'active')
    .toList();

// Single $ for OR conditions (requires await)
const results = await db.User
    .where(u => u.status == $ || u.status == null, 'active')
    .toList();

IN Clauses

// Array parameter with .includes() (requires await)
const ids = [1, 2, 3, 4, 5];
const users = await db.User
    .where(u => $$.includes(u.id), ids)
    .toList();

// Generated SQL: WHERE id IN ($1, $2, $3, $4, $5)
// PostgreSQL parameters: [1, 2, 3, 4, 5]

// Alternative .any() syntax (requires await)
const users = await db.User
    .where(u => u.id.any($$), [1, 2, 3])
    .toList();

// Comma-separated strings (auto-splits) (requires await)
const users = await db.User
    .where(u => u.id.any($$), "1,2,3,4,5")
    .toList();

Query Chaining

let query = db.User;

// Build query dynamically
if (searchTerm) {
    query = query.where(u => u.name.like($$), `%${searchTerm}%`);
}

if (minAge) {
    query = query.where(u => u.age >= $$, minAge);
}

// Add sorting and pagination (requires await)
const users = await query
    .orderBy(u => u.created_at)
    .skip(offset)
    .take(limit)
    .toList();

Ordering

// Ascending (requires await)
const users = await db.User
    .orderBy(u => u.name)
    .toList();

// Descending (requires await)
const users = await db.User
    .orderByDescending(u => u.created_at)
    .toList();

Pagination

// Skip 20, take 10 (requires await)
const users = await db.User
    .orderBy(u => u.id)
    .skip(20)
    .take(10)
    .toList();

// Page-based pagination (requires await)
const page = 2;
const pageSize = 10;
const users = await db.User
    .skip(page * pageSize)
    .take(pageSize)
    .toList();

Counting

// Count all (requires await)
const total = await db.User.count();

// Count with conditions (requires await)
const activeCount = await db.User
    .where(u => u.status == $$, 'active')
    .count();

Complex Queries

// Multiple conditions with OR (requires await)
const results = await db.User
    .where(u => (u.status == 'active' || u.status == 'pending') && u.age >= $$, 18)
    .orderBy(u => u.name)
    .toList();

// Nullable checks (requires await)
const usersWithoutEmail = await db.User
    .where(u => u.email == null)
    .toList();

// LIKE queries (requires await)
const matching = await db.User
    .where(u => u.name.like($$), '%john%')
    .toList();

Migrations

CLI Commands

# Enable migrations (one-time per context)
masterrecord enable-migrations AppContext

# Create a migration
masterrecord add-migration MigrationName AppContext

# Apply migrations
masterrecord update-database AppContext

# List migrations
masterrecord get-migrations AppContext

# Multi-context commands
masterrecord enable-migrations-all          # Enable for all contexts
masterrecord add-migration-all Init         # Create migration for all
masterrecord update-database-all            # Apply all pending migrations

Migration File Structure

// db/migrations/20250111_143052_CreateUser.js
const masterrecord = require('masterrecord');

class CreateUser extends masterrecord.schema {
    constructor(context) {
        super(context);
    }

    // IMPORTANT: Migrations must be async
    async up(table) {
        this.init(table);

        // Create table (requires await)
        await this.createTable(table.User);

        // Seed initial data
        this.seed('User', {
            name: 'Admin',
            email: '[email protected]',
            role: 'admin'
        });
    }

    async down(table) {
        this.init(table);

        // Rollback
        this.dropTable(table.User);
    }
}

module.exports = CreateUser;

Migration Operations

class MyMigration extends masterrecord.schema {
    async up(table) {
        this.init(table);

        // Create table (requires await)
        await this.createTable(table.User);

        // Add column
        this.addColumn({
            tableName: 'User',
            name: 'phone',
            type: 'string'
        });

        // Alter column
        this.alterColumn({
            tableName: 'User',
            table: {
                name: 'age',
                type: 'integer',
                nullable: false,
                default: 0
            }
        });

        // Rename column
        this.renameColumn({
            tableName: 'User',
            name: 'old_name',
            newName: 'new_name'
        });

        // Drop column
        this.dropColumn({
            tableName: 'User',
            name: 'deprecated_field'
        });

        // Drop table
        this.dropTable(table.OldTable);
    }

    async down(table) {
        this.init(table);
        // Reverse operations
    }
}

module.exports = MyMigration;

Seed Data

module.exports = {
    up: function(table, schema) {
        schema.createTable(table.User);

        // Single record
        schema.seed('User', {
            name: 'Admin',
            email: '[email protected]'
        });

        // Multiple records (efficient bulk insert)
        schema.bulkSeed('User', [
            { name: 'Alice', email: '[email protected]', age: 25 },
            { name: 'Bob', email: '[email protected]', age: 30 },
            { name: 'Charlie', email: '[email protected]', age: 35 }
        ]);
    },

    down: function(table, schema) {
        schema.dropTable(table.User);
    }
};

Seed data is idempotent - re-running migrations won't create duplicates:

  • SQLite: INSERT OR IGNORE
  • MySQL: INSERT IGNORE
  • PostgreSQL: INSERT ... ON CONFLICT DO NOTHING
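The same guarantee can be expressed in a few lines of plain JavaScript: a seed keyed on the unique column is a no-op when that key already exists, mirroring "ON CONFLICT DO NOTHING" (seedOnce and the Map-backed table below are illustrative stand-ins, not masterrecord APIs):

```javascript
// Idempotent seeding sketch: re-running the same seed never duplicates rows.
function seedOnce(table, uniqueKey, row) {
    if (table.has(row[uniqueKey])) return false;  // conflict -> do nothing
    table.set(row[uniqueKey], row);               // first run inserts
    return true;
}

const users = new Map();
seedOnce(users, 'email', { email: 'admin', name: 'Admin' });  // inserted
seedOnce(users, 'email', { email: 'admin', name: 'Admin' });  // ignored
console.log(users.size);  // 1
```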

Advanced Features

Type Validation

MasterRecord validates and coerces field types at runtime:

const user = db.User.new();
user.age = "25";  // String assigned to integer field
await db.saveChanges();
// ⚠️  Console: Auto-converting string "25" to integer 25

user.age = "invalid";
await db.saveChanges();
// ❌ Error: Field User.age must be an integer, got string "invalid"

Field Transformers (Advanced)

class Post {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };

        // Store array as JSON
        this.tags = {
            type: 'string',
            transform: {
                toDatabase: (v) => Array.isArray(v) ? JSON.stringify(v) : v,
                fromDatabase: (v) => v ? JSON.parse(v) : []
            }
        };

        // PostgreSQL JSONB (native JSON support)
        this.metadata = {
            type: 'jsonb',  // PostgreSQL only
            transform: {
                toDatabase: (v) => JSON.stringify(v || {}),
                fromDatabase: (v) => typeof v === 'string' ? JSON.parse(v) : v
            }
        };
    }
}

Table Prefixes

Useful for multi-tenant applications or plugin systems:

class AppContext extends context {
    constructor() {
        super();

        this.tablePrefix = 'myapp_';  // Set before dbset()
        this.env('config/environments');

        this.dbset(User);  // Creates table: myapp_User
        this.dbset(Post);  // Creates table: myapp_Post
    }
}

Transactions (PostgreSQL)

const { PostgresSyncConnect } = require('masterrecord/postgresSyncConnect');

const connection = new PostgresSyncConnect();
await connection.connect(config);

const result = await connection.transaction(async (client) => {
    // Insert user
    const userResult = await client.query(
        'INSERT INTO User (name, email) VALUES ($1, $2) RETURNING id',
        ['Alice', '[email protected]']
    );

    // Insert related record
    await client.query(
        'INSERT INTO Profile (user_id, bio) VALUES ($1, $2)',
        [userResult.rows[0].id, 'Software Engineer']
    );

    return userResult.rows[0].id;
});

// Automatically commits on success, rolls back on error

Query Result Caching

MasterRecord includes a production-grade two-level caching system similar to Entity Framework and Hibernate. The cache dramatically improves performance by storing query results and automatically invalidating them when data changes.

How It Works

┌─────────────────────────────────────────────────────┐
│              First-Level Cache (Identity Map)       │
│  - Request-scoped entity tracking                   │
│  - O(1) entity lookup                               │
│  - Already in MasterRecord                          │
└─────────────────────────────────────────────────────┘
                       ▼
┌─────────────────────────────────────────────────────┐
│       Second-Level Cache (Query Result Cache)       │
│  - Application-wide query result storage            │
│  - Automatic invalidation on data changes           │
│  - In-memory (development) or Redis (production)    │
└─────────────────────────────────────────────────────┘

Basic Usage (Opt-In, Request-Scoped)

Caching is opt-in and request-scoped like Active Record. Use .cache() to enable caching, and call endRequest() to clear:

const db = new AppContext();

// DEFAULT: No caching (always hits database)
const user = await db.User.findById(1);  // DB query
const user2 = await db.User.findById(1);  // DB query again (no cache)

// OPT-IN: Enable caching with .cache()
const categories = await db.Categories.cache().toList();  // DB query, cached
const categories2 = await db.Categories.cache().toList();  // Cache hit! (instant)

// Update invalidates cache automatically
const cat = await db.Categories.findById(1);
cat.name = "Updated";
await db.saveChanges();  // Cache for Categories table cleared

// End request (clears cache - like Active Record)
db.endRequest();  // Cache cleared for next request

Web Application Pattern (Recommended):

// Express middleware - automatic request-scoped caching
app.use((req, res, next) => {
    req.db = new AppContext();

    // Clear cache when response finishes (like Active Record)
    res.on('finish', () => {
        req.db.endRequest();  // Clears query cache
    });

    next();
});

// In your routes
app.get('/categories', async (req, res) => {
    // Cache is fresh for this request
    const categories = await req.db.Categories.cache().toList();
    res.json(categories);
    // Cache auto-cleared after response
});

Configuration

Configure caching via environment variables:

# Development (.env)
QUERY_CACHE_TTL=5000               # TTL in milliseconds (5000ms = 5 seconds - request-scoped)
QUERY_CACHE_SIZE=1000              # Max cache entries (default: 1000)
QUERY_CACHE_ENABLED=true           # Enable/disable globally (default: true)

# Production (.env)
QUERY_CACHE_TTL=5                  # Redis uses seconds (5 seconds default)
REDIS_URL=redis://localhost:6379  # Use Redis for distributed caching

Note:

  • Cache is opt-in per query using .cache()
  • Default TTL is 5 seconds (request-scoped like Active Record)
  • Call db.endRequest() to clear cache manually (recommended in middleware)
  • Environment variables control the cache system globally

Enable Caching for Specific Queries

Use .cache() for frequently accessed, rarely changed data:

// DEFAULT: Always hits database (safe)
const liveData = await db.Analytics
    .where(a => a.date == $$, today)
    .toList();  // No caching (default)

// OPT-IN: Cache reference data
const categories = await db.Categories.cache().toList();  // Cached for 5 seconds (default TTL)
const settings = await db.Settings.cache().toList();  // Cached
const countries = await db.Countries.cache().toList();  // Cached

// When to use .cache():
// ✅ Reference data (categories, settings, countries)
// ✅ Rarely changing data (roles, permissions)
// ✅ Expensive aggregations with stable results
// ❌ User-specific data
// ❌ Real-time data
// ❌ Financial/critical data

Manual Cache Control

const db = new AppContext();

// Check cache performance
const stats = db.getCacheStats();
console.log(stats);
// {
//   size: 45,
//   maxSize: 1000,
//   hits: 234,
//   misses: 67,
//   hitRate: '77.74%',
//   enabled: true
// }

// Clear cache manually
db.clearQueryCache();

// Disable caching temporarily
db.setQueryCacheEnabled(false);
const freshData = await db.User.toList();
db.setQueryCacheEnabled(true);

Redis-Based Distributed Caching (Production)

For multi-process or clustered deployments, use Redis:

const redis = require('redis');
const RedisQueryCache = require('masterrecord/Cache/RedisQueryCache');

class AppContext extends context {
    constructor() {
        super();

        // Use Redis cache in production
        if (process.env.NODE_ENV === 'production' && process.env.REDIS_URL) {
            const redisClient = redis.createClient(process.env.REDIS_URL);
            this._queryCache = new RedisQueryCache(redisClient, {
                ttl: 300,  // 5 minutes (seconds for Redis)
                prefix: 'myapp:'
            });
        }
        // In-memory cache used automatically in development

        this.dbset(User);
    }
}

Benefits of Redis cache:

  • Shared across processes (horizontally scalable)
  • Pub/sub invalidation (cache stays consistent)
  • Two-level cache (L1 in-memory + L2 Redis)
  • Automatic failover to database on Redis errors
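The read path of such a two-level cache can be sketched as follows (hypothetical shape; RedisQueryCache's actual internals may differ, and redisLike stands in for a real Redis client):

```javascript
// Two-level read-through: L1 in-process Map, L2 shared store, then database.
async function readThrough(l1, redisLike, key, loadFromDb) {
    if (l1.has(key)) return l1.get(key);               // L1 hit: in-process map
    const hit = await redisLike.get(key);              // L2: shared Redis
    if (hit != null) {
        const value = JSON.parse(hit);
        l1.set(key, value);                            // promote to L1
        return value;
    }
    const rows = await loadFromDb();                   // both miss -> database
    l1.set(key, rows);
    await redisLike.set(key, JSON.stringify(rows));
    return rows;
}

// Usage with an in-memory stand-in for Redis:
const l1 = new Map();
const store = new Map();
const redisLike = {
    get: async (k) => store.get(k) ?? null,
    set: async (k, v) => { store.set(k, v); }
};
let dbCalls = 0;
(async () => {
    await readThrough(l1, redisLike, 'q1', async () => { dbCalls++; return [1]; });
    await readThrough(l1, redisLike, 'q1', async () => { dbCalls++; return [1]; });
    console.log(dbCalls);  // 1 - the second read is served from cache
})();
```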

Cache Invalidation Strategy

MasterRecord automatically invalidates cache entries when data changes:

// Query with caching enabled
const categories = await db.Categories.cache().toList();  // DB query, cached

// Any modification to Categories table invalidates ALL cached Category queries
const cat = await db.Categories.findById(1);
cat.name = "Updated";
await db.saveChanges();  // Invalidates all cached Categories queries

// Next cached query hits database (fresh data)
const categoriesAgain = await db.Categories.cache().toList();  // DB query (cache cleared)

// Non-cached queries are unaffected (always fresh)
const users = await db.User.toList();  // No .cache() = always DB query

// Queries for OTHER tables' caches are unaffected
const settings = await db.Settings.cache().toList();  // Still cached (different table)

Invalidation rules:

  • INSERT invalidates all queries for that table
  • UPDATE invalidates all queries for that table
  • DELETE invalidates all queries for that table
  • Queries for other tables are not affected
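These rules amount to keying cached results by table and dropping a table's bucket on any write. An illustrative sketch (not the actual cache implementation):

```javascript
// Table-keyed invalidation: a write to one table clears only that table's bucket.
class TableKeyedCache {
    constructor() { this.byTable = new Map(); }        // table -> Map(sql -> rows)
    get(table, sql) { return this.byTable.get(table)?.get(sql); }
    set(table, sql, rows) {
        if (!this.byTable.has(table)) this.byTable.set(table, new Map());
        this.byTable.get(table).set(sql, rows);
    }
    invalidate(table) { this.byTable.delete(table); }  // any INSERT/UPDATE/DELETE
}

const cache = new TableKeyedCache();
cache.set('Categories', 'SELECT * FROM Categories', [{ id: 1 }]);
cache.set('Settings', 'SELECT * FROM Settings', [{ key: 'theme' }]);

cache.invalidate('Categories');   // write to Categories clears only that bucket
console.log(cache.get('Categories', 'SELECT * FROM Categories'));     // undefined
console.log(cache.get('Settings', 'SELECT * FROM Settings').length);  // 1
```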

Performance Impact

Expected performance improvements:

| Scenario                    | Without Cache   | With Cache       | Improvement  |
|-----------------------------|-----------------|------------------|--------------|
| Single query (100 calls)    | 100 DB queries  | 1 DB + 99 cache  | 99% faster   |
| List query (50 calls)       | 50 DB queries   | 1 DB + 49 cache  | 98% faster   |
| Reference data (1000 calls) | 1000 DB queries | 1 DB + 999 cache | 99.9% faster |
| Mixed operations            | Baseline        | 70-90% hit rate  | 3-10x faster |

Memory usage: ~1KB per cached query (1000 entries ≈ 1MB)

Best Practices

DO use .cache():

// Reference data (rarely changes)
const categories = await db.Categories.cache().toList();
const settings = await db.Settings.cache().toList();
const countries = await db.Countries.cache().toList();

// Expensive aggregations (stable results)
const totalRevenue = await db.Orders
    .where(o => o.year == $$, 2024)
    .cache()
    .count();

DON'T use .cache():

// User-specific data (default is safe - no caching)
const user = await db.User.findById(userId);  // Always fresh

// Real-time data (default is safe)
const liveOrders = await db.Orders
    .where(o => o.status == $$, 'pending')
    .toList();  // Always fresh

// Financial transactions (default is safe)
const balance = await db.Transactions
    .where(t => t.user_id == $$, userId)
    .toList();  // Always fresh

// User-specific sensitive data (default is safe)
const permissions = await db.UserPermissions
    .where(p => p.user_id == $$, userId)
    .toList();  // Always fresh

Monitoring Cache Performance

// Log cache stats periodically
setInterval(() => {
    const stats = db.getCacheStats();
    console.log(`Cache: ${stats.hitRate} hit rate, ${stats.size}/${stats.maxSize} entries`);
}, 60000);

// Watch for low hit rates (< 50% might indicate poor cache strategy)
const latest = db.getCacheStats();
if (parseFloat(latest.hitRate) < 50) {
    console.warn('Cache hit rate is low, consider tuning cache TTL or size');
}

Request-Scoped Caching (Like Active Record)

MasterRecord's caching is designed to work like Active Record - cache within a request, clear after:

// Express middleware pattern (recommended)
app.use((req, res, next) => {
    req.db = new AppContext();

    // Automatically clear cache when request ends
    res.on('finish', () => {
        req.db.endRequest();  // Like Active Record's cache clearing
    });

    next();
});

// In routes - cache is fresh per request
app.get('/api/categories', async (req, res) => {
    // First call in this request - DB query
    const categories = await req.db.Categories.cache().toList();

    // Second call in same request - cache hit
    const categoriesAgain = await req.db.Categories.cache().toList();

    res.json(categories);
    // After response, cache is automatically cleared
});

// Next request starts with empty cache (fresh)

Why request-scoped?

  • ✅ Like Active Record - familiar pattern
  • ✅ No stale data across requests
  • ✅ Cache only lives during request processing
  • ✅ Automatic cleanup

Important: Shared Cache Behavior

The cache is shared across all context instances of the same class. This ensures consistency within a request:

const db1 = new AppContext();
const db2 = new AppContext();

// Context 1: Cache data with .cache()
const categories1 = await db1.Categories.cache().toList();  // DB query, cached

// Context 2: Sees cached data
const categories2 = await db2.Categories.cache().toList();  // Cache hit!

// Context 2: Updates invalidate cache for BOTH contexts
const cat = await db2.Categories.findById(1);
cat.name = "Updated";
await db2.saveChanges();  // Invalidates shared cache

// Context 1: Sees fresh data
const categories3 = await db1.Categories.cache().toList();  // Cache miss, fresh data
console.log(categories3[0].name);  // "Updated"

Why shared cache?

  • ✅ Prevents stale data across multiple context instances
  • ✅ Ensures all parts of your application see consistent data
  • ✅ Reduces memory usage (one cache instead of many)
  • ✅ Correct behavior for single-database applications (most use cases)

Multi-Context Applications

Manage multiple databases in one application:

// contexts/userContext.js
class UserContext extends context {
    constructor() {
        super();
        this.env({ type: 'postgres', database: 'users_db', ... });
        this.dbset(User);
        this.dbset(Profile);
    }
}

// contexts/analyticsContext.js
class AnalyticsContext extends context {
    constructor() {
        super();
        this.env({ type: 'postgres', database: 'analytics_db', ... });
        this.dbset(Event);
        this.dbset(Metric);
    }
}

// Usage
const userDb = new UserContext();
const analyticsDb = new AnalyticsContext();

const user = await userDb.User.findById(123);
const event = analyticsDb.Event.new();
event.log('user_login', user.id);
await analyticsDb.saveChanges();

# Migrate all contexts at once
masterrecord update-database-all

Raw SQL Queries

When you need full control:

// ⚠️ Advanced: Direct SQL execution (using internal API)
// For complex queries not supported by the query builder
// Note: This is an internal API. Prefer using the query builder when possible.

// PostgreSQL parameterized query
const users = await db._SQLEngine.exec(
    'SELECT * FROM "User" WHERE age > $1 AND status = $2',
    [25, 'active']
);

// MySQL parameterized query
const mysqlUsers = await db._SQLEngine.exec(
    'SELECT * FROM User WHERE age > ? AND status = ?',
    [25, 'active']
);

API Reference

Context Methods

// Entity registration
context.dbset(EntityClass)
context.dbset(EntityClass, 'custom_table_name')

// Save changes (all databases now async)
await context.saveChanges()  // PostgreSQL, MySQL, SQLite (all async)

// Add/Remove entities
context.EntityName.add(entity)
context.remove(entity)

// Attach detached entities (like Entity Framework's Update())
context.attach(entity)                        // Attach and mark as modified
context.attach(entity, { field: value })      // Attach with specific changes
context.attachAll([entity1, entity2])         // Attach multiple entities
await context.update('Entity', id, changes)   // Update by primary key

// Cache management
context.getCacheStats()              // Get cache statistics
context.clearQueryCache()            // Clear all cached queries
context.endRequest()                 // End request and clear cache (like Active Record)
context.setQueryCacheEnabled(bool)   // Enable/disable caching
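
One way the cache-management methods above fit together in an application lifecycle. This is a sketch: `AppContext` is stubbed so the snippet runs standalone; in a real app it is your MasterRecord context class and the methods behave as listed above.

```javascript
// Stubbed context so the sketch is self-contained (NOT MasterRecord internals).
class AppContext {
    constructor() { this.enabled = true; this.entries = 42; }
    setQueryCacheEnabled(on) { this.enabled = on; }
    clearQueryCache() { this.entries = 0; }
    getCacheStats() { return { hitRate: '85%', size: this.entries, maxSize: 1000 }; }
}

const db = new AppContext();

// Deterministic tests: bypass the cache entirely
if (process.env.NODE_ENV === 'test') db.setQueryCacheEnabled(false);

// After a deploy that changes reference data, drop any warm entries
db.clearQueryCache();

const { hitRate, size, maxSize } = db.getCacheStats();
console.log(`cache: ${hitRate} hit rate, ${size}/${maxSize} entries`);
```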

Query Methods

// Chainable query builders (do not execute query)
.where(query, ...params)         // Add WHERE condition
.and(query, ...params)           // Add AND condition
.orderBy(field)                  // Sort ascending
.orderByDescending(field)        // Sort descending
.skip(number)                    // Skip N records
.take(number)                    // Limit to N records
.include(relationship)           // Eager load
.cache()                         // Enable caching for this query (opt-in)

// Terminal methods (execute query - ALL REQUIRE AWAIT)
await .toList()                  // Return array of all records
await .single()                  // Return one or null
await .first()                   // Return first or null
await .count()                   // Return count
await .any()                     // Return boolean

// Convenience methods (REQUIRE AWAIT)
await .findById(id)              // Find by primary key
.new()                           // Create new entity instance (synchronous)

// Entity methods (Active Record style - REQUIRE AWAIT)
await entity.save()              // Save this entity (and all tracked changes)
await entity.delete()            // Delete this entity
await entity.reload()            // Reload from database, discarding changes
entity.clone()                   // Create a copy for duplication (synchronous)
entity.toObject(options)         // Convert to plain JavaScript object (synchronous)
entity.toJSON()                  // JSON.stringify compatibility (synchronous)
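
The chainable/terminal split above means builders only accumulate state and nothing touches the database until a terminal method runs. The toy builder below illustrates that contract generically; it is not MasterRecord's implementation, just a minimal sketch of the pattern.

```javascript
// Toy query builder: chainable methods record state, the terminal method executes.
class ToyQuery {
    constructor(rows) {
        this.rows = rows;
        this.filters = [];
        this.limit = Infinity;
    }
    where(fn) { this.filters.push(fn); return this; }  // chainable: no execution
    take(n) { this.limit = n; return this; }           // chainable: no execution
    async toList() {                                   // terminal: actually runs
        return this.rows
            .filter(row => this.filters.every(f => f(row)))
            .slice(0, this.limit);
    }
}

(async () => {
    const q = new ToyQuery([{ age: 20 }, { age: 30 }, { age: 40 }]);
    const results = await q.where(r => r.age >= 30).take(1).toList();
    console.log(results);  // [ { age: 30 } ]
})();
```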

Entity Serialization

.toObject(options)

Convert a MasterRecord entity to a plain JavaScript object, removing all internal properties and handling circular references automatically.

Parameters:

  • options.includeRelationships (boolean, default: true) - Include related entities
  • options.depth (number, default: 1) - Maximum depth for relationship traversal

Examples:

// Basic usage - get plain object
const user = await db.User.findById(1);
const plain = user.toObject();
console.log(plain);
// { id: 1, name: 'Alice', email: '[email protected]', age: 28 }

// Include relationships
const userWithPosts = user.toObject({ includeRelationships: true });
console.log(userWithPosts);
// {
//   id: 1,
//   name: 'Alice',
//   Posts: [
//     { id: 10, title: 'First Post', content: '...' },
//     { id: 11, title: 'Second Post', content: '...' }
//   ]
// }

// Control relationship depth
const deep = user.toObject({ includeRelationships: true, depth: 3 });

// Exclude relationships
const shallow = user.toObject({ includeRelationships: false });

Circular Reference Protection:

.toObject() automatically prevents infinite loops from circular references:

// Scenario: User → Posts → User creates a cycle
const user = await db.User.findById(1);
await user.Posts;  // Load posts relationship

const plain = user.toObject({ includeRelationships: true, depth: 2 });
// Circular references marked as:
// { __circular: true, __entityName: 'User', id: 1 }

Why It's Needed:

MasterRecord entities have internal properties that cause JSON.stringify() to fail:

const user = await db.User.findById(1);

// ❌ FAILS: TypeError: Converting circular structure to JSON
JSON.stringify(user);

// ✅ WORKS: Use toObject() or toJSON()
const plain = user.toObject();
JSON.stringify(plain);  // Success!

.toJSON()

Used automatically by JSON.stringify() and Express res.json(). Returns the same as .toObject({ includeRelationships: false }).

Examples:

// JSON.stringify automatically calls toJSON()
const user = await db.User.findById(1);
const json = JSON.stringify(user);
console.log(json);
// '{"id":1,"name":"Alice","email":"[email protected]"}'

// Express automatically uses toJSON()
app.get('/api/users/:id', async (req, res) => {
    const user = await db.User.findById(req.params.id);
    res.json(user);  // ✅ Works automatically!
});

// Array of entities
app.get('/api/users', async (req, res) => {
    const users = await db.User.toList();
    res.json(users);  // ✅ Each entity's toJSON() called automatically
});

Entity Instance Methods

.delete()

Delete an entity without manually calling context.remove() and context.saveChanges().

Example:

// Before
const user = await db.User.findById(1);
db.remove(user);
await db.saveChanges();

// After (Active Record style)
const user = await db.User.findById(1);
await user.delete();  // ✅ Entity deletes itself

Cascade Deletion:

If your entity has cascade delete rules, they will be applied automatically:

class User {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };

        // Posts will be deleted when user is deleted
        this.Posts = {
            type: 'hasMany',
            model: 'Post',
            foreignKey: 'user_id',
            cascade: true  // Enable cascade delete
        };
    }
}

const user = await db.User.findById(1);
await user.delete();  // ✅ Also deletes related Posts automatically

.reload()

Refresh an entity from the database, discarding any unsaved changes.

Example:

const user = await db.User.findById(1);
console.log(user.name);  // 'Alice'

user.name = 'Modified';
console.log(user.name);  // 'Modified'

await user.reload();  // ✅ Fetch fresh data from database
console.log(user.name);  // 'Alice' - changes discarded

Use Cases:

  • Discard unsaved changes
  • Refresh stale data after external updates
  • Synchronize after concurrent modifications
  • Reset entity to clean state

.clone()

Create a copy of an entity for duplication (primary key excluded).

Example:

const user = await db.User.findById(1);
const duplicate = user.clone();

duplicate.name = 'Copy of ' + user.name;
duplicate.email = '[email protected]';

await duplicate.save();
console.log(duplicate.id);  // ✅ New ID (different from original)

Notes:

  • Primary key is automatically excluded
  • Relationships are not cloned (set manually if needed)
  • Useful for templates and duplicating records

Query Helper Methods

.first()

Get the first record ordered by primary key.

Example:

// Automatically orders by primary key
const firstUser = await db.User.first();

// With custom order (respects existing orderBy)
const newestUser = await db.User
    .orderByDescending(u => u.created_at)
    .first();

// With conditions
const firstActive = await db.User
    .where(u => u.status == $$, 'active')
    .first();

.last()

Get the last record ordered by primary key (descending).

Example:

const lastUser = await db.User.last();

// With custom order
const oldestUser = await db.User
    .orderBy(u => u.created_at)
    .last();

.exists()

Check if any records match the query (returns boolean).

Example:

// Before
const count = await db.User
    .where(u => u.email == $$, '[email protected]')
    .count();
const exists = count > 0;

// After
const exists = await db.User
    .where(u => u.email == $$, '[email protected]')
    .exists();

if (exists) {
    throw new Error('Email already registered');
}

// Check if any users exist
const hasUsers = await db.User.exists();
if (!hasUsers) {
    // Create default admin user
}

.pluck(fieldName)

Extract a single column as an array.

Example:

// Get all active user emails
const emails = await db.User
    .where(u => u.status == $$, 'active')
    .pluck('email');
console.log(emails);
// ['[email protected]', '[email protected]', '[email protected]']

// Get all user IDs
const ids = await db.User.pluck('id');
console.log(ids);  // [1, 2, 3, 4, 5]

// With sorting
const recentEmails = await db.User
    .orderByDescending(u => u.created_at)
    .take(10)
    .pluck('email');

Lifecycle Hooks

Add lifecycle hooks to your entity definitions to execute logic before/after database operations.

Available Hooks:

  • beforeSave() - Execute before insert or update
  • afterSave() - Execute after insert or update
  • beforeDelete() - Execute before deletion
  • afterDelete() - Execute after deletion

Example:

const bcrypt = require('bcrypt');

class User {
    constructor() {
        this.id = { type: 'integer', primary: true, auto: true };
        this.email = { type: 'string' };
        this.password = { type: 'string' };
        this.created_at = { type: 'timestamp' };
        this.updated_at = { type: 'timestamp' };
        this.role = { type: 'string' };
    }

    // Hash password and set timestamps before saving
    // (a class can only have one beforeSave hook; combine all pre-save logic here)
    beforeSave() {
        // Only hash if password was changed
        if (this.__dirtyFields.includes('password')) {
            this.password = bcrypt.hashSync(this.password, 10);
        }

        // Set timestamps automatically
        if (this.__state === 'insert') {
            this.created_at = new Date();
        }
        this.updated_at = new Date();
    }

    // Log after successful save
    afterSave() {
        console.log(`User ${this.id} saved successfully`);
    }

    // Prevent deleting admin users
    beforeDelete() {
        if (this.role === 'admin') {
            throw new Error('Cannot delete admin user');
        }
    }

    // Cleanup related data after deletion
    async afterDelete() {
        console.log(`User ${this.id} deleted, cleaning up related data...`);
        // Cleanup logic here (e.g., delete user files, clear cache)
    }
}

Usage:

// Hooks execute automatically during save
const user = db.User.new();
user.email = '[email protected]';
user.password = 'plain-text-password';
await user.save();
// ✅ beforeSave() hashes password automatically
// ✅ afterSave() logs success message

// Load and update (new variable name to avoid redeclaring `user` above)
const existing = await db.User.findById(1);
existing.email = '[email protected]';
await existing.save();
// ✅ beforeSave() sets updated_at timestamp
// ✅ Password not re-hashed (not in dirtyFields)

// Hooks can prevent operations
const admin = await db.User.where(u => u.role == $$, 'admin').single();
try {
    await admin.delete();
} catch (error) {
    console.log(error.message);  // "Cannot delete admin user"
}
// ✅ beforeDelete() prevented deletion

Hook Execution Order:

// Insert:
// 1. beforeSave()
// 2. SQL INSERT
// 3. afterSave()

// Update:
// 1. beforeSave()
// 2. SQL UPDATE
// 3. afterSave()

// Delete:
// 1. beforeDelete()
// 2. SQL DELETE
// 3. afterDelete()

Notes:

  • Hooks can be async (use async keyword)
  • Exceptions in before* hooks prevent the operation
  • Hooks execute for each entity during batch operations
  • Access entity state via this.__state ('insert', 'modified', 'delete')
  • Access changed fields via this.__dirtyFields array

Field Constraints & Indexes

Define database constraints and performance indexes using the fluent API:

class User {
    id(db) {
        db.integer().primary().auto();
    }

    email(db) {
        db.string()
          .notNullable()
          .unique()
          .index();  // Creates performance index
    }

    username(db) {
        db.string()
          .notNullable()
          .index('idx_username_custom');  // Custom index name
    }

    status(db) {
        db.string().nullable();
    }

    created_at(db) {
        db.timestamp().default('CURRENT_TIMESTAMP');
    }
}

Available Constraint Methods

  • .notNullable() - Column cannot be NULL
  • .nullable() - Column can be NULL (default)
  • .unique() - Unique constraint (enforces uniqueness at DB level)
  • .index() - Creates performance index (auto-generated name: idx_tablename_columnname)
  • .index('custom_name') - Creates index with custom name
  • .primary() - Primary key (automatically indexed)
  • .default(value) - Default value

Index vs Unique Constraint

Understanding the difference:

  • .unique() creates a UNIQUE constraint (prevents duplicate values, enforces data integrity)
  • .index() creates a performance index (improves query speed, allows duplicates)
  • You can use both together: .unique().index() creates a unique index for both integrity and performance

Examples:

// Email must be unique (no performance index)
email(db) {
    db.string().notNullable().unique();
}

// Username indexed for fast lookups (allows duplicates)
username(db) {
    db.string().notNullable().index();
}

// Email with both unique constraint AND performance index
email(db) {
    db.string().notNullable().unique().index();
}

Automatic Index Migration

When you add .index() to a field, MasterRecord automatically generates migration code:

// In your entity
class User {
    email(db) {
        db.string().notNullable().index();
    }
}

// Generated migration (automatic)
class Migration_20250101 extends masterrecord.schema {
    async up(table) {
        this.init(table);
        this.createIndex({
            tableName: 'User',
            columnName: 'email',
            indexName: 'idx_user_email'
        });
    }

    async down(table) {
        this.init(table);
        this.dropIndex({
            tableName: 'User',
            columnName: 'email',
            indexName: 'idx_user_email'
        });
    }
}

Rollback support:

Migrations automatically include rollback logic. Running masterrecord migrate down will drop all indexes created by that migration.


Composite Indexes

Create multi-column indexes for queries that filter or sort on multiple columns together.

API - Two Ways to Define

Option A: Entity Class (Recommended for core indexes)

class CreditLedger {
    id(db) {
        db.integer().primary().auto();
    }

    organization_id(db) {
        db.integer().notNullable();
    }

    created_at(db) {
        db.timestamp().default('CURRENT_TIMESTAMP');
    }

    resource_type(db) {
        db.string().notNullable();
    }

    resource_id(db) {
        db.integer().notNullable();
    }

    // Define composite indexes in entity
    static compositeIndexes = [
        // Simple array - auto-generates name
        ['organization_id', 'created_at'],
        ['resource_type', 'resource_id'],

        // With custom name
        {
            columns: ['status', 'created_at'],
            name: 'idx_status_timeline'
        },

        // Unique composite index
        {
            columns: ['email', 'tenant_id'],
            unique: true
        }
    ];
}

Option B: Context-Level (For environment-specific or centralized schema)

class AppContext extends context {
    onConfig() {
        this.dbset(CreditLedger);

        // Define composite indexes in context
        this.compositeIndex(CreditLedger, ['organization_id', 'created_at']);
        this.compositeIndex(CreditLedger, ['resource_type', 'resource_id']);
        this.compositeIndex(CreditLedger, ['status', 'created_at'], {
            name: 'idx_status_timeline'
        });
        this.compositeIndex(CreditLedger, ['email', 'tenant_id'], {
            unique: true
        });

        // Can also use table name as string
        this.compositeIndex('CreditLedger', ['user_id', 'created_at']);
    }
}

Combined Usage (Best of Both)

class User {
    email(db) { db.string(); }
    tenant_id(db) { db.integer(); }
    last_name(db) { db.string(); }
    first_name(db) { db.string(); }

    // Core indexes in entity
    static compositeIndexes = [
        ['last_name', 'first_name']
    ];
}

class AppContext extends context {
    onConfig() {
        this.dbset(User);

        // Add tenant-specific index for multi-tenant deployments
        if (process.env.MULTI_TENANT === 'true') {
            this.compositeIndex(User, ['tenant_id', 'email'], { unique: true });
        }

        // Add performance index for production
        if (process.env.NODE_ENV === 'production') {
            this.compositeIndex(User, ['tenant_id', 'last_name']);
        }
    }
}

When to Use Composite Indexes

Composite indexes are most effective for queries that:

  1. Filter on multiple columns: WHERE org_id = ? AND status = ?
  2. Filter and sort: WHERE status = ? ORDER BY created_at
  3. Enforce uniqueness: Unique constraint on multiple columns together

Example queries that benefit:

// Benefits from composite index (organization_id, created_at)
const ledger = await db.CreditLedger
    .where(c => c.organization_id == $$, orgId)
    .orderBy(c => c.created_at)
    .toList();

// Benefits from composite index (resource_type, resource_id)
const entry = await db.CreditLedger
    .where(c => c.resource_type == $$ && c.resource_id == $$, 'Order', 123)
    .single();

Column Order Matters

The order of columns in a composite index affects query performance:

static compositeIndexes = [
    // Index: (status, created_at)
    ['status', 'created_at']
];

// ✅ FAST: Uses index efficiently
// WHERE status = ? ORDER BY created_at
await db.Orders
    .where(o => o.status == $$, 'pending')
    .orderBy(o => o.created_at)
    .toList();

// ⚠️ SLOWER: Can only use first column
// WHERE created_at > ?
await db.Orders
    .where(o => o.created_at > $$, yesterday)
    .toList();

Rule of thumb: Put the most selective (filtered) columns first, then sort columns.

Automatic Migration Generation

// Your entity definition triggers migration
class CreditLedger {
    organization_id(db) { db.integer(); }
    created_at(db) { db.timestamp(); }

    static compositeIndexes = [
        ['organization_id', 'created_at']
    ];
}

// Generated migration (automatic)
class Migration_20250101 extends masterrecord.schema {
    async up(table) {
        this.init(table);
        this.createCompositeIndex({
            tableName: 'CreditLedger',
            columns: ['organization_id', 'created_at'],
        indexName: 'idx_creditledger_organization_id_created_at',
            unique: false
        });
    }

    async down(table) {
        this.init(table);
        this.dropCompositeIndex({
            tableName: 'CreditLedger',
            columns: ['organization_id', 'created_at'],
        indexName: 'idx_creditledger_organization_id_created_at',
            unique: false
        });
    }
}

Single vs Composite Indexes

class User {
    email(db) {
        db.string().index();  // Single-column index
    }

    first_name(db) {
        db.string();  // Part of composite below
    }

    last_name(db) {
        db.string();  // Part of composite below
    }

    static compositeIndexes = [
        // Composite index for name lookups
        ['last_name', 'first_name']
    ];
}

When to use single vs composite:

  • Single index: Column queried independently (WHERE email = ?)
  • Composite index: Columns queried together (WHERE last_name = ? AND first_name = ?)

Seed Data

Define seed data in your context file that automatically generates migration code using the ORM.

Context-Level Seed API (Recommended)

class AppContext extends context {
    onConfig() {
        // Single seed record
        this.dbset(User).seed({
            user_name: 'admin',
            first_name: 'System',
            last_name: 'Administrator',
            email: '[email protected]',
            system_role: 'system_admin',
            admin_type: 'engineering',
            onboarding_completed: 1,
            availability_status: 'online'
        });

        // Chain multiple records
        this.dbset(Post)
            .seed({ title: 'Welcome', content: 'Hello world', author_id: 1 })
            .seed({ title: 'Getting Started', content: 'Tutorial', author_id: 1 });

        // Bulk seed with array
        this.dbset(Category).seed([
            { name: 'Technology', slug: 'tech' },
            { name: 'Business', slug: 'biz' },
            { name: 'Science', slug: 'science' }
        ]);
    }
}

Automatic Migration Generation

When you define seed data in the context, MasterRecord generates migration code using the ORM:

// Your context definition triggers this migration
class Migration_20250205_123456 extends masterrecord.schema {
    async up(table) {
        this.init(table);

        // Generated ORM create calls
        await table.User.create({
            user_name: 'admin',
            first_name: 'System',
            last_name: 'Administrator',
            email: '[email protected]',
            system_role: 'system_admin',
            admin_type: 'engineering',
            onboarding_completed: 1,
            availability_status: 'online'
        });

        await table.Post.create({
            title: 'Welcome',
            content: 'Hello world',
            author_id: 1
        });

        await table.Post.create({
            title: 'Getting Started',
            content: 'Tutorial',
            author_id: 1
        });
    }

    async down(table) {
        this.init(table);
        // Seed data typically not removed in down migrations
    }
}

Benefits of ORM-Based Seeding

  1. Lifecycle Hooks: Triggers beforeSave and afterSave hooks
  2. Validation: Uses entity field definitions and validators
  3. Type Safety: Ensures fields match entity schema
  4. Maintainable: Changes to entity structure reflected automatically

Manual Seed Methods (Advanced)

For more control, use raw SQL seed methods directly in migrations:

class Migration_20250205_123456 extends masterrecord.schema {
    async up(table) {
        this.init(table);

        // Single record with raw SQL
        this.seed('User', {
            user_name: 'admin',
            email: '[email protected]'
        });

        // Bulk insert with raw SQL (more performant for large datasets)
        this.bulkSeed('Category', [
            { name: 'Technology', slug: 'tech' },
            { name: 'Business', slug: 'biz' },
            { name: 'Science', slug: 'science' }
        ]);
    }
}

When to use manual seed methods:

  • Large datasets (1000+ records) - bulkSeed() is more performant
  • Need raw SQL control
  • Don't need lifecycle hooks or validation

Idempotency

ORM approach (context-level seed):

  • Generates plain create() calls
  • Fails if primary key exists (user must remove seed data after first migration)
  • Best for one-time initial setup data

Manual approach (idempotent):

  • Uses database-specific INSERT OR IGNORE syntax
  • SQLite: INSERT OR IGNORE INTO
  • MySQL: INSERT IGNORE INTO
  • PostgreSQL: INSERT ... ON CONFLICT DO NOTHING
  • Best for repeatable migrations and re-seeding

Example:

// Context-level (runs once)
this.dbset(User).seed({ id: 1, name: 'admin' });
// After first migration, remove or comment out seed data

// Manual (repeatable)
class Migration_xyz extends masterrecord.schema {
    async up(table) {
        this.init(table);
        // Can run multiple times without error
        this.seed('User', { id: 1, name: 'admin' });
    }
}

Best Practices

  1. Use context-level seed for one-time initial setup (admin users, default categories)
    • Remove seed data from context after first successful migration
    • Or comment out after initial setup
  2. Use manual seed methods for repeatable/idempotent seeding
  3. Use manual bulkSeed for large datasets (1000+ records) - more performant
  4. Keep seed data minimal - only essential bootstrap data
  5. Use fixtures/factories for test data, not seed methods
  6. Don't delete seed data in down migrations (can cause referential integrity issues)

Example: Multi-Tenant Seed Data

class AppContext extends context {
    onConfig() {
        this.dbset(User);
        this.dbset(Tenant);
        this.dbset(Permission);

        // Seed default tenant
        this.dbset(Tenant).seed({
            name: 'Default Organization',
            slug: 'default',
            is_active: 1
        });

        // Seed system admin
        this.dbset(User).seed({
            email: '[email protected]',
            tenant_id: 1,
            role: 'system_admin'
        });

        // Seed default permissions
        this.dbset(Permission).seed([
            { name: 'users.read', description: 'Read users' },
            { name: 'users.write', description: 'Create/update users' },
            { name: 'users.delete', description: 'Delete users' }
        ]);
    }
}

Advanced Seed Data Features

MasterRecord provides 5 enterprise-grade seed data enhancements for production-ready data management:

1. Down Migrations - Automatic Rollback

Enable automatic cleanup of seed data in down migrations:

class AppContext extends context {
    onConfig() {
        // Enable down migration generation
        this.seedConfig({
            generateDownMigrations: true,  // Default: false
            downStrategy: 'delete',        // 'delete' | 'skip'
            onRollbackError: 'warn'        // 'warn' | 'throw' | 'ignore'
        });

        this.dbset(User).seed({ id: 1, name: 'admin', email: '[email protected]' });
    }
}

Generated Migration:

async up(table) {
    this.init(table);
    await table.User.create({ id: 1, name: 'admin', email: '[email protected]' });
}

async down(table) {
    this.init(table);
    // Auto-generated rollback (reverse order for FK safety)
    try {
        const record = await table.User.findById(1);
        if (record) await record.delete();
    } catch (e) {
        console.warn('Seed rollback: User id=1 not found');
    }
}

Use Cases:

  • Development environments where you frequently rollback migrations
  • Testing scenarios requiring clean database state
  • Staged deployments where rollback may be necessary

Note: Production environments typically don't rollback seed data due to referential integrity concerns.


2. Conditional Seeding - Environment-Based Data

Seed different data based on environment:

class AppContext extends context {
    onConfig() {
        // Development/test only seed data
        this.dbset(User)
            .seed({ name: 'Test User', email: '[email protected]' })
            .when('development', 'test');

        // Production-only seed data
        this.dbset(Config)
            .seed({ key: 'api_endpoint', value: 'https://api.production.com' })
            .when('production');

        // Multiple environments
        this.dbset(Feature)
            .seed({ name: 'beta_feature', enabled: true })
            .when('staging', 'production');
    }
}

How It Works:

  • Migration code is filtered at generation time (not runtime)
  • Only seed data matching current environment is included in migration
  • Cleaner migrations, no runtime overhead

Environment Detection:

  • Uses process.env.NODE_ENV or process.env.master
  • Defaults to 'development' if not set
  • Supports multiple environments per seed

3. Automatic Dependency Ordering

Seeds are automatically ordered based on foreign key relationships:

class AppContext extends context {
    onConfig() {
        // Order doesn't matter - automatically sorted!
        this.dbset(Post).seed({
            title: 'Welcome',
            user_id: 1  // Foreign key to User
        });

        this.dbset(