
mongoosql-core

v0.1.5

Mongoose-style ORM that runs on MongoDB, PostgreSQL, and MySQL

mongoosql-core

Write Mongoose-style queries. Run them on MongoDB, PostgreSQL, or MySQL — without changing a line of query code.

// Define once
const User = db.model<IUser>('users', UserSchema);

// Works identically on all three engines
const users = await User
  .find({ age: { $gte: 18 }, role: { $ne: 'banned' } })
  .populate('orgId')
  .sort({ createdAt: -1 })
  .limit(20);


Why mongoosql-core

Most ORMs lock you into one database. Mongoose is great but MongoDB-only. Prisma requires a schema file and its own DSL. Sequelize is SQL-only with a very different API.

mongoosql-core gives you the Mongoose API you already know — find, populate, $gt, $or, $set, $inc — and compiles it to native MongoDB aggregation pipelines or parameterised SQL LEFT JOIN queries depending on which engine you connect to. Switching databases is one config change.


Installation

Install mongoosql-core and only the driver(s) you need:

# MongoDB
npm install mongoosql-core mongodb

# PostgreSQL
npm install mongoosql-core pg

# MySQL
npm install mongoosql-core mysql2

Quick Start

import { createConnection, Schema } from 'mongoosql-core';

interface IUser {
  _id?: string;
  name: string;
  email: string;
  age: number;
  role: string;
}

const UserSchema = new Schema<IUser>({
  name:  { type: 'string', required: true },
  email: { type: 'string', required: true, unique: true },
  age:   { type: 'number', default: 0 },
  role:  { type: 'string', default: 'user' },
});

const db = await createConnection({
  engine:   'mongo',           // 'mongo' | 'postgres' | 'mysql'
  uri:      'mongodb://localhost:27017',
  database: 'myapp',
});

const User = db.model<IUser>('users', UserSchema);
await db.syncSchemas();

await User.insertOne({ name: 'Alice', email: '[email protected]', age: 28 });

const admins = await User.find({ role: 'admin' }).sort({ name: 1 });

Connecting

MongoDB

const db = await createConnection({
  engine:   'mongo',
  uri:      'mongodb://localhost:27017',
  database: 'myapp',
});

PostgreSQL

const db = await createConnection({
  engine:   'postgres',
  host:     'localhost',
  port:     5432,
  user:     'postgres',
  password: 'secret',
  database: 'myapp',
  ssl:      true,           // optional
});

MySQL

const db = await createConnection({
  engine:   'mysql',
  host:     'localhost',
  port:     3306,
  user:     'root',
  password: 'secret',
  database: 'myapp',
  ssl:      false,          // optional
});

Disconnecting

await db.disconnect();

Defining Schemas

Schemas define the shape and constraints of your data. They are identical regardless of which engine you use.

import { Schema } from 'mongoosql-core';

interface IPost {
  _id?: string;
  title: string;
  body: string;
  authorId: string;
  tags: string[];
  published: boolean;
  views: number;
}

const PostSchema = new Schema<IPost>({
  title:     { type: 'string',   required: true },
  body:      { type: 'string',   required: true },
  authorId:  { type: 'objectId', ref: 'users' },  // enables populate()
  tags:      { type: 'array',    items: 'string' },
  published: { type: 'boolean',  default: false },
  views:     { type: 'number',   default: 0 },
});

Field Options

| Option     | Type      | Description                                              |
|------------|-----------|----------------------------------------------------------|
| type       | string    | Field type — see Schema Field Types                      |
| required   | boolean   | Reject inserts missing this field                        |
| unique     | boolean   | Add a unique constraint                                  |
| default    | any       | Default value or a function returning a value            |
| ref        | string    | Collection/table name to reference — enables populate()  |
| items      | string    | Element type for array fields                            |
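A `default` can be a literal or a function evaluated at insert time. As a sketch of how such a default could be resolved — the helper `resolveDefault` is hypothetical, not part of the library's API:

```typescript
// Hypothetical sketch: a schema `default` may be a literal value or a
// factory function that is invoked at insert time.
type DefaultSpec<T> = T | (() => T);

function resolveDefault<T>(def: DefaultSpec<T>): T {
  return typeof def === 'function' ? (def as () => T)() : def;
}

resolveDefault(0);             // 0
resolveDefault(() => 'user');  // 'user'
```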


Registering Models

const User = db.model<IUser>('users', UserSchema);
const Post = db.model<IPost>('posts', PostSchema);
const Org  = db.model<IOrg>('orgs',  OrgSchema);

Syncing Schemas

// Creates tables (SQL) or collections (MongoDB) if they do not exist
await db.syncSchemas();

Creating Documents

// insertOne — persists immediately
const user = await User.insertOne({
  name:  'Alice',
  email: '[email protected]',
  age:   28,
});

// insertMany
const users = await User.insertMany([
  { name: 'Bob',   email: '[email protected]',   age: 22 },
  { name: 'Carol', email: '[email protected]', age: 31 },
]);

// create() + save() — build in memory, then persist
const doc = User.create({ name: 'Dave', email: '[email protected]', age: 25 });
doc.set('role', 'editor');
await doc.save();    // INSERT

doc.set('age', 26);
await doc.save();    // UPDATE (document already has _id)

Querying

find / findOne / findById

const admins  = await User.find({ role: 'admin' });
const user    = await User.findOne({ email: '[email protected]' });
const byId    = await User.findById('64a1f3...');

Filter Operators

| Operator   | SQL equivalent    | Example                                              |
|------------|-------------------|------------------------------------------------------|
| $eq        | = value           | { age: { $eq: 30 } }                                 |
| $ne        | != value          | { role: { $ne: 'banned' } }                          |
| $gt        | > value           | { age: { $gt: 18 } }                                 |
| $gte       | >= value          | { age: { $gte: 18 } }                                |
| $lt        | < value           | { age: { $lt: 65 } }                                 |
| $lte       | <= value          | { price: { $lte: 100 } }                             |
| $in        | IN (...)          | { role: { $in: ['admin', 'editor'] } }               |
| $nin       | NOT IN (...)      | { status: { $nin: ['draft', 'deleted'] } }           |
| $like      | LIKE '%val%'      | { name: { $like: '%ali%' } }                         |
| $exists    | IS [NOT] NULL     | { deletedAt: { $exists: false } }                    |
| $or        | ... OR ...        | { $or: [{ role: 'admin' }, { age: { $gte: 30 } }] }  |
| $and       | ... AND ...       | { $and: [{ active: true }, { verified: true }] }     |

On MongoDB, $like is transparently translated to a $regex — write the same filter on all engines.
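The helper below is a hypothetical sketch of that translation (it is not part of the public API): `%` matches any run of characters and `_` matches a single character, mirroring SQL LIKE semantics.

```typescript
// Hypothetical sketch of the $like -> $regex translation used on MongoDB.
// '%' becomes '.*', '_' becomes '.', everything else is escaped literally.
function likeToRegExp(pattern: string): RegExp {
  const escaped = pattern.replace(/[.*+?^${}()|[\]\\]/g, '\\$&'); // escape regex metachars
  const source = escaped.replace(/%/g, '.*').replace(/_/g, '.');
  return new RegExp(`^${source}$`, 'i'); // case-insensitive, like most LIKE collations
}

likeToRegExp('%ali%').test('Alice');  // true
likeToRegExp('a_ice').test('Alice');  // true ('_' matches the 'l')
likeToRegExp('%ali%').test('Bob');    // false
```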

Sorting, Pagination, Projection

const users = await User
  .find({ role: { $in: ['admin', 'editor'] } })
  .sort({ createdAt: -1, name: 1 })
  .skip(20)
  .limit(10)
  .select('name', 'email', 'role');

Updating

await User.updateOne(
  { email: '[email protected]' },
  { $set: { role: 'superadmin' } }
);

await Post.updateMany(
  { published: false },
  { $set: { published: true } }
);

await User.findByIdAndUpdate('64a1f3...', { $set: { age: 29 } });

Update Operators

| Operator | Description               | Example                                 |
|----------|---------------------------|-----------------------------------------|
| $set     | Set field values          | { $set: { role: 'admin' } }             |
| $unset   | Remove / nullify fields   | { $unset: { resetToken: 1 } }           |
| $inc     | Increment a numeric field | { $inc: { views: 1, score: -5 } }       |

Plain objects without operators are treated as $set:

await User.updateOne({ _id: id }, { role: 'editor', age: 26 });
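The rule is simple: if no top-level key starts with `$`, the whole object is wrapped in `$set`. A hypothetical sketch of that normalisation (not the library's actual internals):

```typescript
// Hypothetical sketch of update normalisation: a plain object becomes
// { $set: object }; objects that already use operators pass through.
type UpdateDoc = Record<string, unknown>;

function normalizeUpdate(update: UpdateDoc): UpdateDoc {
  const hasOperator = Object.keys(update).some((k) => k.startsWith('$'));
  return hasOperator ? update : { $set: update };
}

normalizeUpdate({ role: 'editor', age: 26 }); // { $set: { role: 'editor', age: 26 } }
normalizeUpdate({ $inc: { views: 1 } });      // unchanged
```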

Deleting

await User.deleteOne({ email: '[email protected]' });
await Post.deleteMany({ published: false });
await User.findByIdAndDelete('64a1f3...');

const user = await User.findById(id);
await user.remove();

Counting

const total       = await User.countDocuments();
const admins      = await User.countDocuments({ role: 'admin' });
const recentPosts = await Post.countDocuments({
  createdAt: { $gte: new Date('2024-01-01') },
});

Relations — populate() and join()

populate()

Define the relationship in the schema with ref, then call .populate() on any query:

const PostSchema = new Schema<IPost>({
  authorId: { type: 'objectId', ref: 'users' },
});

// All fields
const posts = await Post.find({ published: true }).populate('authorId');

// Specific fields only
const posts = await Post.find({}).populate('authorId', ['name', 'email']);

// Multiple
const users = await User.find({ role: 'admin' })
  .populate('orgId')
  .populate('managerId', ['name']);

Under the hood: MongoDB uses $lookup aggregation; SQL compiles to LEFT JOIN. The result shape is identical on all engines:

console.log(posts[0].authorId.name);  // 'Alice'

Raw join()

For full control — multiple joins, custom conditions, many-to-many:

const posts = await Post
  .find({ published: true })
  .join({
    from:         'users',
    localField:   'authorId',
    foreignField: '_id',
    as:           'author',
    single:       true,         // true = object, false = array
    select:       ['name', 'email', 'role'],
  })
  .sort({ views: -1 })
  .limit(10);

console.log(posts[0].author.name); // 'Alice'

Raw Queries

SQL (PostgreSQL — $1, $2 placeholders)

const { rows } = await db.raw<{ name: string; total: number }>(
  `SELECT u.name, SUM(o.amount) as total
   FROM users u JOIN orders o ON o.user_id = u._id
   GROUP BY u.name ORDER BY total DESC`
);

const { rowCount } = await db.raw(
  'UPDATE subscriptions SET status = $1 WHERE expires_at < $2',
  ['expired', new Date()]
);

SQL (MySQL — ? placeholders)

const { rows } = await db.raw(
  'SELECT * FROM users WHERE age > ? AND role = ?',
  [18, 'admin']
);

MongoDB (command document)

const { rows } = await db.raw({
  aggregate: 'orders',
  pipeline: [
    { $match: { status: 'completed' } },
    { $group: { _id: '$userId', total: { $sum: '$amount' } } },
  ],
  cursor: {},
});

RawResult type

interface RawResult<T = Record<string, any>> {
  rows:     T[];
  rowCount: number;
  fields?:  string[];
}

Query Builder — Chaining

Queries are lazy — nothing executes until you await or call .exec():

let q = User.find({ active: true });

if (searchTerm) q = q.where({ name: { $like: `%${searchTerm}%` } });
if (role)       q = q.where({ role });

q = q.sort({ createdAt: -1 }).limit(pageSize).skip(page * pageSize);

const users = await q;

| Method              | Description                                      |
|---------------------|--------------------------------------------------|
| .where(filter)      | Merge additional filter conditions               |
| .sort(by)           | { field: 1 \| -1 \| 'asc' \| 'desc' }            |
| .limit(n)           | Maximum number of results                        |
| .skip(n)            | Results to skip (pagination)                     |
| .select(...fields)  | Return only specified fields                     |
| .populate(path)     | Populate a referenced field                      |
| .join(spec)         | Raw join specification                           |
| .exec()             | Explicitly execute the query                     |
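Since `.sort()` accepts both numeric and string directions, they presumably get normalised to `1 | -1` internally; a hypothetical sketch of that step:

```typescript
// Hypothetical sketch: .sort() accepts 1 | -1 | 'asc' | 'desc' per field
// and normalises every direction to 1 or -1.
type SortDir = 1 | -1 | 'asc' | 'desc';

function normalizeSort(by: Record<string, SortDir>): Record<string, 1 | -1> {
  const out: Record<string, 1 | -1> = {};
  for (const [field, dir] of Object.entries(by)) {
    out[field] = dir === 'asc' || dir === 1 ? 1 : -1;
  }
  return out;
}

normalizeSort({ createdAt: 'desc', name: 'asc' }); // { createdAt: -1, name: 1 }
```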


Migrations

mongoosql-core ships with a first-class migration system. State is tracked in two places: a _migrations table/collection in your database (the primary source of truth) and a .mongoosql-migrations.json lockfile in your project root (a version-control-friendly backup).

Config file

Create mongoosql.config.js in your project root — the CLI reads this to open a connection:

// mongoosql.config.js
const { createConnection } = require('mongoosql-core');

exports.createConnection = () => createConnection({
  engine:   'postgres',
  host:     process.env.DB_HOST     ?? 'localhost',
  user:     process.env.DB_USER     ?? 'postgres',
  password: process.env.DB_PASSWORD ?? 'secret',
  database: process.env.DB_NAME     ?? 'myapp',
});

CLI

# Scaffold a new migration file
mongoosql migrate generate "add status to users"
# → creates migrations/001-add-status-to-users.ts

# Run all pending migrations
mongoosql migrate up

# Run up to a specific migration (inclusive)
mongoosql migrate up --to 003-add-index

# Roll back the last batch
mongoosql migrate down

# Roll back a specific number of migrations
mongoosql migrate down --steps 2

# Show which migrations have been applied
mongoosql migrate status

CLI options

| Flag                  | Description                                      | Default                          |
|-----------------------|--------------------------------------------------|----------------------------------|
| --config <path>       | Path to config file                              | mongoosql.config.js              |
| --dir <path>          | Migrations directory                             | ./migrations                     |
| --to <name>           | Run up only to this migration name               | —                                |
| --steps <n>           | Number of migrations to roll back                | entire last batch                |
| --tracking <name>     | Collection/table used to track state             | _migrations                      |
| --lockfile <path>     | Path to the lockfile                             | .mongoosql-migrations.json       |

Programmatic API

const db = await createConnection({ ... });

await db.migrate.up();                         // run all pending
await db.migrate.up({ to: '003-add-index' });  // run up to a name
await db.migrate.down();                       // roll back last batch
await db.migrate.down({ steps: 2 });           // roll back 2
const statuses = await db.migrate.status();    // print + return status
const path = db.migrate.generate('add status to users'); // scaffold file

MigrateOptions — accepted by all programmatic methods:

interface MigrateOptions {
  migrationsDir?:      string;  // default: './migrations'
  trackingCollection?: string;  // default: '_migrations'
  lockfilePath?:       string;  // default: '.mongoosql-migrations.json'
  to?:                 string;  // run up to this migration name
  steps?:              number;  // migrations to roll back
}

Migration files

mongoosql migrate generate "add status to users" creates a zero-padded, sequenced file:

migrations/
  001-add-status-to-users.ts
  002-add-index-to-email.ts
  003-create-subscriptions-table.ts
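The naming scheme pairs a zero-padded sequence number with a kebab-case slug of the description. A hypothetical sketch — the CLI's exact slug rules are not documented, so treat this as an approximation:

```typescript
// Hypothetical sketch of the migration filename scheme: next sequence
// number, zero-padded to 3 digits, plus a kebab-case slug of the description.
function migrationFilename(seq: number, description: string): string {
  const slug = description
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, '-')  // non-alphanumeric runs become hyphens
    .replace(/^-+|-+$/g, '');     // trim leading/trailing hyphens
  return `${String(seq).padStart(3, '0')}-${slug}.ts`;
}

migrationFilename(1, 'add status to users'); // '001-add-status-to-users.ts'
```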

Each file exports up() and down():

// migrations/001-add-status-to-users.ts
import { MigrationContext } from 'mongoosql-core';

export const description = 'add status to users';

export async function up(ctx: MigrationContext): Promise<void> {
  await ctx.addField('users', 'status', { type: 'string', default: 'active' });
  await ctx.addIndex('users', ['status']);
}

export async function down(ctx: MigrationContext): Promise<void> {
  await ctx.dropIndex('users', 'idx_users_status');
  await ctx.removeField('users', 'status');
}

down() must be the exact inverse of up() so rollbacks are safe.

Schema helpers

Every migration receives a MigrationContext with helpers that work across all three engines.

Fields

// Add a field / column (on MongoDB, backfills the default value on existing docs)
await ctx.addField('users', 'status', { type: 'string', default: 'active' });

// Remove a field / column
await ctx.removeField('users', 'legacyToken');

// Rename a field / column
await ctx.renameField('users', 'fullName', 'name');

// Change a column's type (SQL only — no-op on MongoDB)
await ctx.changeFieldType('orders', 'amount', { type: 'number' });

Collections / Tables

// Create a new collection or table with schema helpers
await ctx.createCollection('subscriptions', {
  userId:    { type: 'objectId', required: true, ref: 'users' },
  plan:      { type: 'string',   required: true },
  expiresAt: { type: 'date' },
  active:    { type: 'boolean',  default: true },
});

// Drop a collection or table entirely
await ctx.dropCollection('legacy_sessions');

Indexes

// Single-field unique index
await ctx.addIndex('users', ['email'], { unique: true });

// Compound index
await ctx.addIndex('posts', ['authorId', 'createdAt']);

// Custom index name
await ctx.addIndex('orders', ['status', 'userId'], {
  name: 'idx_orders_status_user',
});

// Drop by name
await ctx.dropIndex('users', 'idx_users_email');

Raw escape hatch

// SQL
await ctx.raw('ALTER TABLE "orders" ADD COLUMN "shippedAt" TIMESTAMP');

// SQL with parameters
await ctx.raw(
  'UPDATE "users" SET "role" = $1 WHERE "createdAt" < $2',
  ['legacy', new Date('2022-01-01')]
);

// MongoDB command
await ctx.raw({
  update:  'users',
  updates: [{ q: {}, u: { $set: { verified: false } }, multi: true }],
});

State tracking

Database — a _migrations table/collection is created automatically on first run:

| Column      | Description                                                   |
|-------------|---------------------------------------------------------------|
| name        | Migration filename without extension                          |
| batch       | Group number — all migrations in one up call share a batch    |
| appliedAt   | Timestamp when the migration was applied                      |

Rolling back a migration removes its record. The table name can be changed with --tracking.

Lockfile — .mongoosql-migrations.json is written after every up and down. Commit it to version control so your team can see migration state without hitting the database:

{
  "generated": "2024-11-15T10:30:00.000Z",
  "migrations": [
    { "name": "001-add-status-to-users", "batch": 1, "appliedAt": "2024-11-15T10:00:00.000Z" },
    { "name": "002-add-index-to-email",  "batch": 1, "appliedAt": "2024-11-15T10:00:01.000Z" }
  ]
}
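Because every migration applied in one up call shares a batch number, "roll back the last batch" can be computed directly from this structure. A hypothetical sketch (not the library's actual code):

```typescript
// Hypothetical sketch: given the lockfile's applied migrations, select the
// ones a plain `migrate down` (last batch) would roll back, newest first.
interface AppliedMigration { name: string; batch: number; appliedAt: string; }

function lastBatch(migrations: AppliedMigration[]): AppliedMigration[] {
  if (migrations.length === 0) return [];
  const max = Math.max(...migrations.map((m) => m.batch));
  return migrations
    .filter((m) => m.batch === max)
    .sort((a, b) => b.appliedAt.localeCompare(a.appliedAt)); // newest rolled back first
}
```

Applied to the lockfile above, it would return both batch-1 entries, 002-add-index-to-email first.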

Schema Field Types

| Type       | MongoDB           | PostgreSQL | MySQL        |
|------------|-------------------|------------|--------------|
| string     | String            | TEXT       | TEXT         |
| number     | Number            | NUMERIC    | NUMERIC      |
| boolean    | Boolean           | BOOLEAN    | TINYINT(1)   |
| date       | Date              | TIMESTAMP  | TIMESTAMP    |
| objectId   | ObjectId / String | TEXT       | VARCHAR(255) |
| array      | Array             | JSONB      | JSON         |
| object     | Object            | JSONB      | JSON         |
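Expressed as code, the SQL side of this mapping is a simple lookup; a hypothetical sketch (MongoDB is omitted because it stores native BSON types rather than declared column types):

```typescript
// Hypothetical lookup mirroring the type-mapping table above.
type SqlDialect = 'postgres' | 'mysql';

const COLUMN_TYPES: Record<string, Record<SqlDialect, string>> = {
  string:   { postgres: 'TEXT',      mysql: 'TEXT' },
  number:   { postgres: 'NUMERIC',   mysql: 'NUMERIC' },
  boolean:  { postgres: 'BOOLEAN',   mysql: 'TINYINT(1)' },
  date:     { postgres: 'TIMESTAMP', mysql: 'TIMESTAMP' },
  objectId: { postgres: 'TEXT',      mysql: 'VARCHAR(255)' },
  array:    { postgres: 'JSONB',     mysql: 'JSON' },
  object:   { postgres: 'JSONB',     mysql: 'JSON' },
};

function columnType(fieldType: string, dialect: SqlDialect): string {
  return COLUMN_TYPES[fieldType][dialect];
}

columnType('boolean', 'mysql'); // 'TINYINT(1)'
```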


API Reference

createConnection(config)

Creates and opens a database connection. Returns a Connection instance.

Connection

| Method / Property               | Description                                            |
|---------------------------------|--------------------------------------------------------|
| model<T>(name, schema)          | Register a model                                       |
| syncSchemas()                   | Create missing tables / collections                    |
| raw<T>(query, params?)          | Execute a raw query                                    |
| migrate                         | Access the migration API                               |
| disconnect()                    | Close the connection                                   |
| dialect                         | 'mongo' \| 'postgres' \| 'mysql'                       |

db.migrate

| Method                   | Description                              |
|--------------------------|------------------------------------------|
| up(options?)             | Run all pending migrations               |
| down(options?)           | Roll back the last batch                 |
| status(options?)         | Print and return migration status        |
| generate(name, dir?)     | Scaffold a new migration file            |

MigrationContext

| Method                                       | Description                               |
|----------------------------------------------|-------------------------------------------|
| addField(collection, field, def)             | Add a column / field                      |
| removeField(collection, field)               | Remove a column / field                   |
| renameField(collection, oldName, newName)    | Rename a column / field                   |
| changeFieldType(collection, field, def)      | Change a column's type (SQL only)         |
| createCollection(collection, fields)         | Create a new table / collection           |
| dropCollection(collection)                   | Drop a table / collection                 |
| addIndex(collection, fields, options?)       | Add an index                              |
| dropIndex(collection, indexName)             | Drop an index by name                     |
| raw(query, params?)                          | Execute a raw query                       |
| dialect                                      | 'mongo' \| 'postgres' \| 'mysql'          |

Model<T>

| Method                          | Returns                 |
|---------------------------------|-------------------------|
| insertOne(data)                 | Document<T>             |
| insertMany(data[])              | Document<T>[]           |
| create(data)                    | Document<T> (unsaved)   |
| find(filter?)                   | Query<T>                |
| findOne(filter?)                | Query<T>                |
| findById(id)                    | Document<T> \| null     |
| updateOne(filter, update)       | Query<T>                |
| updateMany(filter, update)      | Query<T>                |
| findByIdAndUpdate(id, update)   | UpdateResult            |
| deleteOne(filter)               | Query<T>                |
| deleteMany(filter)              | Query<T>                |
| findByIdAndDelete(id)           | DeleteResult            |
| countDocuments(filter?)         | Query<T>                |
| syncSchema()                    | void                    |

Document<T>

| Method / Property | Description                          |
|-------------------|--------------------------------------|
| _id               | The document's id                    |
| get(key)          | Get a field value                    |
| set(key, value)   | Set a field value (chainable)        |
| save()            | INSERT if new, UPDATE if persisted   |
| remove()          | Delete this document                 |
| toObject()        | Plain object copy                    |
| toJSON()          | Plain object copy (for serialisation)|


TypeScript

mongoosql-core is written in TypeScript and ships with full type declarations. Generic type parameters flow through the entire query chain:

interface IUser {
  _id?: string;
  name: string;
  email: string;
  age: number;
  role: 'user' | 'admin' | 'editor';
}

const User = db.model<IUser>('users', UserSchema);

const users = await User.find({ role: 'admin' });  // ✓
await User.find({ foo: 'bar' });                   // ✗ TS error

users[0].name; // string ✓
users[0].foo;  // ✗ TS error

// raw() supports generics
const { rows } = await db.raw<{ total: number }>(
  'SELECT SUM(amount) as total FROM orders'
);
rows[0].total; // number ✓

// MigrationContext is fully typed
export async function up(ctx: MigrationContext): Promise<void> {
  await ctx.addField('users', 'status', { type: 'string' });   // ✓
  await ctx.addField('users', 'status', { type: 'invalid' });  // ✗ TS error
}

Engine Differences

| Feature               | MongoDB                    | PostgreSQL                       | MySQL                            |
|-----------------------|----------------------------|----------------------------------|----------------------------------|
| _id type              | ObjectId → string          | UUID (gen_random_uuid)           | UUID()                           |
| populate()            | $lookup pipeline           | LEFT JOIN                        | LEFT JOIN                        |
| array fields          | Native arrays              | JSONB                            | JSON                             |
| object fields         | Native objects             | JSONB                            | JSON                             |
| $like                 | Translated to $regex       | LIKE                             | LIKE                             |
| raw() input           | Command document           | SQL string ($1)                  | SQL string (?)                   |
| Schema sync           | Creates collection         | CREATE TABLE IF NOT EXISTS       | CREATE TABLE IF NOT EXISTS       |
| createdAt             | Manual                     | DEFAULT NOW()                    | DEFAULT NOW()                    |
| updatedAt             | Manual                     | DEFAULT NOW()                    | ON UPDATE NOW()                  |
| addField migration    | Backfills default value    | ALTER TABLE ADD COLUMN           | ALTER TABLE ADD COLUMN           |
| removeField           | $unset all documents       | ALTER TABLE DROP COLUMN          | ALTER TABLE DROP COLUMN          |
| renameField           | $rename all documents      | ALTER TABLE RENAME COLUMN        | ALTER TABLE CHANGE COLUMN        |
| addIndex              | createIndexes command      | CREATE INDEX IF NOT EXISTS       | CREATE INDEX IF NOT EXISTS       |


License

This project is licensed under the MIT License.

You are free to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of this software.

The software is provided “as is”, without warranty of any kind. The authors are not liable for any issues, damages, or losses arising from its use.