@m2k-5f/pgtx

v1.2.0

Lightweight, high-performance SQL toolkit for node-postgres

Pgtx 🚀

A lightweight, high-performance SQL query builder for node-postgres (pg).
Experience ORM-like convenience (auto-inserts, updates, recursive fragments) with the transparency and speed of raw SQL.


🔥 Why Pgtx?

  • Zero-Cost Abstraction: Only ~0–2% overhead compared to raw pg.query.
  • Structural Caching: Uses WeakMap to cache SQL templates. Static parts are parsed only once.
  • Explicit Prepared Statements: Create and reuse prepared statements with type safety.
  • True Recursion: Nest sql.fragment anywhere. Argument numbering ($1, $2) is managed automatically across all nesting levels.
  • No Magic: You write SQL, Pgtx handles the tedious parts (placeholders, identifiers, bulk inserts).
  • ACID Transactions: Reliable transaction management with automatic rollback on errors.
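
The structural-caching point leans on a JavaScript guarantee: a tagged template's strings array is the same object on every call from the same call site. A minimal sketch of the idea, using a simplified `sql` tag (illustrative only, not Pgtx's actual implementation):

```typescript
// Structural caching sketch: the TemplateStringsArray is identical across calls
// from one call site, so a WeakMap keyed on it parses the static parts once.
const cache = new WeakMap<TemplateStringsArray, string>();
let parses = 0;

function sql(strings: TemplateStringsArray, ...values: unknown[]) {
  let text = cache.get(strings);
  if (text === undefined) {
    parses++; // only reached once per call site
    text = strings.reduce((acc, s, i) => acc + `$${i}` + s);
    cache.set(strings, text);
  }
  return { text, values };
}

const byId = (id: number) => sql`SELECT * FROM users WHERE id = ${id}`;
byId(1); byId(2); byId(3);
// parses === 1: the skeleton was built once and reused for every call
```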

📦 Installation

npm install @m2k-5f/pgtx
# or
yarn add @m2k-5f/pgtx
# or
pnpm add @m2k-5f/pgtx

🚀 Quick Start

import { sql, Pool } from "@m2k-5f/pgtx";

const pool = new Pool({ /* pg.PoolConfig */ })

// Type-safe query with automatic placeholder ($1)
const [user] = await pool.query<User>`SELECT * FROM users WHERE id = ${1}`

⚡ Performance Benchmark

The following results were measured during sequential execution of 10,000 complex UPSERT queries.

| Tool | RPS | Avg. Query Time | Performance Overhead |
| :--- | :---: | :---: | :---: |
| Native pg.query | ~176 | 5.696 ms | 0% (Baseline) |
| Pgtx | ~177 | 5.664 ms | -0.56% (zero-cost) |
| Typical Node.js ORM | ~58 | 12.5+ ms | > 150% |

📖 Feature Guide

1. Recursive Fragments (sql.fragment)

Combine multiple SQL pieces. Perfect for dynamic filters or subqueries.

const filter = sql.fragment`status = ${'active'} AND age > ${21}`
const roleSub = sql.fragment`(SELECT id FROM roles WHERE name = ${'admin'})`

await pool.query`
  UPDATE users
  SET role_id = ${roleSub}
  WHERE ${filter}
`
// SQL: UPDATE users SET role_id = (SELECT id FROM roles WHERE name = $1) WHERE status = $2 AND age > $3
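
The cross-level renumbering could work roughly like this: a fragment stores its strings and values, and flattening walks the tree, assigning $n in encounter order (an illustrative sketch, not Pgtx internals):

```typescript
// Fragment flattening sketch: nested fragments share one params array,
// so placeholder numbering stays consistent across all nesting levels.
type Fragment = { strings: readonly string[]; values: unknown[]; __frag: true };

const fragment = (strings: TemplateStringsArray, ...values: unknown[]): Fragment =>
  ({ strings: [...strings], values, __frag: true });

function flatten(frag: Fragment, params: unknown[] = []): { text: string; params: unknown[] } {
  let text = frag.strings[0];
  frag.values.forEach((v, i) => {
    if (typeof v === "object" && v !== null && (v as Fragment).__frag) {
      text += flatten(v as Fragment, params).text; // nested fragment: recurse, sharing params
    } else {
      params.push(v);
      text += `$${params.length}`; // plain value: next placeholder index
    }
    text += frag.strings[i + 1];
  });
  return { text, params };
}

const inner = fragment`age > ${21}`;
const outer = fragment`status = ${"active"} AND ${inner}`;
flatten(outer);
// → { text: "status = $1 AND age > $2", params: ["active", 21] }
```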

2. Transactions

Automatic BEGIN, COMMIT, and ROLLBACK. Use savepoint for nested logic.

await pool.begin(async (tx) => {
    await tx.query`UPDATE accounts SET balance = balance - 100 WHERE id = 1`;
    
    // Nested transaction (Savepoint)
    let err = await tx.savepoint('inventory', async (stx) => {
        await stx.query`UPDATE stock SET count = count - 1 WHERE item_id = ${42}`;
        if (outOfStock) throw new Error(); // Only 'inventory' rolls back
    });
    if (err) {
      // Handle error
      // tx is still active and can be committed
    }
});
// Main transaction commits or rolls back based on callback success
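
The statement sequence behind such a wrapper can be sketched with a mock client (synchronous here for clarity; the real API is promise-based, and the command names follow standard Postgres transaction syntax):

```typescript
// Transaction/savepoint sketch: log the SQL a begin/savepoint wrapper issues.
const log: string[] = [];
const exec = (sql: string) => { log.push(sql); };

function begin(fn: () => void): void {
  exec("BEGIN");
  try { fn(); exec("COMMIT"); }
  catch { exec("ROLLBACK"); } // any uncaught error rolls back the whole transaction
}

function savepoint(name: string, fn: () => void): Error | undefined {
  exec(`SAVEPOINT ${name}`);
  try { fn(); exec(`RELEASE SAVEPOINT ${name}`); return undefined; }
  catch (e) { exec(`ROLLBACK TO SAVEPOINT ${name}`); return e as Error; }
}

begin(() => {
  exec("UPDATE accounts SET balance = balance - 100 WHERE id = 1");
  const err = savepoint("inventory", () => { throw new Error("out of stock"); });
  // err is set, yet the outer transaction stays live and commits below
});
// log ends with: ... ROLLBACK TO SAVEPOINT inventory, COMMIT
```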

3. Bulk Inserts (sql.insert)

Automatically extracts columns from the first object. Supports single objects and arrays.

const users = [
  { name: 'Alice', email: '[email protected]' },
  { name: 'Bob', email: '[email protected]' }
]

await pool.query`INSERT INTO users ${sql.insert(users)}`
// SQL: INSERT INTO users (name, email) VALUES ($1, $2), ($3, $4)
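
A bulk-insert helper along these lines could derive the column list from the first object and emit one placeholder per value (a sketch; the real implementation also quotes identifiers and handles DEFAULT, see the null/undefined section):

```typescript
// Bulk-insert clause sketch: columns come from the first row's keys,
// placeholders are numbered in row-major order.
function insertClause(rows: Record<string, unknown>[]) {
  const cols = Object.keys(rows[0]); // column order fixed by the first object
  const params: unknown[] = [];
  const tuples = rows.map(
    (row) => "(" + cols.map((c) => { params.push(row[c]); return `$${params.length}`; }).join(", ") + ")"
  );
  return { text: `(${cols.join(", ")}) VALUES ${tuples.join(", ")}`, params };
}

insertClause([
  { name: "Alice", email: "alice@example.com" },
  { name: "Bob", email: "bob@example.com" },
]);
// → text: "(name, email) VALUES ($1, $2), ($3, $4)"
```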

4. Prepared Statements

Pre-parse SQL on the database server for maximum performance in hot loops.

const stmt = await pool.prepare<User>("get_user_by_email", 'SELECT * FROM users WHERE email = ?')
// `Pgtx` automatically maps standard `?` placeholders to native `$1, $2` indexes.

const users = await stmt.execute('[email protected]')
// Statements created via pool.prepare are lazily initialized on each connection upon first use,
// gradually "warming up" the entire pool for peak performance.
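
The `?`-to-`$n` mapping step can be sketched as a simple rewrite (an assumption for illustration; a production parser must also skip `?` inside string literals and comments):

```typescript
// Placeholder mapping sketch: each `?` becomes the next native $n index.
function toNativePlaceholders(sql: string): string {
  let n = 0;
  return sql.replace(/\?/g, () => `$${++n}`);
}

toNativePlaceholders("SELECT * FROM users WHERE email = ? AND active = ?");
// → "SELECT * FROM users WHERE email = $1 AND active = $2"
```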

5. Smart Lists (sql.array)

The ultimate tool for dynamic lists. Works for IN clauses, column lists, or joined conditions.

// 1. Classic IN clause
const ids = [10, 20]
await pool.query`SELECT * FROM users WHERE id IN (${sql.array(ids)})`
// SQL: SELECT * FROM users WHERE id IN ($1, $2)

// 2. Dynamic column list
const cols = [sql.ident('id'), sql.ident('name')]
await pool.query`SELECT ${sql.array(cols)} FROM users`
// SQL: SELECT "id", "name" FROM users

// 3. Dynamic WHERE conditions
const conds = [sql.fragment`active = true`, sql.fragment`age > ${18}`]
await pool.query`SELECT * FROM users WHERE ${sql.array(conds, ' AND ')}`
// SQL: SELECT * FROM users WHERE active = true AND age > $1
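
For the plain-value case, such a list helper could join items with a configurable separator, emitting one placeholder per item (a sketch covering only scalar values, not idents or fragments):

```typescript
// List clause sketch: one placeholder per item, joined by the separator
// (default ", " for IN lists; pass " AND " to join conditions).
function arrayClause(items: unknown[], sep = ", ") {
  if (items.length === 0) throw new Error("Array clause is empty");
  const params: unknown[] = [];
  const text = items
    .map((v) => { params.push(v); return `$${params.length}`; })
    .join(sep);
  return { text, params };
}

arrayClause([10, 20]);
// → text: "$1, $2", params: [10, 20]
```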

6. Dynamic Updates (sql.update)

Easily generate SET clauses from plain JavaScript objects.

const data = { status: 'pro', last_login: new Date() }

await pool.query`UPDATE users SET ${sql.update(data)} WHERE id = ${1}`
// SQL: UPDATE users SET status = $1, last_login = $2 WHERE id = $3
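
One way such a SET-clause generator could be implemented, including the skipping of undefined fields described in the next section (illustrative, not Pgtx's code):

```typescript
// SET-clause sketch: undefined fields are dropped, null is passed through
// as a parameter so the column is explicitly set to NULL.
function updateClause(data: Record<string, unknown>) {
  const entries = Object.entries(data).filter(([, v]) => v !== undefined);
  const text = entries.map(([k], i) => `${k} = $${i + 1}`).join(", ");
  return { text, params: entries.map(([, v]) => v) };
}

updateClause({ status: "pro", age: undefined, deleted_at: null });
// → text: "status = $1, deleted_at = $2", params: ["pro", null]
```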

🔄 Null & Undefined Handling

Pgtx treats null and undefined differently to match SQL semantics and JavaScript expectations:

| Value | In INSERT | In UPDATE | In VALUES/Parameters | In Arrays |
|-------|-----------|-----------|----------------------|-----------|
| null | NULL | NULL | NULL | NULL |
| undefined | DEFAULT | Field skipped | Error | Error |

Examples

// INSERT: undefined becomes DEFAULT
await pool.query`
  INSERT INTO users ${sql.insert({ 
    name: 'Alice', 
    age: undefined,  // -> DEFAULT
    email: null      // -> NULL
  })}
`
// SQL: INSERT INTO users (name, age, email) VALUES ($1, DEFAULT, $2)

// UPDATE: undefined fields are skipped
await pool.query`
  UPDATE users SET ${sql.update({ 
    name: 'Bob',
    age: undefined,  // skipped - age remains unchanged
    deleted_at: null // explicitly set to NULL
  })} WHERE id = 1
`
// SQL: UPDATE users SET name = $1, deleted_at = $2 WHERE id = 1

// Arrays: undefined values throw
sql.array([1, undefined, 3]) // ❌ TypeError(`Array item at index 1 is undefined`)

// Empty arrays throw (use [null] for NULL result)
sql.array([]) // ❌ Error: Array clause is empty. Use sql.array([null])...

🛡️ Security

  • SQL Injection: Automatically uses native placeholders ($1, $2) for all values.
  • Identifiers: sql.ident safely escapes table and column names using double quotes.
  • Connection Leaks: pool.query uses try...finally internally to ensure connections are always returned to the pool.
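
Identifier escaping with double quotes typically follows the standard SQL rule: double any embedded quote, then wrap the whole name. A sketch of that rule (an assumption about the mechanism, not Pgtx's exact code):

```typescript
// Identifier quoting sketch: `"` inside the name is doubled, then the
// whole identifier is wrapped in double quotes.
const ident = (name: string) => `"${name.replace(/"/g, '""')}"`;

ident("user table"); // spaces and case are preserved inside the quotes
ident('weird"col');  // the embedded quote is doubled, preventing breakout
```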