# @asaidimu/utils-database

v1.1.1

A flexible, schema-driven document database layer for browser and Node.js with built-in validation, migrations, transactions, and telemetry.
## Table of Contents

- Overview & Features
- Installation & Setup
- Usage Documentation
- Project Architecture
- Development & Contributing
- Additional Information
## Overview & Features
@asaidimu/utils-database provides a high‑level, storage‑agnostic document database layer that works seamlessly in both browser (IndexedDB) and Node.js (memory) environments. It combines schema validation, optimistic concurrency control, and a flexible migration system to help you manage evolving data models without losing your mind.
The library was built to solve common pain points when working with client‑side storage: schema drift, concurrent edit conflicts, lack of transactions, and poor observability. By wrapping low‑level stores with a unified Collection / Document API, you get the same developer experience whether you are persisting data locally or just mocking storage for tests.
### Key Features

- **Schema-driven validation** – Define collections using `SchemaDefinition` (compatible with Standard Schema) and automatically validate every write.
- **Multi-backend support** – Use IndexedDB for persistent browser storage or an ephemeral in-memory store for testing.
- **Optimistic Concurrency Control (OCC)** – Every document has a `$version` field; updates fail with `CONFLICT` if the version has changed.
- **Atomic transactions** – Group multiple operations across collections and commit them together.
- **Streaming migrations** – Transform data in a memory-efficient way using `ReadableStream` and batched writes.
- **Event system** – Subscribe to database-, collection-, or document-level events (create, update, delete, read).
- **Built-in telemetry** – Track operation duration, errors, and metadata (optional).
- **Middleware pipeline** – Easily add retry logic, logging, or custom behaviour around any operation.
- **Pagination helpers** – Offset- and cursor-based pagination for lists.
## Installation & Setup

### Prerequisites

- Node.js 18+ or a modern browser (for IndexedDB)
- TypeScript 4.5+ (optional, but typings are included)
### Install the package

```bash
npm install @asaidimu/utils-database
```

This package has the following peer dependencies, which you need to install manually:

```bash
npm install @asaidimu/anansi @asaidimu/events
```

### Basic Configuration
The library exports two store factory functions:

- `createIndexedDbStore` – persistent storage using the browser's IndexedDB.
- `createEphemeralStore` – in-memory storage (useful for tests).
Example setup with IndexedDB:

```typescript
import { DatabaseConnection, createIndexedDbStore } from "@asaidimu/utils-database";

const db = await DatabaseConnection(
  {
    name: "my-app-db",
    validate: true,
    enableTelemetry: true,
    predicates: {}, // custom validation predicates (optional)
  },
  createIndexedDbStore
);
```

To verify that everything works, try creating a simple collection:
```typescript
const userSchema = {
  name: "users",
  version: "1.0.0",
  fields: {
    id: { name: "id", type: "string", required: true, unique: true },
    email: { name: "email", type: "string", required: true },
  },
  nestedSchemas: {},
};

const users = await db.createCollection(userSchema);
console.log("Collection ready");
```

## Usage Documentation
### Basic CRUD

Create, read, update, and delete documents with an easy-to-use proxy API.

```typescript
const users = await db.collection<{ id: string; email: string }>("users");

// Create
const doc = await users.create({ id: "user-1", email: "[email protected]" });

// Read a field directly
console.log(doc.email); // "[email protected]"

// Update
await doc.update({ email: "[email protected]" });

// Read the latest data from the store
await doc.read();

// Delete
await doc.delete();
```

Documents automatically receive system properties: `$id` (UUID v7), `$created`, `$updated`, and `$version`.
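The `$version` field drives the optimistic concurrency control described above. As an illustration only (not the library's internal code), an OCC update can be sketched as a compare-and-increment against the stored version:

```typescript
// Illustrative sketch of optimistic concurrency control (not the library's
// actual implementation): an update succeeds only if the caller's $version
// matches the stored one, and the stored version is then incremented.
interface SystemDoc {
  $id: string;
  $version: number;
  [key: string]: unknown;
}

function occUpdate(store: Map<string, SystemDoc>, incoming: SystemDoc): SystemDoc {
  const current = store.get(incoming.$id);
  if (!current) throw new Error("NOT_FOUND");
  if (current.$version !== incoming.$version) throw new Error("CONFLICT");
  const next = { ...current, ...incoming, $version: current.$version + 1 };
  store.set(next.$id, next);
  return next;
}

const store = new Map<string, SystemDoc>();
store.set("u1", { $id: "u1", $version: 1, email: "a" });

// A write against the current version succeeds and bumps $version.
const updated = occUpdate(store, { $id: "u1", $version: 1, email: "b" });

// A stale writer (still holding $version 1) is rejected with CONFLICT.
let conflict = "";
try {
  occUpdate(store, { $id: "u1", $version: 1, email: "c" });
} catch (e) {
  conflict = (e as Error).message;
}
```

This is why a second tab (or a slow async task) holding an old document cannot silently overwrite newer data.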
### Transactions

Group multiple operations across different collections into an atomic unit.

```typescript
await db.transaction(async (tx) => {
  const user = await users.create({ id: "u2", email: "[email protected]" }, tx);
  const profile = await profiles.create({ userId: "u2", bio: "..." }, tx);
  // both writes are committed together
});
```

If any operation throws, the entire transaction is rolled back.
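Conceptually, a transaction stages writes and applies them only on commit. A minimal sketch of that idea, with assumed names (this is not the library's `TransactionContext`):

```typescript
// Conceptual sketch of a buffered transaction: writes are staged and only
// applied to the store on commit; a thrown error discards the whole batch.
type StagedWrite = () => void;

class TxBuffer {
  private staged: StagedWrite[] = [];
  add(write: StagedWrite): void {
    this.staged.push(write);
  }
  commit(): void {
    for (const write of this.staged) write();
    this.staged = [];
  }
  rollback(): void {
    this.staged = [];
  }
}

function runTransaction(fn: (tx: TxBuffer) => void): void {
  const tx = new TxBuffer();
  try {
    fn(tx);
    tx.commit();
  } catch (err) {
    tx.rollback();
    throw err;
  }
}

// A failing transaction leaves the store untouched.
const data = new Map<string, string>();
let failed = false;
try {
  runTransaction((tx) => {
    tx.add(() => data.set("user", "u2"));
    throw new Error("validation failed");
  });
} catch {
  failed = true;
}

// A successful transaction applies every staged write.
runTransaction((tx) => {
  tx.add(() => data.set("user", "u2"));
  tx.add(() => data.set("profile", "p2"));
});
```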
### Migrations

When you change a collection's schema, you can migrate existing documents using a streaming, backpressure-aware process.

```typescript
await db.migrateCollection(
  "users",
  {
    changes: [
      {
        type: "addField",
        id: "isActive",
        definition: { name: "isActive", type: "boolean" },
      },
    ],
    transform: {
      forward: (doc) => ({ ...doc, isActive: true }),
      backward: (doc) => {
        const { isActive, ...rest } = doc;
        return rest;
      },
    },
    description: "Add active flag",
  },
  100 // batch size (optional)
);
```

The migration runs inside a single transaction, updating both the documents and the stored schema definition.
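Since `backward` is what makes a migration reversible, it is worth checking that the two transforms are inverses. A quick standalone check for the transforms used above:

```typescript
// Standalone check that the forward/backward transforms are inverses,
// so the migration above could be rolled back without losing data.
interface UserV1 {
  id: string;
  email: string;
}
interface UserV2 extends UserV1 {
  isActive: boolean;
}

const forward = (doc: UserV1): UserV2 => ({ ...doc, isActive: true });
const backward = ({ isActive, ...rest }: UserV2): UserV1 => rest;

const original: UserV1 = { id: "u1", email: "a" };
const roundTripped = backward(forward(original));
```

If `forward` dropped or renamed a field without `backward` restoring it, this round trip would not return the original document.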
### Events & Telemetry

Subscribe to database, collection, or document events.

```typescript
// Database-level
const unsubDb = await db.subscribe("collection:create", (event) => {
  console.log("New collection", event.schema.name);
});

// Collection-level
const unsubCol = await users.subscribe("document:create", (event) => {
  console.log("Document created", event.data);
});

// Document-level
const unsubDoc = doc.subscribe("document:update", (event) => {
  console.log("Updated at", event.timestamp);
});

// Telemetry (when enableTelemetry: true)
db.subscribe("telemetry", (event) => {
  console.log(`${event.method} took ${event.metadata.performance.durationMs}ms`);
});
```

### Pagination & Filtering
Query documents using the @asaidimu/query filter syntax.

```typescript
// Filter by field
const activeUsers = await users.filter({
  field: "isActive",
  operator: "eq",
  value: true,
});

// Find a single document
const bob = await users.find({
  field: "email",
  operator: "eq",
  value: "[email protected]",
});

// Offset pagination
const iterator = await users.list({
  type: "offset",
  offset: 0,
  limit: 10,
});
const firstPage = await iterator.next();

// Cursor pagination (more efficient for large collections)
const cursorIter = await users.list({
  type: "cursor",
  limit: 10,
  direction: "forward",
});
```

## Project Architecture
### Core Components

- `DatabaseConnection` – The main entry point. It creates a metadata store for schemas, sets up the event bus, and returns a `Database` object.
- `Store` interface – Low-level storage abstraction. Implementations exist for IndexedDB and memory. All store operations work on clones to avoid mutation.
- `Collection` – A logical grouping of documents. It handles validation, indexing, and querying, and wraps each document in a `Document` proxy.
- `Document` – Proxy that exposes both data fields and methods (`update`, `delete`, `read`, `subscribe`). Implements OCC via version checks.
- `Pipeline` & middleware – Intercepts every operation (collection or document) to add retries, telemetry, or custom logic.
- `MigrationEngine` – Computes schema transformations and produces a stream of migrated documents.
- `TransactionContext` – Buffers operations and commits them atomically using store-level batching.
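The "all store operations work on clones" rule means a caller can never mutate stored state by accident. A minimal sketch of that behaviour (the method shape below is assumed for illustration; the real `Store` interface is richer):

```typescript
// Sketch of clone-on-read/clone-on-write storage (assumed shape, not the
// library's actual Store interface): callers always receive copies, so
// mutating a returned document never changes the stored one.
class CloningMemoryStore<T extends { $id: string }> {
  private docs = new Map<string, T>();

  set(doc: T): void {
    this.docs.set(doc.$id, structuredClone(doc));
  }

  get(id: string): T | undefined {
    const doc = this.docs.get(id);
    return doc ? structuredClone(doc) : undefined;
  }

  delete(id: string): boolean {
    return this.docs.delete(id);
  }
}

const cloningStore = new CloningMemoryStore<{ $id: string; tags: string[] }>();
cloningStore.set({ $id: "d1", tags: ["a"] });

// Mutating the returned copy does not leak back into the store.
const copy = cloningStore.get("d1")!;
copy.tags.push("b");
const fresh = cloningStore.get("d1")!;
```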
### Data Flow

1. A user calls `collection.create()` → the request goes through the middleware pipeline.
2. The `createDocument` factory validates the input against the schema.
3. A new document is added to the store and indexed by `IndexManager`.
4. Events are emitted (`document:create`, `collection:read`).
5. All write operations perform OCC: they fetch the latest version, compare, and increment.

For migrations: `migrateCollection` streams documents from the store, passes them through the transformation engine, and writes them back in batches – all without loading the whole collection into memory.
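The batching idea can be shown in isolation. This generic sketch (not the library's `MigrationEngine`) transforms documents one at a time and flushes writes in fixed-size batches, so memory use stays bounded by the batch size:

```typescript
// Generic sketch of batched, streaming migration: documents are transformed
// one by one and written out in fixed-size batches, so at most one batch of
// transformed documents is held in memory at a time.
async function migrateInBatches<T, U>(
  source: Iterable<T> | AsyncIterable<T>,
  transform: (doc: T) => U,
  writeBatch: (batch: U[]) => Promise<void>,
  batchSize = 100
): Promise<number> {
  let batch: U[] = [];
  let total = 0;
  for await (const doc of source) {
    batch.push(transform(doc));
    if (batch.length >= batchSize) {
      await writeBatch(batch);
      total += batch.length;
      batch = [];
    }
  }
  // Flush the final partial batch, if any.
  if (batch.length > 0) {
    await writeBatch(batch);
    total += batch.length;
  }
  return total;
}

// Three documents with a batch size of two produce two flushes.
const flushed: string[][] = [];
const migrated = await migrateInBatches(
  [{ name: "a" }, { name: "b" }, { name: "c" }],
  (doc) => doc.name.toUpperCase(),
  async (batch) => {
    flushed.push(batch);
  },
  2
);
```

Awaiting `writeBatch` before reading more input is what gives the real engine its backpressure: slow writes naturally pause the reads.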
### Extension Points

- **Custom stores** – Implement the `Store<T>` interface and pass your own factory to `DatabaseConnection`.
- **Custom predicates** – Provide extra validation rules via the `predicates` config.
- **Middleware** – Add global or per-collection middleware to the pipeline (e.g., logging, metrics, custom error handling).
- **Event listeners** – React to any internal event to build audit logs, synchronisation, or analytics.
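At its core, middleware is just a function that wraps an async operation and returns a new one. A self-contained sketch of two such wrappers (retry and timing; the names are assumed, not the library's pipeline API):

```typescript
// Self-contained sketch of middleware as operation wrappers (assumed names,
// not the library's pipeline API): each wrapper takes an async operation
// and returns a new operation with extra behaviour around it.
type Op<T> = () => Promise<T>;

function withRetry<T>(op: Op<T>, attempts = 3): Op<T> {
  return async () => {
    let lastError: unknown;
    for (let i = 0; i < attempts; i++) {
      try {
        return await op();
      } catch (err) {
        lastError = err;
      }
    }
    throw lastError;
  };
}

function withTiming<T>(op: Op<T>, onDone: (ms: number) => void): Op<T> {
  return async () => {
    const start = Date.now();
    try {
      return await op();
    } finally {
      onDone(Date.now() - start);
    }
  };
}

// An operation that fails twice before succeeding is rescued by the retry
// wrapper, and the timing wrapper still reports a duration.
let calls = 0;
const flaky: Op<number> = async () => {
  calls++;
  if (calls < 3) throw new Error("transient");
  return 42;
};

let reportedMs = -1;
const wrapped = withTiming(withRetry(flaky, 3), (ms) => {
  reportedMs = ms;
});
const result = await wrapped();
```

Because wrappers compose, ordering matters: here timing measures the whole retry loop, not a single attempt.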
## Development & Contributing

### Development Setup

1. Clone the repository:

   ```bash
   git clone https://github.com/asaidimu/erp-utils.git
   cd erp-utils/src/database
   ```

2. Install dependencies:

   ```bash
   npm install
   ```

3. Build the package:

   ```bash
   npm run build # if defined, otherwise use tsc
   ```
### Available Scripts

| Command | Description |
| --- | --- |
| `npm test` | Run tests once (Vitest) |
| `npm run test:watch` | Run tests in watch mode |
| `npm run test:browser` | Run tests in a real browser (Vitest browser mode) |
### Testing

The library uses Vitest with `fake-indexeddb` to simulate IndexedDB in Node. All store implementations must pass the same conformance test suite (`testStoreImplementation`). When adding a new feature, please include unit tests and ensure the existing tests pass.
### Contributing Guidelines
- Fork the repository and create a feature branch.
- Follow the existing code style (Prettier, ESLint).
- Write clear commit messages following Conventional Commits.
- Open a Pull Request with a description of the changes and why they are needed.
- Ensure all tests pass and coverage does not decrease.
### Issue Reporting
Use the GitHub issue tracker to report bugs or request features. Please include:
- A minimal code reproduction (if possible).
- Expected vs. actual behaviour.
- Environment (Node version, browser, package versions).
## Additional Information

### Troubleshooting

| Error | Likely cause & solution |
| --- | --- |
| `SCHEMA_NOT_FOUND` | The collection was not created. Call `createCollection` first. |
| `CONFLICT` | Another operation updated the document. Re-fetch and retry. |
| `TRANSACTION_FAILED` | One of the batched writes failed. Check the individual operations. |
| `INVALID_DATA` | The document does not match the schema. Review the validation errors. |
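For `CONFLICT` specifically, the usual remedy is to re-read the latest version and re-apply the change. A generic sketch of that retry loop (the error shape and helper names are assumptions for illustration, not the library's API):

```typescript
// Generic re-read-and-retry loop for optimistic-concurrency conflicts
// (error shape and helper names are assumed, not the library's API).
async function updateWithRetry<T>(
  readLatest: () => Promise<T>,
  applyUpdate: (latest: T) => Promise<T>,
  maxRetries = 3
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    const latest = await readLatest();
    try {
      return await applyUpdate(latest);
    } catch (err) {
      const code = (err as { code?: string }).code;
      if (code !== "CONFLICT" || attempt >= maxRetries) throw err;
      // otherwise loop: re-read the latest version and try again
    }
  }
}

// Simulated store: the first write loses a race and raises CONFLICT,
// the second attempt (with a fresh read) succeeds.
let version = 1;
let attempts = 0;
const result = await updateWithRetry(
  async () => ({ $version: version, email: "a" }),
  async (latest) => {
    attempts++;
    if (attempts === 1) {
      version = 2; // someone else updated concurrently
      throw Object.assign(new Error("CONFLICT"), { code: "CONFLICT" });
    }
    return { ...latest, email: "b" };
  }
);
```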
### FAQ

**Q: Can I use this in a Node.js backend?**
A: Yes, with `createEphemeralStore` (in-memory) or a future persistent backend (e.g., SQLite). The current IndexedDB store only works in browsers.

**Q: How do migrations work with IndexedDB?**
A: Migrations only transform data and update the schema metadata. Structural changes (e.g., creating new indexes) still require a database version upgrade (`onupgradeneeded`). The library does not automate that part yet.

**Q: Does telemetry impact performance?**
A: Telemetry adds minimal overhead – a few timestamp reads and an event emission. You can disable it with `enableTelemetry: false`.

**Q: Can I use my own ID generator instead of UUID v7?**
A: Yes – provide a document with your own `$id` field. The library will not overwrite it.
### Changelog

See the [CHANGELOG.md](CHANGELOG.md) for version history.

### License

This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
### Acknowledgments

Built with ❤️ using:

- `@asaidimu/anansi` – schema validation and migrations.
- `@asaidimu/events` – type-safe event bus.
- `@standard-schema/spec` – Standard Schema interoperability.
- `uuid` – for v7 UUIDs.
- `fake-indexeddb` – testing support.
