@asaidimu/indexed v4.0.1: A simple and efficient way to interact with IndexedDB

@asaidimu/indexed


A simple and efficient TypeScript library providing a document-oriented database interface for IndexedDB, complete with robust schema management, powerful querying, flexible pagination, an event-driven architecture, and built-in telemetry.


🚀 Overview & Features

@asaidimu/indexed is a modern TypeScript library designed to simplify interactions with the browser's native IndexedDB. It abstracts away the complexities of low-level IndexedDB APIs, offering a high-level, document-oriented interface that feels similar to popular NoSQL databases. This library empowers developers to manage structured data in the browser with ease, providing robust schema enforcement, flexible querying capabilities, and advanced features like migrations and performance monitoring.

It's ideal for single-page applications, progressive web apps, or any client-side project requiring persistent data storage that goes beyond simple key-value pairs, ensuring data integrity and a streamlined development experience. By providing a familiar API pattern, @asaidimu/indexed significantly reduces the learning curve and boilerplate typically associated with IndexedDB, allowing developers to focus on application logic rather than database mechanics.

✨ Key Features

  • Document-Oriented Interface: Interact with your data using familiar document patterns (create, read, update, delete) and an API that mirrors common NoSQL paradigms.
  • Comprehensive Schema Management: Define detailed data schemas using @asaidimu/anansi, including field types, validation constraints (e.g., minLength, min, unique), and indexes for optimized queries. Supports complex nested schemas.
  • Flexible Data Access: Retrieve documents using find (single document), filter (array of matching documents), and list (paginated documents) methods.
  • Advanced Querying: Leverage the powerful Query DSL (from @asaidimu/query) for expressive and efficient data retrieval, supporting various operators and logical combinations.
  • Pagination Support: Seamlessly paginate through large datasets using both offset-based and cursor-based strategies, returning asynchronous iterators for efficient batch processing.
  • Event-Driven Architecture: Subscribe to database, collection, and document-level events (document:create, document:write, document:update, document:delete, document:read, collection:create, collection:delete, collection:update, collection:read, migrate, telemetry) for real-time monitoring and reactive programming patterns.
  • Built-in Telemetry: Gain insights into database operation performance, arguments, and outcomes with an optional, pluggable telemetry system, crucial for debugging and optimization.
  • Schema Migrations: Define and apply schema changes over time using SchemaChange objects within SchemaDefinition, ensuring data compatibility and evolution across application versions. This leverages @asaidimu/anansi's MigrationEngine for data transformation streams.
  • Automatic ID Generation: New documents automatically receive a unique $id (UUID v4) if not explicitly provided, along with $created, $updated, and $version metadata.
  • TypeScript Support: Full type definitions ensure type safety and an excellent developer experience, with strong interfaces for all API components.

📦 Installation & Setup

Prerequisites

  • Node.js: v18 or higher recommended.
  • Package Manager: npm, yarn, or Bun.
  • Environment: A browser environment supporting IndexedDB. For Node.js testing, a compatible shim like fake-indexeddb and jsdom is used internally.

Installation Steps

To add @asaidimu/indexed to your project, use your preferred package manager:

# Using npm
npm install @asaidimu/indexed

# Using yarn
yarn add @asaidimu/indexed

# Using Bun
bun add @asaidimu/indexed

Configuration

The library requires minimal configuration during the DatabaseConnection initialization. You provide a database name, and can optionally enable telemetry.

import { DatabaseConnection } from '@asaidimu/indexed';

// Basic initialization: Connects to or creates 'myCoolAppDB'
const db = await DatabaseConnection({
    name: 'myCoolAppDB'
});

// Initialization with telemetry enabled: Useful for performance monitoring
const dbWithTelemetry = await DatabaseConnection({
    name: 'myCoolAppDB',
    enableTelemetry: true
});

// Advanced configuration
const dbAdvanced = await DatabaseConnection({
    name: 'myCustomDB',
    indexSchema: '$my_schema_metadata', // Custom name for the internal schema store
    keyPath: 'documentId', // Custom key path for documents (defaults to '$id')
    enableTelemetry: true,
    // Optional: provide custom predicates for schema validation from @asaidimu/anansi
    // predicates: {
    //     isEmail: (value: any) => typeof value === 'string' && value.includes('@')
    // }
    // validate: true // enables validation 
});

The database connection is internally cached, so subsequent calls to DatabaseConnection with the same name will return the existing instance.
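The caching contract can be pictured with a small, self-contained sketch. This is hypothetical: the library's real cache is internal, and the plain object below stands in for an actual Database instance.

```typescript
// Hypothetical sketch of connection caching: repeated lookups by the same
// database name return the same instance, so `===` comparisons succeed.
const connectionCache = new Map<string, { name: string }>();

function getCachedConnection(name: string): { name: string } {
  let conn = connectionCache.get(name);
  if (!conn) {
    conn = { name }; // stand-in for a real Database instance
    connectionCache.set(name, conn);
  }
  return conn;
}

const first = getCachedConnection('myCoolAppDB');
const second = getCachedConnection('myCoolAppDB');
console.log(first === second); // true: same cached instance
```

One practical consequence: calling `close()` on a connection affects every part of your app that obtained it under the same name.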

Verification

After installation, verify the setup by importing and initializing the database in your environment:

import { DatabaseConnection } from '@asaidimu/indexed';

async function verifyInstallation() {
    let db;
    try {
        db = await DatabaseConnection({ name: 'test_db_verification' });
        console.log('IndexedDB Document Store initialized successfully!');
    } catch (error) {
        console.error('Failed to initialize IndexedDB Document Store:', error);
    } finally {
        if (db) {
            db.close(); // Don't forget to close the connection to free resources
            console.log('Database connection closed.');
        }
    }
}

verifyInstallation();

📖 Usage Documentation

Basic Usage

Let's start with a simple example: defining a schema, creating a collection, and performing basic CRUD operations.

import { DatabaseConnection } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi'; // Crucial for robust schema definition

// 1. Define your document interface
// The '$id' property and other metadata ($created, $updated, $version)
// will be automatically added by the library to your Document instances.
interface Product {
    name: string;
    price: number;
    inStock: boolean;
    category?: string;
    tags?: string[];
}

// 2. Define your schema using SchemaDefinition from @asaidimu/anansi
// This schema will dictate the structure and validation rules for documents
// stored in the 'products' collection.
const productSchema: SchemaDefinition = {
    name: 'products',
    version: '1.0.0',
    description: 'Schema for product documents',
    fields: {
        // '$id' is the internal key path for IndexedDB, which defaults to "$id".
        // It's implicitly handled by the library and should not be explicitly defined here.
        name: { type: 'string', required: true, constraints: [{ name: 'minLength', parameters: 3 }] },
        price: { type: 'number', required: true, constraints: [{ name: 'min', parameters: 0 }] },
        inStock: { type: 'boolean', required: true },
        category: { type: 'string', required: false },
        tags: { type: 'array', required: false, itemsType: 'string' }
    },
    indexes: [
        { fields: ['name'], type: 'normal' },
        { fields: ['category'], type: 'normal' },
        { fields: ['price'], type: 'btree' },
        { fields: ['name', 'category'], type: 'composite', unique: true }
    ],
    constraints: [], // Schema-level constraints can be defined here
    migrations: [] // Migration plans for schema evolution
};

async function runExample() {
    let db; // Declare db here to ensure it's accessible in finally block
    try {
        // 3. Connect to the database with telemetry enabled
        db = await DatabaseConnection({ name: 'myCommerceDB', enableTelemetry: true });
        console.log('Database connected.');

        // Subscribe to database-level telemetry for all operations
        const unsubscribeDbTelemetry = db.subscribe("telemetry", (event) => {
            console.log(`[DB Telemetry] ${event.method} (Duration: ${event.metadata.performance.durationMs}ms)`);
        });

        // 4. Create or get your collection (equivalent to an IndexedDB Object Store)
        let productsCollection;
        try {
            productsCollection = await db.collection<Product>('products');
            console.log('Collection "products" already exists. Accessing it.');
        } catch (e: any) {
            // If the collection doesn't exist, create it using the defined schema.
            if (e.type === 'SCHEMA_NOT_FOUND') {
                productsCollection = await db.createCollection<Product>(productSchema);
                console.log('Collection "products" created successfully!');
            } else {
                throw e; // Re-throw other unexpected errors
            }
        }
        
        // Subscribe to collection-level events for the 'products' collection
        const unsubscribeCollectionRead = productsCollection.subscribe("collection:read", (event) => {
          console.log(`[Collection Event] Products collection read via method: '${event.method}'`);
        });

        // 5. Create a new document in the collection
        const newProduct = await productsCollection.create({
            name: 'Laptop Pro X',
            price: 1200.00,
            inStock: true,
            category: 'Electronics',
            tags: ['tech', 'gadget']
        });
        console.log('Created Product:', newProduct);

        // Documents have special meta-properties and methods.
        // '$id' is automatically generated (UUID v4 in this version)
        // AVOID depending on internal variables.
        console.log('New Product ID:', newProduct.$id);
        console.log('Product created at:', newProduct.$created);
        console.log('Product version:', newProduct.$version);

        // Subscribe to document-level events for the new product
        const unsubscribeProductUpdate = await newProduct.subscribe('document:update', (event) => {
            console.log(`[Document Event] Product updated: ID ${event.data?.$id}, new data: ${JSON.stringify(event.data)}`);
        });

        // 6. Find a document by query
        const foundProduct = await productsCollection.find({
            field: 'name', // Querying by the name prop
            operator: 'eq',
            value: newProduct.name
        });
        if (foundProduct) {
            console.log(`Found Product: ${foundProduct.name}`);
            console.log('Current price:', foundProduct.price);

            // 7. Update a document
            const updated = await foundProduct.update({ price: 1150.00, inStock: false });
            if (updated) {
                console.log('Product price updated to:', foundProduct.price);
                // The in-memory document instance is updated immediately,
                // but 'read()' ensures we have the absolute latest from the DB,
                // useful if other parts of the app might have modified it.
                await foundProduct.read();
                console.log('Product in stock status after read:', foundProduct.inStock);
            }

            // 8. Filter documents based on criteria
            await productsCollection.create({ name: 'Mechanical Keyboard', price: 75, inStock: true, category: 'Electronics' });
            await productsCollection.create({ name: 'Ergonomic Mouse', price: 40, inStock: true, category: 'Accessories' });
            await productsCollection.create({ name: 'Webcam 1080p', price: 60, inStock: false, category: 'Accessories' });

            const electronics = await productsCollection.filter({
                field: 'category',
                operator: 'eq',
                value: 'Electronics'
            });
            console.log('Electronics products:', electronics.map(p => p.name));

            // 9. List documents with pagination (offset-based example)
            console.log('Listing all products (offset pagination, 2 items per page):');
            const productIterator = await productsCollection.list({ type: 'offset', offset: 0, limit: 2 });
            let pageNum = 1;
            // Use for await...of to iterate over the async iterator
            for await (const batch of productIterator) {
                if (batch.length === 0) {
                  console.log('No more data.');
                  break;
                }
                console.log(`--- Page ${pageNum++} ---`);
                batch.forEach(p => console.log(`- ${p.name} ($${p.price})`));
            }
            
            // 10. Migrate the collection schema and data
            console.log('\n--- Running Collection Migration ---');
            await db.migrateCollection(productSchema.name, {
                changes: [{ type: "addField", name: "isMigrated", definition: { type: "boolean", default: true } }],
                description: "Add isMigrated field",
                transform: {
                    forward: (doc: any) => ({ ...doc, isMigrated: true }),
                    backward: (doc: any) => {
                        const { isMigrated, ...rest } = doc;
                        return rest;
                    }
                }
            });
            const migratedProduct = await productsCollection.find({ field: 'name', operator: 'eq', value: newProduct.name });
            console.log('Migrated Product (should have isMigrated: true):', migratedProduct?.state());

            // 11. Delete a document
            const deleted = await foundProduct.delete();
            if (deleted) {
                console.log(`Product "${foundProduct.name}" deleted successfully.`);
            }
        } else {
            console.log('Product not found after creation, which is unexpected.');
        }

        // Cleanup subscriptions
        unsubscribeProductUpdate();
        unsubscribeCollectionRead();
        unsubscribeDbTelemetry();

        // 12. Delete the entire collection
        await db.deleteCollection('products');
        console.log('Collection "products" deleted.');

    } catch (error) {
        console.error('An error occurred during the example run:', error);
    } finally {
        // 13. Ensure the database connection is closed
        if (db) {
            db.close();
            console.log('Database connection closed.');
        }
    }
}

runExample();

Database API

The Database interface provides methods for managing collections (equivalent to IndexedDB object stores) and subscribing to global database events.

import { DatabaseConnection } from '@asaidimu/indexed';
import type { Database, Collection, DatabaseEvent, DatabaseEventType, TelemetryEvent } from '@asaidimu/indexed';
import type { SchemaDefinition, PredicateMap, DataTransform, SchemaChange } from '@asaidimu/anansi';

interface DatabaseConfig {
    name: string; // The name of your IndexedDB database
    indexSchema?: string; // Optional: name for the internal schema index store (default: "$schema")
    keyPath?: string; // Optional: key path for all object stores (default: "$id")
    enableTelemetry?: boolean; // Optional: enables performance telemetry (default: false)
    predicates?: PredicateMap; // Optional: custom validation predicates for schemas
    validate?: boolean; // Optional: enables schema validation on data entry (default: false)
}

/**
 * Creates a new database connection or retrieves an existing one from an in-memory cache.
 * This is the primary entry point for interacting with the IndexedDB.
 * Subsequent calls with the same database name will return the cached instance.
 */
function DatabaseConnection(config: DatabaseConfig): Promise<Database>;

interface CollectionMigrationOptions {
  changes: SchemaChange<any>[]; // Array of schema changes to apply
  description: string; // Description of the migration
  rollback?: SchemaChange<any>[]; // Optional rollback changes
  transform?: string | DataTransform<any, any>; // Optional data transform function or string representing it
}

interface Database {
    /**
     * Accesses an existing collection (schema model) by name.
     * @param schemaName - The name of the schema/collection to access.
     * @returns A promise resolving to the schema's Collection instance.
     * @throws DatabaseError if the schema does not exist.
     */
    collection: <T>(schemaName: string) => Promise<Collection<T>>;

    /**
     * Creates a new collection (schema model) in the database.
     * This operation increments the database version to allow for object store creation.
     * @param schema - The schema definition for the new collection.
     * @returns A promise resolving to the created schema's Collection instance.
     * @throws DatabaseError if the schema already exists or is invalid.
     */
    createCollection: <T>(schema: SchemaDefinition) => Promise<Collection<T>>;

    /**
     * Deletes an existing collection (schema model) by name from the database.
     * This operation increments the database version to allow for object store deletion.
     * @param schemaName - The name of the schema/collection to delete.
     * @returns A promise resolving to `true` if successful.
     * @throws DatabaseError if the schema is not found or an internal error occurs.
     */
    deleteCollection: (schemaName: string) => Promise<boolean>;

    /**
     * Updates an existing collection's schema definition in the internal `$schema` metadata store.
     * This method does *not* modify the IndexedDB object store structure itself.
     * For actual structural changes (e.g., adding/removing indexes or object stores),
     * you typically need to manage IndexedDB database version upgrades manually or
     * by leveraging `@asaidimu/anansi`'s migration features combined with this library's `migrateCollection`.
     * @param schema - The updated schema definition.
     * @returns A promise resolving to `true` if successful.
     * @throws DatabaseError if the schema is not found or an internal error occurs.
     */
    updateCollection: (schema: SchemaDefinition) => Promise<boolean>;

    /**
     * Migrates an existing collection's data and updates its schema definition metadata.
     * This function processes data in a streaming fashion, using `@asaidimu/anansi`'s `MigrationEngine`
     * to apply schema changes and data transformations. All data and metadata updates
     * occur within a single atomic IndexedDB transaction.
     * Note: This function focuses on *data transformation* and *metadata updates*,
     * not structural IndexedDB changes which require a database version upgrade handled by `createCollection`/`deleteCollection`.
     * @param name - The name of the collection (IndexedDB object store) to migrate.
     * @param opts - Options for the migration, including schema changes, description, and an optional data transform.
     * @returns A Promise resolving to `true` if the migration completes successfully.
     * @throws DatabaseError if the collection or its schema metadata is missing, or any operation fails.
     */
    migrateCollection: (name: string, opts: CollectionMigrationOptions) => Promise<boolean>;

    /**
     * Subscribes to database-level events.
     * @param event - The event type to subscribe to (e.g., "collection:create", "telemetry").
     * @param callback - The function to call when the event occurs.
     * @returns An unsubscribe function.
     */
    subscribe: (
        event: DatabaseEventType | "telemetry",
        callback: (event: DatabaseEvent | TelemetryEvent) => void
    ) => () => void;

    /**
     * Closes the connection to the underlying IndexedDB database.
     * It's good practice to close connections when no longer needed to free up resources.
     */
    close: () => void;
}

Collection API

A Collection<T> provides methods for managing documents within a specific schema (object store). The generic type T represents the shape of your application data within this collection.

import type { Document, CollectionEvent, CollectionEventType, TelemetryEvent, TelemetryEventType } from '@asaidimu/indexed';
import type { PaginationOptions, QueryFilter } from '@asaidimu/query';

interface Collection<T> {
  /**
   * Finds a single document matching the specified query.
   * @param query - The query filter to apply.
   * @returns A promise resolving to the matching document (as a Document<T> instance) or `null` if not found.
   */
  find: (query: QueryFilter<T>) => Promise<Document<T> | null>;

  /**
   * Lists documents based on the provided pagination options.
   * Supports both offset-based and cursor-based pagination.
   * @param query - The pagination options (e.g., limit, offset, cursor, direction).
   * @returns A promise resolving to an AsyncIterator, which yields arrays of Document<T>.
   */
  list: (query: PaginationOptions) => Promise<AsyncIterator<Document<T>[]>>;

  /**
   * Filters documents based on the provided query and returns all matching documents.
   * @param query - The query filter to apply.
   * @returns A promise resolving to an array of matching Document<T> instances.
   */
  filter: (query: QueryFilter<T>) => Promise<Document<T>[]>;

  /**
   * Creates a new document in this collection.
   * The document is automatically assigned internal metadata like `$id`, `$created`, and `$version` and persisted immediately.
   * @param initial - The initial data for the document.
   * @returns A promise resolving to the newly created Document<T> instance.
   */
  create: (initial: T) => Promise<Document<T>>;

  /**
   * Subscribes to collection-level events.
   * @param event - The event type to subscribe to (e.g., "collection:read", "telemetry").
   * @param callback - The function to call when the event occurs.
   * @returns An unsubscribe function.
   */
  subscribe: (
    event: CollectionEventType | TelemetryEventType,
    callback: (event: CollectionEvent<T> | TelemetryEvent) => void
  ) => () => void;
}

Document API

A Document<T> represents a single record in a collection and provides methods for interacting with that specific document. The generic type T represents your custom data shape, and the library automatically adds internal properties like $id, $created, $updated, and $version.

import type { TelemetryEvent, TelemetryEventType } from '@asaidimu/indexed';

export type Document<T> =
    {
        readonly [K in keyof T]: T[K]; // Your defined document properties, made read-only
    } &
    {
        /**
         * A unique identifier for the document. Automatically generated as a UUID v4
         * if not provided during creation. This is the IndexedDB key.
         */
        $id?: string;

        /**
         * A timestamp indicating when the document was created (ISO 8601 format).
         * Automatically set on creation.
         */
        $created?: string | Date;

        /**
         * A timestamp indicating when the document was last updated (ISO 8601 format).
         * Automatically updated on calls to `update()`.
         */
        $updated?: string | Date;

        /**
         * A number representing how many times the document has changed.
         * Incremented on calls to `update()`.
         */
        $version?: number;

        /**
         * Fetches the latest data for this document from the database.
         * Updates the in-memory document instance to reflect any changes.
         * @returns A promise resolving to `true` if successful and found, or `false` if an error occurs or not found.
         */
        read: () => Promise<boolean>;

        /**
         * Updates the document in the database with the provided partial properties.
         * Also updates the in-memory document instance and increments `$version` and `$updated`.
         * @param props - Partial object containing the fields to update.
         * @returns A promise resolving to `true` if successful, or `false` if an error occurs.
         */
        update: (props: Partial<T>) => Promise<boolean>;

        /**
         * Deletes the document from its collection in the database.
         * @returns A promise resolving to `true` if successful, or `false` if an error occurs.
         */
        delete: () => Promise<boolean>;

        /**
         * Subscribes to document-level events.
         * @param event - The event type to subscribe to (e.g., "document:update", "document:delete", "telemetry").
         * @param callback - The function to call when the event occurs.
         * @returns A promise resolving to an unsubscribe function.
         */
        subscribe: (
            event: DocumentEventType | TelemetryEventType,
            callback: (event: DocumentEvent<T> | TelemetryEvent) => void
        ) => Promise<() => void>;

        /**
         * Returns a structured clone of the current in-memory state of the document.
         * This provides a plain object representation without the document's methods.
         * @returns A deep copy of the document's data.
         */
        state(): T;
    }

/**
 * Event payload for DocumentModel events.
 */
export type DocumentEventType = "document:create" | "document:write" | "document:update" | "document:delete" | "document:read"; // The type of event.
export type DocumentEvent<T> = {
    type: DocumentEventType
    data?: Partial<T>; // The data associated with the event (e.g., updated fields).
    timestamp: number; // The time the event occurred.
};
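To make the payload shape concrete, here is a hand-built event matching the types above. The `Product` interface and the `describeEvent` handler are illustrative, not part of the library's API; the types are copied locally so the snippet stands alone.

```typescript
// Local copies of the event types defined above.
type DocumentEventType =
  | 'document:create' | 'document:write' | 'document:update'
  | 'document:delete' | 'document:read';

type DocumentEvent<T> = {
  type: DocumentEventType;
  data?: Partial<T>;  // e.g. the updated fields
  timestamp: number;  // when the event occurred
};

interface Product { name: string; price: number; }

// A callback like those passed to document.subscribe() might format the
// payload for logging:
function describeEvent(e: DocumentEvent<Product>): string {
  return `${e.type}: ${JSON.stringify(e.data ?? {})}`;
}

const event: DocumentEvent<Product> = {
  type: 'document:update',
  data: { price: 1150 },
  timestamp: Date.now(),
};
console.log(describeEvent(event)); // document:update: {"price":1150}
```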

Schema Definition

@asaidimu/indexed utilizes the SchemaDefinition from the external library @asaidimu/anansi to enforce data integrity and structure. This allows for rich schema definitions, including explicit field types, built-in and custom constraints, indexes for optimized queries, and a mechanism for defining migration plans to evolve your data over time.

For a detailed understanding of SchemaDefinition and its capabilities, please refer to the documentation for @asaidimu/anansi.

Key aspects of SchemaDefinition as used by @asaidimu/indexed include:

  • name: Unique identifier for the collection/schema (corresponds to an IndexedDB object store name).
  • version: Version string for the schema, important for tracking changes.
  • fields: A record defining each field's type (string, number, boolean, array, object, dynamic), required status, constraints, default values, and more.
  • indexes: Definitions for IndexedDB indexes, used for optimized queries on specific fields.
  • constraints: Schema-wide validation rules applied when documents are created or updated.
  • migrations: An array of Migration objects, each detailing atomic SchemaChange operations (e.g., addField, removeField, modifyField, addIndex, removeIndex, addConstraint). These are primarily used by @asaidimu/anansi's MigrationEngine which migrateCollection integrates with.
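To make the forward/backward transform contract concrete, here is a hedged sketch of a migration step shaped like the options passed to migrateCollection in the Basic Usage example. The `discount` field is illustrative; the key property is that `backward` undoes `forward`, so a document round-trips unchanged.

```typescript
// Illustrative migration step: adds a 'discount' field (forward) and
// strips it again (backward) for rollback.
const addDiscountMigration = {
  changes: [
    { type: 'addField', name: 'discount', definition: { type: 'number', default: 0 } },
  ],
  description: 'Add a discount field defaulting to 0',
  transform: {
    // forward: shape existing documents into the new schema
    forward: (doc: any) => ({ ...doc, discount: 0 }),
    // backward: undo the change so the migration can be rolled back
    backward: (doc: any) => {
      const { discount, ...rest } = doc;
      return rest;
    },
  },
};

// forward followed by backward should round-trip a document unchanged
const before = { name: 'Laptop Pro X', price: 1200 };
const after = addDiscountMigration.transform.backward(
  addDiscountMigration.transform.forward(before)
);
console.log(JSON.stringify(after) === JSON.stringify(before)); // true
```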

Querying

The find and filter methods of a Collection utilize the QueryFilter DSL from @asaidimu/query for expressive and flexible data retrieval. This powerful query language allows you to specify conditions, apply logical operators, and target specific fields.

import type { QueryFilter } from '@asaidimu/query';

// QueryFilter structure (simplified):
type QueryFilter<T> = {
    field: keyof T | string; // The field to query on 
    operator: "eq" | "ne" | "gt" | "gte" | "lt" | "lte" | "in" | "nin" | "contains" | "startsWith" | "endsWith" | "exists" | "notExists";
    value?: any; // The value to compare against
} | {
    operator: "and" | "or" | "not" | "nor" | "xor"; // Logical operators for combining conditions
    conditions: QueryFilter<T>[]; // Array of nested query filters
};

// Example usage:
// Find a user by email address
const userByEmail = await usersCollection.find({
    field: 'email',
    operator: 'eq',
    value: '[email protected]'
});

// Filter products that are in stock AND cost less than 100
const affordableInStock = await productsCollection.filter({
    operator: 'and',
    conditions: [
        { field: 'inStock', operator: 'eq', value: true },
        { field: 'price', operator: 'lt', value: 100 }
    ]
});

// Find products with 'laptop' in their name OR are in the 'Electronics' category
const relevantProducts = await productsCollection.filter({
    operator: 'or',
    conditions: [
        { field: 'name', operator: 'contains', value: 'laptop' },
        { field: 'category', operator: 'eq', value: 'Electronics' }
    ]
});
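To pin down what these filters mean, here is a toy, in-memory evaluator for a small subset of the DSL (`eq`, `lt`, `contains`, `and`, `or`). This is not the library's implementation, which evaluates queries against IndexedDB stores and indexes; it only illustrates the semantics.

```typescript
// Toy evaluator for a subset of the QueryFilter DSL. Illustrative only.
type Filter =
  | { field: string; operator: 'eq' | 'lt' | 'contains'; value: any }
  | { operator: 'and' | 'or'; conditions: Filter[] };

function matches(doc: Record<string, any>, q: Filter): boolean {
  if ('field' in q) {
    const v = doc[q.field];
    if (q.operator === 'eq') return v === q.value;
    if (q.operator === 'lt') return v < q.value;
    // 'contains': substring match on string fields
    return typeof v === 'string' && v.includes(q.value);
  }
  // Logical operators recurse over nested conditions.
  const results = q.conditions.map((c) => matches(doc, c));
  return q.operator === 'and' ? results.every(Boolean) : results.some(Boolean);
}

const doc = { name: 'Laptop Pro X', price: 1200, inStock: true };
console.log(matches(doc, { field: 'price', operator: 'lt', value: 1500 })); // true
console.log(matches(doc, {
  operator: 'and',
  conditions: [
    { field: 'inStock', operator: 'eq', value: true },
    { field: 'name', operator: 'contains', value: 'Laptop' },
  ],
})); // true
```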

Pagination

The list method of a Collection provides robust pagination capabilities, allowing you to efficiently retrieve documents in batches. It supports both traditional offset-based pagination and more efficient cursor-based pagination, returning an AsyncIterator for seamless integration into for await...of loops.

import type { PaginationOptions } from '@asaidimu/query';
import type { Collection, Document } from '@asaidimu/indexed';

interface OffsetPaginationOptions {
    type: "offset"; // Specifies offset-based pagination
    offset: number; // The number of documents to skip from the beginning
    limit: number;  // The maximum number of documents to return in a batch
}

interface CursorPaginationOptions {
    type: "cursor";    // Specifies cursor-based pagination
    cursor?: string;   // Optional: The $id of the document to start (or continue) from
    direction: "forward" | "backward"; // The direction of iteration from the cursor
    limit: number;     // The maximum number of documents to return in a batch
}

type PaginationOptions = OffsetPaginationOptions | CursorPaginationOptions;

// Example: Offset-based pagination
async function fetchProductsOffset(productsCollection: Collection<Product>) {
    console.log('\n--- Fetching Products (Offset Pagination) ---');
    let currentPage = 0;
    const pageSize = 2; // Number of items per page

    while (true) {
        // The list method returns an AsyncIterator.
        // Calling .next() on it fetches the next batch based on the pagination options.
        const iterator = await productsCollection.list({
            type: "offset",
            offset: currentPage * pageSize,
            limit: pageSize
        });

        const { value: batch, done } = await iterator.next();

        // Guard: when the iterator is exhausted, `value` may be undefined.
        if (done || !batch || batch.length === 0) {
            console.log('--- End of Offset Pagination ---');
            break;
        }

        console.log(`Page ${currentPage + 1}:`);
        batch.forEach(product => console.log(`- ${product.name} (ID: ${product.$id})`));
        currentPage++;

        // A batch smaller than the page size means this was the last page.
        if (batch.length < pageSize) {
            console.log('--- End of Offset Pagination ---');
            break;
        }
    }
}

// Example: Cursor-based pagination (simple forward iteration)
async function fetchProductsCursor(productsCollection: Collection<Product>) {
    console.log('\n--- Fetching Products (Cursor Pagination) ---');
    let lastProductId: string | undefined = undefined; // Used as the cursor for the next batch
    const pageSize = 2;

    while (true) {
        const iterator = await productsCollection.list({
            type: "cursor",
            cursor: lastProductId,
            direction: "forward", // "forward" for 'next' cursor, "backward" for 'prev'
            limit: pageSize
        });

        const { value: batch, done } = await iterator.next();

        // Guard: when the iterator is exhausted, `value` may be undefined.
        if (done || !batch || batch.length === 0) {
            console.log('--- End of Cursor Pagination ---');
            break;
        }

        console.log('Next Batch:');
        batch.forEach(product => console.log(`- ${product.name} (ID: ${product.$id})`));
        // Advance the cursor to the ID of the last document in this batch
        lastProductId = batch[batch.length - 1].$id;

        // A batch smaller than the page size means we've reached the end.
        if (batch.length < pageSize) {
            console.log('--- End of Cursor Pagination ---');
            break;
        }
    }
}

// To run these examples, ensure you have documents in your 'products' collection.
// e.g., await productsCollection.create({ name: 'Product A', price: 10, inStock: true });
// ... and so on for several products.
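The manual `.next()` loops above can also be wrapped in an async generator so that batches integrate cleanly with `for await...of`, as mentioned earlier. The sketch below is library-agnostic and illustrative: `fetchBatch` is a hypothetical stand-in for a call such as `productsCollection.list(...)` followed by `.next()`, not part of the @asaidimu/indexed API.

```typescript
// Hypothetical sketch: adapts a cursor-paginated fetch function into an
// async generator, flattening batches into a stream of items.
type Page<T> = { items: T[]; nextCursor?: string };

async function* paginate<T>(
    fetchBatch: (cursor?: string) => Promise<Page<T>>,
): AsyncGenerator<T, void, undefined> {
    let cursor: string | undefined = undefined;
    while (true) {
        const { items, nextCursor } = await fetchBatch(cursor);
        if (items.length === 0) return;       // no more data
        yield* items;                          // flatten the batch
        if (nextCursor === undefined) return;  // last page reached
        cursor = nextCursor;
    }
}

// Usage: for await (const product of paginate(fetchBatch)) { ... }
```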

Telemetry

@asaidimu/indexed includes a built-in telemetry system that can be enabled during database initialization. It provides detailed performance metrics and contextual information for database operations, which is useful for debugging, performance monitoring, and analytics.

To enable telemetry when connecting to your database:

import { DatabaseConnection } from '@asaidimu/indexed';

const db = await DatabaseConnection({
    name: 'myAppDB',
    enableTelemetry: true
});

Once enabled, you can subscribe to telemetry events at the Database, Collection, or Document level to capture granular insights:

import type { TelemetryEvent } from '@asaidimu/indexed';

// Subscribe to database-level telemetry: captures all operations at the DB level
const unsubscribeDbTelemetry = db.subscribe("telemetry", (event: TelemetryEvent) => {
    console.log(`[DB Telemetry] Method: ${event.method}`);
    console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
    if (event.metadata.error) {
        console.error(`Error: ${event.metadata.error.message}, Stack: ${event.metadata.error.stack}`);
    }
    console.log('Arguments:', event.metadata.args);
    console.log('Result:', event.metadata.result);
    console.log('Context:', event.metadata.context);
    console.log('Source:', event.metadata.source); // Indicates where the telemetry event originated (db, collection, document)
    console.log('---');
});

// Example usage to trigger DB telemetry
await db.createCollection({ name: 'users', version: '1.0.0', fields: { /* ... */ } });
unsubscribeDbTelemetry(); // Clean up

// Subscribe to collection-level telemetry: specific to operations on a collection
const productsCollection = await db.collection<Product>('products');
const unsubscribeCollectionTelemetry = productsCollection.subscribe("telemetry", (event: TelemetryEvent) => {
    console.log(`[Collection Telemetry - Products] Method: ${event.method}`);
    console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
    console.log('---');
});

// Example usage to trigger Collection telemetry
await productsCollection.create({ name: 'New Gadget', price: 99, inStock: true });
await productsCollection.find({ field: 'name', operator: 'eq', value: 'New Gadget' });
unsubscribeCollectionTelemetry(); // Clean up

// Subscribe to document-level telemetry: for operations on a specific document
const myProduct = await productsCollection.find({ field: 'name', operator: 'eq', value: 'Laptop Pro X' });
if (myProduct) {
    const unsubscribeDocumentTelemetry = await myProduct.subscribe("telemetry", (event: TelemetryEvent) => {
        console.log(`[Document Telemetry - ${myProduct.name}] Method: ${event.method}`);
        console.log(`Duration: ${event.metadata.performance.durationMs}ms`);
        console.log('---');
    });
    await myProduct.update({ price: 1099.99 }); // This will trigger the document-level telemetry event
    unsubscribeDocumentTelemetry(); // Clean up
}

The TelemetryEvent structure provides comprehensive details about each captured operation:

type TelemetryEvent = {
    type: "telemetry";
    method: string; // The name of the method called (e.g., "create", "find", "updateCollection", "update")
    timestamp: number; // Unix timestamp (milliseconds) when the operation completed
    source: any; // Internal source of the event: database, collection, or document level
    metadata: {
        args: any[]; // Arguments passed to the method
        performance: {
            durationMs: number; // Execution duration of the operation in milliseconds
        };
        source: { // Specifies the level and associated entities (collection, document)
            level: "database" | "collection" | "document";
            collection?: string;
            document?: string;
        };
        context: {
            userAgent: string | undefined; // Browser user agent string (from globalThis.navigator?.userAgent)
        };
        result?: {
            type: 'array' | string; // Type of the operation's result (e.g., 'array', 'object', 'number', 'boolean')
            size?: number; // Size if the result is an array (e.g., for list/filter operations)
        };
        error: {
            message: string;
            name: string;
            stack?: string;
        } | null; // Error details (message, name, stack trace) if the operation failed, null otherwise
    };
}
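Because every event carries a method name and performance.durationMs, a stream of telemetry events folds naturally into summary metrics. The following aggregator is illustrative only: it operates on a minimal TelemetryLike shape (an assumption for the sketch; the real TelemetryEvent carries additional fields).

```typescript
// Illustrative sketch (not part of the library): accumulates telemetry-like
// records into per-method call counts, error counts, and average durations.
interface TelemetryLike {
    method: string;
    metadata: { performance: { durationMs: number }; error: unknown };
}

interface MethodStats { calls: number; errors: number; totalMs: number; avgMs: number }

function aggregateTelemetry(events: TelemetryLike[]): Map<string, MethodStats> {
    const stats = new Map<string, MethodStats>();
    for (const e of events) {
        const s = stats.get(e.method) ?? { calls: 0, errors: 0, totalMs: 0, avgMs: 0 };
        s.calls += 1;
        s.totalMs += e.metadata.performance.durationMs;
        if (e.metadata.error !== null) s.errors += 1;
        s.avgMs = s.totalMs / s.calls;
        stats.set(e.method, s);
    }
    return stats;
}
```

A dashboard or logger could feed each incoming telemetry event into such an accumulator rather than storing the raw stream.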

Error Handling

The library provides specific error types to help you handle different failure scenarios gracefully. All custom errors extend from DatabaseError, making it easy to catch and distinguish them from generic JavaScript errors.

import { DatabaseError, DatabaseErrorType } from '@asaidimu/indexed';
import type { Database } from '@asaidimu/indexed';
import type { SchemaDefinition } from '@asaidimu/anansi'; // For the SchemaDefinition type

export enum DatabaseErrorType {
    /** The schema (collection) does not exist when trying to access or modify it. */
    SCHEMA_NOT_FOUND = "SCHEMA_NOT_FOUND",
    /** The schema (collection) already exists when trying to create a new one with the same name. */
    SCHEMA_ALREADY_EXISTS = "SCHEMA_ALREADY_EXISTS",
    /** The provided schema name is invalid (e.g., empty or reserved). */
    INVALID_SCHEMA_NAME = "INVALID_SCHEMA_NAME",
    /** The schema definition itself is malformed or violates validation rules. */
    INVALID_SCHEMA_DEFINITION = "INVALID_SCHEMA_DEFINITION",
    /** An attempt to subscribe to a database event failed. */
    SUBSCRIPTION_FAILED = "SUBSCRIPTION_FAILED",
    /** A generic internal error occurred during a database operation. */
    INTERNAL_ERROR = "INTERNAL_ERROR",
    /** Data being entered into a collection does not satisfy its schema definition. */
    INVALID_DATA = "INVALID_DATA",
}

export class DatabaseError extends Error {
    public type: DatabaseErrorType; // The specific type of database error
    public schema?: SchemaDefinition; // Associated schema if the error relates to a schema operation

    /**
     * Constructs a new DatabaseError instance.
     * @param type - The specific DatabaseErrorType.
     * @param message - A human-readable message describing the error.
     * @param schema - Optional: The SchemaDefinition related to the error, if applicable.
     */
    constructor(type: DatabaseErrorType, message: string, schema?: SchemaDefinition) {
        super(message);
        this.name = type; // Set the error name to the error type for easier identification
        this.type = type;
        this.schema = schema;
    }
}

// Example usage of error handling:
async function safeCreateCollection(db: Database, schema: SchemaDefinition) {
    try {
        await db.createCollection(schema);
        console.log(`Collection "${schema.name}" created successfully.`);
    } catch (error) {
        if (error instanceof DatabaseError) {
            // Handle specific database errors
            switch (error.type) {
                case DatabaseErrorType.SCHEMA_ALREADY_EXISTS:
                    console.warn(`Collection "${schema.name}" already exists. Skipping creation.`);
                    break;
                case DatabaseErrorType.INVALID_SCHEMA_DEFINITION:
                    console.error(`Invalid schema definition for "${schema.name}": ${error.message}`);
                    break;
                case DatabaseErrorType.INVALID_DATA:
                    console.error(`Invalid data provided for "${schema.name}": ${error.message}`);
                    break;
                case DatabaseErrorType.INTERNAL_ERROR:
                    console.error(`An internal database error occurred: ${error.message}`);
                    break;
                case DatabaseErrorType.SCHEMA_NOT_FOUND:
                    console.error(`Schema "${schema.name}" not found: ${error.message}`);
                    break;
                case DatabaseErrorType.SUBSCRIPTION_FAILED:
                    console.error(`Subscription failed: ${error.message}`);
                    break;
                default:
                    console.error(`Unhandled Database Error (${error.type}): ${error.message}`);
            }
        } else {
            // Handle unexpected non-DatabaseError errors
            console.error('An unexpected error occurred:', error);
        }
    }
}

Event System

The library leverages a lightweight event-driven design, allowing you to subscribe to various lifecycle and data-related events across the database, collections, and individual documents. This facilitates reactive programming, real-time updates, and integration with other parts of your application.

Database Events

Emitted from the Database instance. These events provide insights into schema management and database-wide activities.

  • collection:create: Triggered when a new collection has been successfully created.
  • collection:update: Triggered when an existing collection's schema metadata has been modified (e.g., via updateCollection).
  • collection:delete: Triggered when a collection has been successfully removed from the database.
  • collection:read: Triggered when a collection has been accessed via db.collection().
  • migrate: Triggered when a collection migration starts or ends.
  • telemetry: (If enableTelemetry is true) Provides performance and context data for database-level operations.
import { DatabaseConnection } from '@asaidimu/indexed';
import type { DatabaseEvent, DatabaseEventType } from '@asaidimu/indexed';

const db = await DatabaseConnection({ name: 'myAppDB' });

// Example: Log when a new collection is created
db.subscribe("collection:create", (event: DatabaseEvent) => {
    console.log(`[DB Event] New collection created: ${event.schema?.name} at ${new Date(event.timestamp).toLocaleString()}`);
});

// Example: Log when a collection is deleted
db.subscribe("collection:delete", (event: DatabaseEvent) => {
    console.log(`[DB Event] Collection deleted: ${event.schema?.name} at ${new Date(event.timestamp).toLocaleString()}`);
});

// Example: Log when a migration occurs
db.subscribe("migrate", (event: DatabaseEvent) => {
    console.log(`[DB Event] Migration event for schema: ${event.schema?.name}, Type: ${event.type} at ${new Date(event.timestamp).toLocaleString()}`);
});

// To trigger:
// await db.createCollection({ name: 'users', version: '1.0.0', fields: { /* ... */ } });
// await db.deleteCollection('users');
// await db.migrateCollection('myCollection', { changes: [], description: 'test' });
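Since subscribe returns an unsubscribe function, a callback-based subscription can be adapted into a one-shot Promise, e.g. to await the next collection:create event in a test. This generic helper is a sketch, not part of the library; it assumes the subscribe(listener) => unsubscribe shape shown in the examples above.

```typescript
// Hypothetical helper: resolves with the first event delivered to the
// listener, then unsubscribes so no further events are observed.
type Unsubscribe = () => void;

function once<E>(
    subscribe: (listener: (event: E) => void) => Unsubscribe,
): Promise<E> {
    return new Promise<E>((resolve) => {
        let done = false;
        let unsub: Unsubscribe | undefined;
        unsub = subscribe((event) => {
            if (done) return;
            done = true;
            unsub?.();      // stop listening after the first event
            resolve(event);
        });
        if (done) unsub();  // event fired synchronously during subscribe
    });
}

// Usage (hypothetical): const event = await once(cb => db.subscribe("collection:create", cb));
```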

Collection Events

Emitted from a Collection instance. These events provide insights into document lifecycle actions within a specific collection.

  • document:create: Triggered when a new document is successfully created in this collection (often from collection.create()).
  • collection:read: Triggered when documents within this collection have been accessed via find, list, filter methods.
  • migration:start: Triggered when a migration process starts for this specific collection.
  • migration:end: Triggered when a migration process completes for this specific collection.
  • telemetry: (If enableTelemetry is true) Provides performance and context data for collection-level operations (e.g., find, list, filter, create).
import type { Collection, CollectionEvent, CollectionEventType } from '@asaidimu/indexed';
import type { Product } from './your-types-file'; // Assuming Product is defined

const productsCollection: Collection<Product> = await db.collection<Product>('products');

// Example: Log when a document is created in the products collection
productsCollection.subscribe("document:create", (event: CollectionEvent<Product>) => {
    console.log(`[Collection Event] Document created in '${event.model}' collection at ${new Date(event.timestamp).toLocaleString()}. Doc ID: ${event.document?.$id}`);
});

// Example: Log when documents are accessed (find, list, filter)
productsCollection.subscribe("collection:read", (event: CollectionEvent<Product>) => {
    console.log(`[Collection Event] Documents accessed in '${event.model}' collection using method: '${event.method}' at ${new Date(event.timestamp).toLocaleString()}`);
});

// To trigger:
// await productsCollection.create({ name: 'Test Product', price: 10, inStock: true });
// await productsCollection.find({ field: 'name', operator: 'eq', value: 'Test Product' });

Document Events

Emitted from a Document instance. These events provide granular details about changes and access to a specific document.

  • document:create: Triggered just after a new document instance is created and persisted, often from collection.create().
  • document:write: Triggered after a document is initially written to the store (e.g., by collection.create()).
  • document:update: Triggered when the document's properties have been successfully updated. The event payload includes the updated data.
  • document:delete: Triggered when the document has been successfully deleted from the database.
  • document:read: Triggered when the document's data has been read or accessed (e.g., via document.read() or during its initial retrieval/creation by a collection method).
  • telemetry: (If enableTelemetry is true) Provides performance and context data for document-level operations (e.g., read, update, delete).
import type { Document, DocumentEvent, DocumentEventType } from '@asaidimu/indexed';
import type { Product } from './your-types-file'; // Assuming Product is defined

const myProduct: Document<Product> = await productsCollection.create({
    name: 'Book', price: 25, inStock: true
});

// Example: Log when the specific document is updated
const unsubscribeUpdate = await myProduct.subscribe("document:update", (event: DocumentEvent<Product>) => {
    console.log(`[Document Event] Product (ID: ${event.data?.$id}) updated at ${new Date(event.timestamp).toLocaleString()}. New data:`, event.data);
});

// Example: Log when the specific document is deleted
const unsubscribeDelete = await myProduct.subscribe("document:delete", (event: DocumentEvent<Product>) => {
    console.log(`[Document Event] Product (ID: ${event.data?.$id}) deleted at ${new Date(event.timestamp).toLocaleString()}`);
});

// To trigger:
// await myProduct.update({ price: 30 }); // Triggers 'document:update'
// await myProduct.delete(); // Triggers 'document:delete'
// Remember to call unsubscribeUpdate() and unsubscribeDelete() when done.
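When several subscriptions are active at once, as above, their teardown can be bundled into a single call. A small generic helper (not part of the library) makes this explicit:

```typescript
// Generic helper: merges several unsubscribe functions, such as those
// returned by subscribe(), into one idempotent teardown function.
function combineUnsubscribers(...unsubs: Array<() => void>): () => void {
    let called = false;
    return () => {
        if (called) return; // safe to call more than once
        called = true;
        for (const unsub of unsubs) unsub();
    };
}

// Usage: const teardown = combineUnsubscribers(unsubscribeUpdate, unsubscribeDelete);
//        teardown(); // removes both listeners
```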

🏗️ Project Architecture

@asaidimu/indexed is structured to provide a clear separation of concerns, from low-level IndexedDB interactions to high-level document management and event handling.

Core Components

  • DatabaseConnection (src/database.ts):
    • The primary entry point for the library, managing the lifecycle of the IndexedDB connection.
    • Handles IndexedDB versioning for object store creation/deletion.
    • Maintains an internal $schema object store to persist schema definitions, enabling robust schema management and migration.
    • Provides access to Collection instances.
  • Collection<T> (src/document.ts via createDocumentCursor):
    • Represents an abstraction over an IndexedDB object store (a collection of documents).
    • Provides high-level methods (create, find, filter, list) for interacting with documents within that store.
    • Integrates with @asaidimu/query for powerful filtering capabilities and src/paginate.ts for efficient list operations.
    • Manages collection-level events.
  • Document<T> (src/document.ts via createDocument):
    • Represents a single document (record) within a Collection.
    • Automatically injects internal metadata like $id (UUID v4), $created, $updated, and $version.
    • Exposes methods (read, update, delete, state) for manipulating the specific document.
    • Manages document-level events.
  • Store (src/store.ts):
    • A low-level wrapper providing direct, simplified access to IndexedDB's IDBObjectStore operations.
    • Handles IndexedDB transactions (executeTransaction, executeDatabaseTransaction), requests, and cursor management.
    • Used internally by createDocument and createDocumentCursor to perform core database operations.
  • Event Bus (@asaidimu/events):
    • A lightweight, integrated event system used across Database, Collection, and Document instances.
    • Facilitates internal communication and enables external subscriptions for reactive programming and monitoring.
  • Telemetry Proxy (src/utils.ts):
    • A Proxy-based decorator that wraps public API methods (on Database, Collection, and Document instances) if enableTelemetry is true.
    • Transparently captures method calls, execution time, arguments, results, and errors.
    • Emits structured telemetry events to the respective event buses for consumption.

Data Flow

  1. Connection Initialization: DatabaseConnection opens or re-uses an IndexedDB connection. This also ensures the internal $schema object store is created if it doesn't exist, which stores metadata about all user-defined collections.
  2. Schema & Collection Management:
    • db.createCollection(schema): First, validates the provided schema definition using @asaidimu/anansi. If valid, it triggers an IndexedDB version change by reopening the database with an incremented version, allowing a new object store to be created. The schema definition is then saved as a document in the internal $schema store.
    • db.collection(name): Retrieves the schema definition for the requested collection from the $schema store. It then returns a Collection instance, optionally configured with a schema validator if validate is enabled.
  3. Collection Operations:
    • Collection methods (find, list, filter, create) delegate to the low-level Store component specific to their object store.
    • For create, initial data is validated (if validation is enabled) and then passed to createDocument, which uses Store.put to persist the new document.
    • For query operations (find, list, filter), the Store's cursor method iterates records, and the @asaidimu/query's match function applies the filtering logic.
    • All data retrieved via find, list, filter is wrapped into interactive Document instances.
  4. Document Operations:
    • Document methods (read, update, delete) directly call Store methods (e.g., getById, put, delete) using the document's internal $id as the key.
    • document.update() performs schema validation on the updated data if enabled.
  5. Migrations:
    • db.migrateCollection(): Retrieves the current schema for the target collection from $schema store. It then uses @asaidimu/anansi's MigrationEngine to apply defined SchemaChange operations and optional DataTransform functions. Data is streamed out of the target object store, transformed, and streamed back in within a single atomic executeDatabaseTransaction. Finally, the updated schema definition is persisted back to the $schema store.
  6. Event Emission & Telemetry:
    • Throughout these operations, Database, Collection, and Document instances emit relevant lifecycle events (e.g., document:create, document:update, collection:read) via their internal event buses.
    • If enableTelemetry is active, the Telemetry Proxy intercepts public API calls, records performance metrics and context, and emits structured telemetry events before forwarding the original call.
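The Proxy-based interception in step 6 can be sketched generically. This is not the library's actual src/utils.ts implementation, only an illustration of the pattern: each method access is wrapped so the call is timed and reported before the result is returned. The sketch assumes every wrapped method may be awaited, which converts synchronous methods into async ones.

```typescript
// Illustrative sketch of a telemetry proxy: times method calls on an
// object and reports duration and error status via a callback.
type TelemetryReport = { method: string; durationMs: number; error: Error | null };

function withTelemetry<T extends object>(
    target: T,
    report: (event: TelemetryReport) => void,
): T {
    return new Proxy(target, {
        get(obj, prop, receiver) {
            const value = Reflect.get(obj, prop, receiver);
            if (typeof value !== "function") return value; // pass plain properties through
            return async (...args: unknown[]) => {
                const start = Date.now();
                try {
                    const result = await value.apply(obj, args);
                    report({ method: String(prop), durationMs: Date.now() - start, error: null });
                    return result;
                } catch (err) {
                    report({ method: String(prop), durationMs: Date.now() - start, error: err as Error });
                    throw err; // telemetry observes but never swallows failures
                }
            };
        },
    });
}
```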

Extension Points

  • Custom Schema Validation Predicates: Provide a predicates map to DatabaseConnection to extend the validation capabilities of @asaidimu/anansi for your schemas.
  • Telemetry: The pluggable telemetry system allows you to capture and process detailed operation insights for monitoring, debugging, or analytics.
  • Event System: Subscribe to a wide range of database, collection, and document events to implement reactive patterns, integrate with UI updates, or log application activity.

⚙️ Development & Contributing

We welcome contributions! Please read through these guidelines to get started.

Development Setup

  1. Clone the repository:
    git clone https://github.com/asaidimu/indexed.git
    cd indexed
  2. Install dependencies:
    bun install # or npm install or yarn install
  3. Build the project:
    bun run build # Compiles TypeScript source to dist/ for CJS and ESM formats.
    The postbuild script also copies README.md, LICENSE.md, and dist.package.json into the dist/ folder, preparing the package for npm publication.

Scripts

The package.json defines several useful scripts for development, building, and testing:

  • bun ci: Installs project dependencies.
  • bun clean: Removes the dist/ directory, cleaning up build artifacts.
  • bun prebuild: Executes bun clean and bun run .sync-package.ts (a utility to sync version and other details from package.json to dist.package.json).
  • bun build: Compiles TypeScript source files (index.ts) into dist/ for CommonJS (cjs) and ES Module (esm) formats, along with generating TypeScript declaration files (.d.ts).
  • bun postbuild: Copies essential files (README.md, LICENSE.md, dist.package.json) into the dist/ directory, which are included in the published npm package.
  • bun test: Runs unit and integration tests using Vitest in watch mode.
  • bun test:run: Executes all tests once and exits. Suitable for CI/CD pipelines.
  • bun test:debug: Runs tests in debug mode, useful for stepping through code.
  • bun test:ci: An alias for bun test:run, designed for continuous integration environments.

Testing

Tests are written with Vitest and provide comprehensive coverage of the library's functionality. To run the tests:

bun test

This will start Vitest in watch mode, automatically re-running tests on file changes. To run tests once (e.g., for CI or a quick check):

bun test:run

The tests are executed in a Node.js environment, simulating a browser using fake-indexeddb and jsdom. This ensures consistent and fast test execution without requiring a real browser.

Contributing Guidelines

Please review our CONTRIBUTING.md (placeholder link) for detailed information on:

  • Reporting bugs effectively.
  • Suggesting and discussing new features.
  • The process for making pull requests.
  • Our coding standards and commit message conventions (which follow Conventional Commits).

Issue Reporting

If you encounter any bugs, have feature requests, or questions, please open an issue on our GitHub Issues page. Provide as much detail as possible to help us understand and address your concerns.


📚 Additional Information

Troubleshooting

  • Database not opening/upgrading:
    • Ensure your browser supports IndexedDB.
    • When working directly with IndexedDB's indexedDB.open(), ensure you are providing a version number greater than the current version if you intend to create or modify object stores. @asaidimu/indexed handles this internally for createCollection and deleteCollection.
  • "SCHEMA_NOT_FOUND" error:
    • Verify that the collection name you are trying to access with db.collection() or db.deleteCollection() was previously created using db.createCollection().
    • Check for typos in the collection name.
  • "INVALID_DATA" error:
    • This error occurs when data being inserted or updated does not conform to the SchemaDefinition you provided for the collection, and validate: true is set in DatabaseConnection config. Review your data and schema constraints.
  • Data not persisting or updating:
    • Remember that all database operations are asynchronous and return Promises. Always use await or .then() to ensure operations complete and their results are handled.
    • For new documents, collection.create() automatically persists them. For existing Document instances, you must call document.update(props) to persist changes to the database.
  • Asynchronous operations:
    • It's crucial to handle the asynchronous nature of all API calls. Incorrect handling (e.g., missing await) can lead to unexpected behavior or errors.
  • Closing connections:
    • While IndexedDB connections are generally managed by the browser, explicitly calling db.close() when your application no longer needs the database connection is a good practice to free up resources and prevent resource leaks, especially in long-running applications or during testing.

FAQ

Q: Is this library a full-fledged database replacement? A: @asaidimu/indexed provides a robust client-side persistence layer for structured data, making it suitable for many web application needs (e.g., offline capabilities, caching, local data synchronization). It is not a replacement for server-side databases (such as MongoDB or PostgreSQL) but aims to bring a similar document-oriented development experience to the browser's local storage.

Q: How does @asaidimu/indexed handle schema migrations? A: The SchemaDefinition (from @asaidimu/anansi) includes a migrations array where you can define a series of SchemaChange objects. The db.migrateCollection method uses @asaidimu/anansi's MigrationEngine to apply these changes and transform existing data. This process happens atomically within a single IndexedDB transaction, ensuring data integrity during schema evolution.

Q: Can I use this in a Node.js environment? A: IndexedDB is fundamentally a browser API. While this library is written in TypeScript and can be built for Node.js, using it directly in a Node.js server environment requires a polyfill like fake-indexeddb (which is used for testing) to simulate the browser's IndexedDB API. For server-side Node.js applications, a dedicated server-side database solution is generally more appropriate and performant.

Q: How do I handle large datasets with this library? A: IndexedDB itself is designed for significant client-side data storage, capable of holding gigabytes of data. @asaidimu/indexed enhances this with efficient cursor-based iteration and advanced pagination options (list method), making it suitable for managing large datasets by processing them in manageable batches rather than loading everything into memory at once.

Q: How are $id values generated? A: As of version 2.0.0, $id values for new documents are generated using UUID v4. This provides strong uniqueness guarantees without depending on content hashing, simplifying document creation.

Changelog / Roadmap

  • For a detailed history of changes, features, and bug fixes, please refer to the CHANGELOG.md file.
  • A formal roadmap is currently TBD; future considerations include more advanced query features, deeper integration with @asaidimu/anansi for complex schema validation and migrations, and potential performance optimizations through advanced IndexedDB features.

License

This project is licensed under the MIT License. See the LICENSE.md file for full details.

Acknowledgments