shridb
v1.1.1
The .json file & promise-based JavaScript database for Node.js, NW.js, Electron, and more...
Table of Contents
- Features
- Requirements
- Installation
- Quick Start
- API Reference
- Examples
- Testing
- Contributing
- FAQ
- Support
- Changelog
- License
Features
- JSON File-Based Storage - Each table is stored as a separate .json file
- Promise-Based API - All operations return Promises for clean async/await code
- Modern Test Suite - Comprehensive tests using async/await syntax for better readability
- Lodash-Powered - Leverage the power of lodash for data manipulation
- Persistent Storage - Data survives application restarts using fs-extra
- Table Operations - Create, read, update, delete tables easily
- Advanced Queries - Find, filter, sort, group, and aggregate data
- Array-Level Queries - Query nested arrays within objects
- Backup & Restore - Full database backup and restore functionality
- Auto-Increment Primary Keys - Configure auto-increment for any field
- Schema Validation - Define schemas with type checking, required fields, unique constraints, and more
- File Locking / Concurrency Control - Prevent data corruption when multiple processes access the database
- In-Memory Caching - Improve read performance with LRU cache and TTL support
- Data Compression - Reduce storage size with gzip compression for large databases
- Export to CSV/SQL - Enable data portability and integration with other systems
- Cross-Platform - Works with Node.js, NW.js, Electron, and more
Requirements
- Node.js version 12.0.0 or higher
- npm version 6.0.0 or higher (for installation)
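If you want npm to enforce these requirements at install time in your own project, you can declare them in an engines field in package.json (a general npm feature, not something shridb requires):

```json
{
  "engines": {
    "node": ">=12.0.0",
    "npm": ">=6.0.0"
  }
}
```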
Installation
npm install shridb
Or using yarn:
yarn add shridb
Quick Start
const Shridb = require('shridb');
// Initialize database with default settings
const db = new Shridb('myDatabase');
// Or with custom lock settings for concurrency control
const db2 = new Shridb('myDatabase', {
lockTimeout: 10000, // Max time to wait for lock (ms)
lockRetryInterval: 100 // Time between lock retries (ms)
});
// Or with caching enabled for better read performance
const db3 = new Shridb('myDatabase', {
cache: true, // Enable in-memory caching
cacheTTL: 60000, // Cache TTL in ms (default: 1 minute)
cacheMaxSize: 100 // Max tables to cache (default: 100)
});
// Or with compression enabled to reduce storage size
const db4 = new Shridb('myDatabase', {
compress: true, // Enable gzip compression
compressLevel: 6 // Compression level 1-9 (default: 6)
});
// Using async/await (recommended)
async function main() {
try {
// Insert data
await db.insert('users', [
{ id: 1, name: 'John Doe', email: '[email protected]' },
{ id: 2, name: 'Jane Smith', email: '[email protected]' }
]);
// Query data
const user = await db.findOne('users', { name: 'John Doe' });
console.log('Found user:', user);
} catch (err) {
console.error('Error:', err);
}
}
main();
Or using traditional Promise chains:
const Shridb = require('shridb');
// Initialize database
const db = new Shridb('myDatabase');
// Insert data
db.insert('users', [
{ id: 1, name: 'John Doe', email: '[email protected]' },
{ id: 2, name: 'Jane Smith', email: '[email protected]' }
]).then(() => {
// Query data
return db.findOne('users', { name: 'John Doe' });
}).then((user) => {
console.log('Found user:', user);
}).catch((err) => {
console.error('Error:', err);
});
API Reference
Database Methods
| Method | Description |
|--------|-------------|
| new Shridb(databaseName) | Initialize a new database |
| db.getTable(tableName) | Get or create a table |
| db.saveTable(table, tableData) | Save data to a table |
| db.insert(table, data) | Insert data into a table |
| db.removeTable(tableName) | Delete a table |
| db.cleanTable(table) | Clear all data in a table |
| db.setAutoIncrement(table, field, start) | Configure auto-increment |
| db.getAutoIncrement(table, field) | Get current auto-increment value |
| db.resetAutoIncrement(table, field) | Reset auto-increment sequence |
| db.removeAutoIncrement(table, field) | Remove auto-increment configuration |
Schema Validation Methods
| Method | Description |
|--------|-------------|
| db.defineSchema(tableName, schema) | Define a schema for a table |
| db.getSchema(tableName) | Get schema definition |
| db.removeSchema(tableName) | Remove schema definition |
| db.hasSchema(tableName) | Check if schema exists |
Query Methods
| Method | Description |
|--------|-------------|
| db.findOne(table, obj) | Find first matching record |
| db.findAll(table, obj) | Find all matching records |
| db.findIndex(table, obj) | Find index of first match |
| db.findLast(table, obj) | Find last matching record |
| db.findLastIndex(table, obj) | Find index of last match |
| db.query(tableName) | Create a query builder for advanced querying |
| db.paginate(tableName, options) | Paginate results with filtering and sorting |
Update Methods
| Method | Description |
|--------|-------------|
| db.updateOne(table, oldObj, newObj) | Update first match |
| db.updateAll(table, oldObj, newObj) | Update all matches |
| db.removeOne(table, obj) | Remove first match |
| db.pushBefore(table, obj, newObj) | Insert before specified object |
| db.pushAfter(table, obj, newObj) | Insert after specified object |
Aggregation Methods
| Method | Description |
|--------|-------------|
| db.getMax(table, property) | Get maximum value |
| db.getMin(table, property) | Get minimum value |
| db.doSum(table, property) | Sum of property values |
| db.unique(table, property) | Get unique values |
| db.map(table, property) | Map/extract property values |
| db.groupBy(table, value) | Group data by value |
| db.orderBy(table, prop, order) | Sort data |
| db.dateRangeFilter(table, dateProp, start, end) | Filter by date range |
Backup & Restore
| Method | Description |
|--------|-------------|
| db.backupDatabase(destPath, folderName) | Backup database |
| db.restoreDatabase(sourcePath) | Restore database |
Utilities
| Method | Description |
|--------|-------------|
| db.fetchJson(path) | Read JSON file |
| db.utils.trimString(text) | Trim whitespace |
| db.lock(tableName, timeout) | Lock a table for exclusive access |
| db.unlock(tableName) | Unlock a table |
| db.isLocked(tableName) | Check if a table is locked |
| db.withLock(tableName, fn) | Execute function with table lock |
Cache Methods
| Method | Description |
|--------|-------------|
| db.isCacheEnabled() | Check if cache is enabled |
| db.enableCache() | Enable in-memory caching |
| db.disableCache() | Disable in-memory caching |
| db.setCacheTTL(ttl) | Set cache time-to-live |
| db.clearCache(tableName) | Clear cache for a table or all tables |
| db.getCacheStats() | Get cache statistics |
Compression Methods
| Method | Description |
|--------|-------------|
| db.isCompressionEnabled() | Check if compression is enabled |
| db.enableCompression() | Enable gzip compression |
| db.disableCompression() | Disable gzip compression |
| db.setCompressionLevel(level) | Set compression level |
Export Methods
| Method | Description |
|--------|-------------|
| db.exportToCSV(tableName, options) | Export table data to CSV format |
| db.exportToSQL(tableName, options) | Export table data to SQL INSERT statements |
Methods
For the examples below, assume a table named 'products' contains the following data.
[{
"id": 1,
"type": "donut",
"name": "Cake",
"batters": {
"batter": [
{ "id": 1001, "type": "Regular" },
{ "id": 1002, "type": "Chocolate" },
{ "id": 1003, "type": "Blueberry" },
{ "id": 1004, "type": "Devil's Food" }
]
},
"topping": [
{ "id": 5001, "type": "None", "price": 0 },
{ "id": 5002, "type": "Glazed", "price": 85 },
{ "id": 5005, "type": "Sugar", "price": 60 }
],
"price": 470
},
{
"id": 2,
"type": "meal",
"name": "Pizza",
"price": 795
}]
db.getTable(tableName)
Creates or retrieves a table (JSON file). If the table doesn't exist, it creates an empty one.
db.getTable('products').then((tableData) => {
console.log(tableData);
}).catch((err) => {
console.log(err);
});
Parameters:
tableName (String): The name of the table to get or create
db.saveTable(table, tableData)
Saves data to a table (overwrites existing data).
db.getTable('products').then((data) => {
data.push({ id: 3, name: 'Burger', price: 250 });
db.saveTable('products', data);
}).catch((error) => {
console.log(error);
});
Parameters:
table (String): The name of the table
tableData (Array): Data to save
db.insert(table, object/arrayOfObjects)
Insert single or multiple objects into a table.
// Insert single object
db.insert('products', {
id: 3,
name: 'Burger',
price: 250
}).then((data) => {
console.log(data);
}).catch((error) => {
console.log(error);
});
// Insert multiple objects
db.insert('products', [
{ id: 4, name: 'Pasta', price: 300 },
{ id: 5, name: 'Salad', price: 150 }
]).then((data) => {
console.log(data);
});
Parameters:
table (String): The name of the table
object/arrayOfObjects: Single object or array of objects
db.getMax(table/array, property)
Returns the maximum value of a property.
db.getMax('products', 'id').then((maxValue) => {
console.log(maxValue); // 2
}).catch((error) => console.log(error));
// Query nested array
db.getTable('products').then((data) => {
db.getMax(data[0].topping, 'id').then((maxValue) => {
console.log(maxValue); // 5005
});
});
Parameters:
table (String/Array): Table name or array
property (String): Property name
db.getMin(table/array, property)
Returns the minimum value of a property.
db.getMin('products', 'price').then((minValue) => {
console.log(minValue); // 470
});
Parameters:
table (String/Array): Table name or array
property (String): Property name
db.updateOne(table/array, oldObj, newObj)
Updates the first occurrence of a matching object.
db.updateOne('products', { name: 'Cake' }, { name: 'Ice Cake' })
.then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
oldObj (Object): Object to find
newObj (Object): New object or partial object
db.updateAll(table/array, oldObj, newObj)
Updates all matching occurrences.
db.updateAll('products', { type: 'donut' }, { type: 'sweet' })
.then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
oldObj (Object): Object to find
newObj (Object): New object or partial object
db.findOne(table/array, obj)
Returns the first matching record.
db.findOne('products', { name: 'Pizza' }).then((pizza) => {
console.log(pizza);
});
// Chained queries
db.findOne('products', { name: 'Pizza' }).then((pizzaObj) => {
db.findOne(pizzaObj.topping, { type: 'Glazed' }).then(glazed => {
console.log(glazed);
});
});
Parameters:
table (String/Array): Table name or array
obj (Object): Query object
db.findAll(table/array, obj)
Returns all matching records.
// Find by property existence
db.findAll('products', 'id').then((foundData) => {
console.log(foundData);
});
// Find with function
db.findAll('products', (product) => product.price > 400)
.then((foundData) => {
console.log(foundData);
});
Parameters:
table (String/Array): Table name or array
obj (Object/Function): Query object or function
db.findIndex(table/array, obj)
Returns the index of the first matching record.
db.findIndex('products', { name: 'Cake' }).then((index) => {
console.log(index); // 0
});
Parameters:
table (String/Array): Table name or array
obj (Object/Function): Query object or function
db.findLast(table/array, obj)
Returns the last matching record.
db.findLast('products', { type: 'donut' }).then((lastDonut) => {
console.log(lastDonut);
});
Parameters:
table (String/Array): Table name or array
obj (Object/Function): Query object or function
db.findLastIndex(table/array, obj)
Returns the index of the last matching record.
db.findLastIndex('products', { type: 'meal' }).then((index) => {
console.log(index);
});
Parameters:
table (String/Array): Table name or array
obj (Object/Function): Query object or function
db.query(tableName)
Creates a query builder for advanced querying with chainable methods. Supports filtering, sorting, pagination, and more.
// Basic query with chaining
const users = await db.query('users')
.where({ role: 'admin' })
.orderBy('createdAt', 'desc')
.limit(10)
.offset(20)
.exec();
Query Builder Methods:
| Method | Description |
|--------|-------------|
| .where(query) | Filter results by query object or function |
| .orderBy(prop, order) | Sort by property ('asc' or 'desc') |
| .asc(prop) | Sort by property in ascending order |
| .desc(prop) | Sort by property in descending order |
| .limit(n) | Limit number of results |
| .offset(n) | Skip number of results |
| .exec() | Execute query and return results |
| .first() | Execute query and return first result |
| .count() | Return count of matching records |
Examples:
// Filter and sort
const admins = await db.query('users')
.where({ role: 'admin' })
.orderBy('age', 'desc')
.exec();
// Pagination with offset
const page2 = await db.query('users')
.where({ active: true })
.orderBy('name', 'asc')
.limit(10)
.offset(10)
.exec();
// Get first match
const firstAdmin = await db.query('users')
.where({ role: 'admin' })
.orderBy('createdAt', 'asc')
.first();
// Count records
const adminCount = await db.query('users')
.where({ role: 'admin' })
.count();
Parameters:
tableName (String): Table name to query
Returns: ShridbQuery - Query builder instance
db.paginate(tableName, options)
Paginate results with optional filtering and sorting. Returns an object containing paginated data and pagination metadata.
const result = await db.paginate('users', {
page: 1,
limit: 10,
query: { role: 'admin' },
orderBy: 'createdAt',
order: 'desc'
});
console.log(result.data); // Array of records
console.log(result.page); // Current page (1)
console.log(result.limit); // Items per page (10)
console.log(result.total); // Total matching records (25)
console.log(result.totalPages); // Total pages (3)
console.log(result.hasNext); // Has next page (true)
console.log(result.hasPrev); // Has previous page (false)
Parameters:
tableName (String): Table name
options (Object): Pagination options
page (Number): Page number (1-indexed), default: 1
limit (Number): Items per page, default: 10
query (Object/Function): Filter query
orderBy (String): Sort property
order (String): Sort order ('asc' or 'desc'), default: 'asc'
Returns: Promise<Object> - Object with:
data: Array of records for the current page
page: Current page number
limit: Items per page
total: Total number of matching records
totalPages: Total number of pages
hasNext: Boolean indicating if there's a next page
hasPrev: Boolean indicating if there's a previous page
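The pagination metadata follows directly from total, page, and limit. Here is a minimal standalone sketch of how these fields are typically derived (an illustration of the arithmetic, not shridb's actual implementation):

```javascript
// Derive pagination metadata from a filtered record count.
// Mirrors the fields paginate() returns, as a standalone sketch.
function paginationMeta(total, page = 1, limit = 10) {
  const totalPages = Math.ceil(total / limit);
  return {
    page,
    limit,
    total,
    totalPages,
    hasNext: page < totalPages, // more records after this page
    hasPrev: page > 1           // at least one earlier page
  };
}

// 25 matching records, 10 per page, viewing page 1
console.log(paginationMeta(25, 1, 10));
// { page: 1, limit: 10, total: 25, totalPages: 3, hasNext: true, hasPrev: false }
```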
db.removeOne(table/array, obj)
Removes the first matching record.
db.removeOne('products', { name: 'Cake' }).then((removed) => {
console.log(removed);
});
Parameters:
table (String/Array): Table name or array
obj (Object): Object to remove
db.map(tableName, property)
Extracts property values into an array.
db.map('products', 'name').then((names) => {
console.log(names); // ['Cake', 'Pizza']
});
Parameters:
table (String/Array): Table name or array
property (String): Property to extract
db.removeTable(tableName)
Deletes a table from the database.
db.removeTable('products').then((info) => {
console.log(info); // 'products' table removed
});
Parameters:
tableName (String): Table to remove
db.cleanTable(table/array/object)
Clears all data from a table, array, or object.
// Clean table
db.cleanTable('products').then((info) => {
console.log(info); // []
});
// Clean array
db.cleanTable(array).then((info) => {
console.log(info); // []
});
// Clean object
db.cleanTable(object).then((info) => {
console.log(info); // {}
});
Parameters:
table (String/Array/Object): Table, array, or object
db.setAutoIncrement(table, field, start)
Configures auto-increment for a specific field in a table. When auto-increment is configured, inserting objects without that field will automatically assign the next value.
// Configure auto-increment starting from 1000
await db.setAutoIncrement('users', 'id', 1000);
// Insert without id - will auto-assign 1000
await db.insert('users', { name: 'John Doe', email: '[email protected]' });
// Result: { id: 1000, name: 'John Doe', email: '[email protected]' }
// Insert another - will auto-assign 1001
await db.insert('users', { name: 'Jane Smith', email: '[email protected]' });
// Result: { id: 1001, name: 'Jane Smith', email: '[email protected]' }
// Custom field name (not just 'id')
await db.setAutoIncrement('products', 'sku', 100);
await db.insert('products', { name: 'Widget', price: 9.99 });
// Result: { sku: 100, name: 'Widget', price: 9.99 }
Parameters:
table (String): Table name
field (String): Field name to auto-increment
start (Number): Starting value (default: 1)
db.getAutoIncrement(table, field)
Returns the current auto-increment value for a configured field.
const currentValue = await db.getAutoIncrement('users', 'id');
console.log(currentValue); // 1002 (next value to be assigned)
Parameters:
table (String): Table name
field (String): Field name
Returns: Current value or null if not configured
db.resetAutoIncrement(table, field)
Resets the auto-increment sequence back to its start value.
// After several inserts, reset back to start
await db.resetAutoIncrement('users', 'id');
// Next insert will use the start value again
await db.insert('users', { name: 'Reset User', email: '[email protected]' });
// Result: { id: 1000, name: 'Reset User', email: '[email protected]' }
Parameters:
table (String): Table name
field (String): Field name
db.removeAutoIncrement(table, field)
Removes the auto-increment configuration for a field.
await db.removeAutoIncrement('users', 'id');
// Auto-increment is disabled, inserts won't auto-assign values
Parameters:
table (String): Table name
field (String): Field name
db.defineSchema(tableName, schema)
Defines a schema for a table to enforce data validation. Schemas are stored in _schemas.json and automatically validate data during insert and update operations.
// Define a schema with multiple validation rules
await db.defineSchema('users', {
id: { type: 'number', primary: true },
name: { type: 'string', required: true, minLength: 2, maxLength: 50 },
email: { type: 'string', unique: true },
age: { type: 'number', min: 0, max: 150 },
role: { type: 'string', enum: ['admin', 'user', 'guest'] }
});
Schema Field Options:
| Option | Type | Description |
|--------|------|-------------|
| type | String | Data type: 'string', 'number', 'boolean', 'object', 'array', 'null' |
| required | Boolean | If true, field must be present and non-empty |
| primary | Boolean | Marks field as primary key (implies unique) |
| unique | Boolean | Ensures field value is unique across all records |
| min | Number | Minimum value (for numbers) |
| max | Number | Maximum value (for numbers) |
| minLength | Number | Minimum string/array length |
| maxLength | Number | Maximum string/array length |
| minItems | Number | Minimum array items |
| maxItems | Number | Maximum array items |
| pattern | String | Regex pattern (for strings) |
| enum | Array | Allowed values list |
Parameters:
tableName (String): Table name
schema (Object): Schema definition object
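To make the validation rules above concrete, here is a minimal standalone validator covering just the type, required, and enum options (a sketch of the concept only, not shridb's internal validator, which supports the full option set):

```javascript
// Minimal validator for three schema rules: type, required, enum.
// Returns an array of error messages (empty array means valid).
function validate(schema, record) {
  const errors = [];
  for (const [field, rules] of Object.entries(schema)) {
    const value = record[field];
    if (rules.required && (value === undefined || value === null || value === '')) {
      errors.push(`Field '${field}' is required`);
      continue;
    }
    if (value === undefined) continue; // optional field absent: nothing to check
    if (rules.type && typeof value !== rules.type) {
      errors.push(`Field '${field}' must be of type ${rules.type}`);
    }
    if (rules.enum && !rules.enum.includes(value)) {
      errors.push(`Field '${field}' must be one of: ${rules.enum.join(', ')}`);
    }
  }
  return errors;
}

const userSchema = {
  name: { type: 'string', required: true },
  role: { type: 'string', enum: ['admin', 'user', 'guest'] }
};
console.log(validate(userSchema, { name: 'John', role: 'admin' })); // []
console.log(validate(userSchema, { role: 'superadmin' }));
// [ "Field 'name' is required", "Field 'role' must be one of: admin, user, guest" ]
```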
db.getSchema(tableName)
Returns the schema definition for a table.
const schema = await db.getSchema('users');
console.log(schema);
// { id: { type: 'number', primary: true }, name: { type: 'string', required: true }, ... }
Parameters:
tableName (String): Table name
Returns: Schema object or null if not defined
db.removeSchema(tableName)
Removes the schema definition from a table.
await db.removeSchema('users');
// Schema is removed, validation is no longer enforced
Parameters:
tableName (String): Table name
db.hasSchema(tableName)
Checks if a table has a schema defined.
const hasSchema = await db.hasSchema('users');
console.log(hasSchema); // true or false
Parameters:
tableName (String): Table name
Returns: Boolean
Schema Validation Example
Here's a complete example demonstrating schema validation:
const Shridb = require('shridb');
const db = new Shridb('myApp');
async function main() {
// Define schema with validation rules
await db.defineSchema('users', {
id: { type: 'number', primary: true },
name: { type: 'string', required: true, minLength: 2 },
email: { type: 'string', unique: true },
age: { type: 'number', min: 0, max: 150 },
role: { type: 'string', enum: ['admin', 'user', 'guest'] }
});
// Insert valid data
await db.insert('users', {
id: 1,
name: 'John Doe',
email: '[email protected]',
age: 25,
role: 'admin'
});
// This will throw validation error - missing required field
try {
await db.insert('users', { id: 2, email: '[email protected]' });
} catch (error) {
console.log(error.message); // Validation failed: Field 'name' is required
}
// This will throw validation error - duplicate unique field
try {
await db.insert('users', {
id: 2,
name: 'Jane',
email: '[email protected]', // duplicate!
age: 30,
role: 'user'
});
} catch (error) {
console.log(error.message); // Validation failed: Field 'email' must be unique
}
// This will throw validation error - invalid enum value
try {
await db.insert('users', {
id: 2,
name: 'Jane',
email: '[email protected]',
age: 30,
role: 'superadmin' // not in enum!
});
} catch (error) {
console.log(error.message); // Validation failed: Field 'role' must be one of: admin, user, guest
}
// Insert valid data
await db.insert('users', {
id: 2,
name: 'Jane',
email: '[email protected]',
age: 30,
role: 'user'
});
// Update with validation
await db.updateOne('users', { id: 1 }, { age: 26 }); // Valid update
// This will throw validation error - unique constraint on update
try {
await db.updateOne('users', { id: 2 }, { email: '[email protected]' });
} catch (error) {
console.log(error.message); // Validation failed: Field 'email' must be unique
}
}
main();
db.fetchJson(path)
Reads and returns data from a JSON file.
db.fetchJson('path/to/file.json').then((data) => {
console.log(data);
});
Parameters:
path (String): Path to JSON file
db.doSum(table/array, property)
Returns the sum of a numeric property.
db.doSum('products', 'price').then((total) => {
console.log(total); // 1265
});
// Nested array
db.getTable('products').then((data) => {
db.doSum(data[0].topping, 'price').then((total) => {
console.log(total); // 145
});
});
Parameters:
table (String/Array): Table name or array
property (String): Property to sum
db.groupBy(table/array, value)
Groups data by a property value.
db.groupBy('products', 'type').then((grouped) => {
console.log(grouped);
// { donut: [...], meal: [...] }
});
Parameters:
table (String/Array): Table name or array
value (String): Property to group by
db.orderBy(table/array, sortProperty, orderBy)
Sorts data by property.
// Single property descending
db.orderBy('products', 'id', 'desc').then((data) => {
console.log(data);
});
// Multiple properties
db.orderBy('products', ['price', 'id'], ['asc', 'desc']).then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
sortProperty (String/Array): Property or array of properties
orderBy (String/Array): 'asc' or 'desc'
db.unique(table, property)
Returns unique values of a property.
db.unique('products', 'type').then((uniqueTypes) => {
console.log(uniqueTypes); // ['donut', 'meal']
});
Parameters:
table (String/Array): Table name or array
property (String): Property name
db.pushBefore(table, object, newObject)
Inserts a new object before an existing object.
db.pushBefore('products', { name: 'Pizza' }, { id: 3, name: 'Burger' })
.then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
object (Object): Reference object
newObject (Object): Object to insert
db.pushAfter(table, object, newObject)
Inserts a new object after an existing object.
db.pushAfter('products', { name: 'Cake' }, { id: 3, name: 'Muffin' })
.then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
object (Object): Reference object
newObject (Object): Object to insert
db.backupDatabase(destinationPath, folderName)
Creates a backup of the database.
db.backupDatabase('./backups', 'myBackup-2024')
.then((info) => {
console.log(info); // Database backup successful
})
.catch((error) => console.log(error));
Parameters:
destinationPath (String): Directory for backup
folderName (String): Backup folder name
db.restoreDatabase(sourcePath)
Restores the database from a backup.
db.restoreDatabase('./backups/myBackup-2024')
.then((info) => {
console.log(info); // Database restored successfully
})
.catch((error) => console.log(error));
Parameters:
sourcePath (String): Path to backup folder
db.dateRangeFilter(table, dateProperty, startDate, endDate)
Filters data by date range.
const employees = [
{ id: 1, name: 'John', joinDate: '2019-07-14T12:00:00Z' },
{ id: 2, name: 'Jane', joinDate: '2019-06-10T12:00:00Z' },
{ id: 3, name: 'Bob', joinDate: '2020-01-01T12:00:00Z' }
];
db.dateRangeFilter(employees, 'joinDate', '2019-01-01', '2019-12-31')
.then((data) => {
console.log(data);
});
Parameters:
table (String/Array): Table name or array
dateProperty (String): Date property name
startDate (Date/String): Start date
endDate (Date/String): End date
db.utils.trimString(text)
Removes leading, trailing, and excess whitespace.
var name = db.utils.trimString(' John Doe ');
console.log(name); // 'John Doe'
Parameters:
text (String): Text to trim
db.lock(tableName, timeout)
Locks a table for exclusive access. This prevents other processes from writing to the table until it's unlocked. All write operations (insert, updateOne, updateAll, removeOne, removeAll, saveTable, cleanTable, removeTable, pushAfter, pushBefore) automatically acquire and release locks.
// Lock a table for exclusive access
await db.lock('users');
// Perform multiple operations atomically
await db.insert('users', { name: 'John' });
await db.insert('users', { name: 'Jane' });
// Unlock when done
await db.unlock('users');
// Or with custom timeout (5 seconds)
await db.lock('users', 5000);
Parameters:
tableName (String): Table name to lock
timeout (Number, optional): Custom timeout in milliseconds (default: 5000)
Returns: true if lock acquired
db.unlock(tableName)
Releases a lock on a table.
await db.unlock('users');
Parameters:
tableName (String): Table name to unlock
Returns: true if unlock successful
db.isLocked(tableName)
Checks if a table is currently locked.
const locked = await db.isLocked('users');
console.log(locked); // true or false
Parameters:
tableName (String): Table name to check
Returns: Boolean
db.withLock(tableName, fn)
Executes an async function with a table lock. This is useful for wrapping multiple operations in a single atomic transaction.
// Perform multiple operations atomically
const result = await db.withLock('users', async () => {
// These operations will be protected by the lock
await db.insert('users', { name: 'John' });
await db.insert('users', { name: 'Jane' });
const all = await db.findAll('users');
return all.length;
});
console.log(result); // 2
Parameters:
tableName (String): Table name to lock
fn (Function): Async function to execute
Returns: Result of the function
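Conceptually, withLock serializes writers by queuing each task behind the previous one for the same table. A self-contained sketch of that idea (an in-process illustration only; shridb's real locking also coordinates across processes):

```javascript
// In-process mutex: each table name maps to the tail of a promise chain.
// New work for a table is queued behind the previous task for that table.
const tails = new Map();

function withTableLock(tableName, fn) {
  const prev = tails.get(tableName) || Promise.resolve();
  const next = prev.then(fn, fn); // run only after the previous task settles
  // Keep the chain alive even if fn rejects, so later callers still run.
  tails.set(tableName, next.catch(() => {}));
  return next;
}

// Two concurrent "writes" to the same table run strictly in order.
const order = [];
withTableLock('users', async () => { order.push('first'); });
withTableLock('users', async () => { order.push('second'); });
```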
db.isCacheEnabled()
Checks if in-memory caching is currently enabled.
const enabled = await db.isCacheEnabled();
console.log(enabled); // true or false
Returns: Boolean
db.enableCache()
Enables in-memory caching for the database. When enabled, frequently accessed table data will be cached in memory for improved read performance.
await db.enableCache();
Returns: true if cache is now enabled
db.disableCache()
Disables in-memory caching for the database.
await db.disableCache();
Returns: true if cache is now disabled
db.setCacheTTL(ttl)
Sets the cache time-to-live (TTL) in milliseconds. This determines how long cached data remains valid before being refreshed from disk.
// Set cache TTL to 30 seconds
await db.setCacheTTL(30000);
// Set cache TTL to 5 minutes
await db.setCacheTTL(300000);
Parameters:
ttl (Number): TTL in milliseconds
Returns: true if TTL was set
Throws: Error if TTL is not a non-negative number
db.clearCache(tableName)
Clears the cache for a specific table or all tables. This is useful when you want to force a fresh read from disk.
// Clear cache for a specific table
await db.clearCache('users');
// Clear cache for all tables
await db.clearCache();
Parameters:
tableName (String, optional): Table name to clear cache for. If not provided, clears all cache.
Returns: Number of cache entries cleared
db.getCacheStats()
Returns cache statistics including enabled status, TTL, size, and cached tables.
const stats = await db.getCacheStats();
console.log(stats);
// {
// enabled: true,
// ttl: 60000,
// maxSize: 100,
// currentSize: 2,
// cachedTables: ['users', 'products']
// }
Returns: Promise - Object with:
enabled: Boolean indicating if cache is enabled
ttl: Cache TTL in milliseconds
maxSize: Maximum cache size
currentSize: Current number of cached tables
cachedTables: Array of cached table names
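The cache described above behaves like a small LRU map with per-entry expiry. Here is a standalone sketch of that underlying idea (illustrative only; not shridb's actual cache class):

```javascript
// Tiny LRU cache with TTL, illustrating the caching behaviour described above.
// A Map preserves insertion order, so its first key is the least recently used.
class TtlLruCache {
  constructor(maxSize = 100, ttl = 60000) {
    this.maxSize = maxSize;
    this.ttl = ttl;
    this.entries = new Map(); // key -> { value, expires }
  }
  get(key) {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) { // expired: drop it and report a miss
      this.entries.delete(key);
      return undefined;
    }
    this.entries.delete(key);         // re-insert to refresh recency
    this.entries.set(key, entry);
    return entry.value;
  }
  set(key, value) {
    if (this.entries.size >= this.maxSize && !this.entries.has(key)) {
      // Evict the least recently used entry (first key in the Map).
      this.entries.delete(this.entries.keys().next().value);
    }
    this.entries.set(key, { value, expires: Date.now() + this.ttl });
  }
}

const cache = new TtlLruCache(2, 60000);
cache.set('users', [{ id: 1 }]);
cache.set('products', [{ id: 2 }]);
cache.set('orders', [{ id: 3 }]);  // evicts 'users' (capacity is 2)
console.log(cache.get('users'));   // undefined
```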
db.isCompressionEnabled()
Checks if gzip compression is currently enabled.
const enabled = await db.isCompressionEnabled();
console.log(enabled); // true or falseReturns: Boolean
db.enableCompression()
Enables gzip compression for table files. When enabled, all table data will be compressed using gzip before being written to disk, reducing storage size.
await db.enableCompression();
Returns: true if compression is now enabled
Note: When compression is enabled, table files will be saved with .json.gz extension. The database will automatically handle reading and writing compressed files.
db.disableCompression()
Disables gzip compression for table files.
await db.disableCompression();
Returns: true if compression is now disabled
db.setCompressionLevel(level)
Sets the gzip compression level. Higher levels provide better compression but take more time.
// Set to maximum compression (slower but smallest file size)
await db.setCompressionLevel(9);
// Set to fastest compression (larger files but faster)
await db.setCompressionLevel(1);
// Default level 6 (balanced)
await db.setCompressionLevel(6);
Parameters:
level (Number): Compression level from 1 (fastest) to 9 (best compression)
Returns: true if level was set
Throws: Error if level is not a number between 1 and 9
Data Compression Example
Here's a complete example demonstrating data compression:
const Shridb = require('shridb');
// Create database with compression enabled
const db = new Shridb('myApp', {
compress: true, // Enable gzip compression
compressLevel: 6 // Compression level (1-9, default: 6)
});
async function compressionExample() {
// Check if compression is enabled
console.log('Compression enabled:', await db.isCompressionEnabled());
// Insert data - will be automatically compressed
await db.insert('users', [
{ name: 'John Doe', email: '[email protected]', bio: 'Lorem ipsum dolor sit amet...' },
{ name: 'Jane Smith', email: '[email protected]', bio: 'Consectetur adipiscing elit...' }
]);
// Data is automatically decompressed when reading
const users = await db.getTable('users');
console.log('Users:', users);
// Dynamic compression control
await db.disableCompression();
await db.enableCompression();
// Change compression level
await db.setCompressionLevel(9); // Maximum compression
}
compressionExample();
Compression Features:
- Uses gzip compression to reduce storage size
- Compressed files use .json.gz extension
- Automatic decompression when reading data
- Configurable compression level (1-9)
- System files (_schemas.json, _sequences.json) are not compressed
- Works with all write operations (insert, update, delete, etc.)
db.exportToCSV(tableName, options)
Export table data to CSV format. Supports exporting to a file or returning as a string.
```js
// Export to a CSV string
const csv = await db.exportToCSV('users');
console.log(csv);
// id,name,email,age
// 1,John Doe,john@example.com,25
// 2,Jane Smith,jane@example.com,30

// Export to a file
await db.exportToCSV('users', { filePath: './exports/users.csv' });

// Export with a custom delimiter (TSV)
const tsv = await db.exportToCSV('users', { delimiter: '\t' });

// Export specific fields only
const csvFields = await db.exportToCSV('users', { fields: ['id', 'name', 'email'] });

// Export without a header row
const csvNoHeader = await db.exportToCSV('users', { includeHeader: false });

// Export an array directly
const data = [{ name: 'John', age: 25 }, { name: 'Jane', age: 30 }];
const csvFromArray = await db.exportToCSV(data);
```

Parameters:
- `tableName` (String/Array): Table name or array of objects
- `options` (Object): Export options
  - `filePath` (String): Optional file path to save the CSV
  - `delimiter` (String): CSV delimiter character (default: `','`)
  - `includeHeader` (Boolean): Include header row (default: `true`)
  - `fields` (Array): Specific fields to export (exports all if not provided)
Returns: Promise<String> - CSV string or file path message
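A CSV exporter must quote any field containing the delimiter, double quotes, or newlines, and double embedded quotes (RFC 4180). A minimal, dependency-free sketch of that escaping logic (illustrative only, not shridb's actual implementation):

```javascript
// Minimal CSV serializer sketch (hypothetical, not shridb's internals).
// Fields containing the delimiter, quotes, or newlines are wrapped in
// double quotes, with embedded quotes doubled per RFC 4180.
function toCSV(rows, { delimiter = ',', includeHeader = true, fields } = {}) {
  const cols = fields || Object.keys(rows[0] || {});
  const escape = (v) => {
    const s = v === null || v === undefined ? '' : String(v);
    return /["\n\r]/.test(s) || s.includes(delimiter)
      ? `"${s.replace(/"/g, '""')}"`
      : s;
  };
  const lines = rows.map(r => cols.map(c => escape(r[c])).join(delimiter));
  if (includeHeader) lines.unshift(cols.join(delimiter));
  return lines.join('\n');
}

console.log(toCSV([
  { id: 1, name: 'Doe, John' },
  { id: 2, name: 'Jane "JJ" Smith' }
]));
// id,name
// 1,"Doe, John"
// 2,"Jane ""JJ"" Smith"
```

Without this quoting step, a comma inside a name would silently shift every subsequent column, which is why exporters treat escaping as non-optional.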
db.exportToSQL(tableName, options)
Export table data to SQL INSERT statements. Supports multiple database types (SQLite, MySQL, PostgreSQL).
```js
// Export to a SQL string (SQLite format by default)
const sql = await db.exportToSQL('users');
console.log(sql);
// INSERT INTO users (id, name, email, age) VALUES (1, 'John Doe', 'john@example.com', 25);
// INSERT INTO users (id, name, email, age) VALUES (2, 'Jane Smith', 'jane@example.com', 30);

// Export to a file
await db.exportToSQL('users', { filePath: './exports/users.sql' });

// Export with a CREATE TABLE statement
const sqlWithSchema = await db.exportToSQL('users', { includeSchema: true });
// CREATE TABLE IF NOT EXISTS users (
//   id INTEGER,
//   name TEXT,
//   email TEXT,
//   age INTEGER
// );
// INSERT INTO users ...

// Export for MySQL
const mysqlSql = await db.exportToSQL('users', { database: 'mysql' });

// Export for PostgreSQL
const pgSql = await db.exportToSQL('users', { database: 'postgresql' });

// Export with a custom table name
const renamedSql = await db.exportToSQL('users', { tableName: 'app_users' });

// Export an array directly
const data = [{ name: 'John', age: 25 }, { name: 'Jane', age: 30 }];
const sqlFromArray = await db.exportToSQL(data);
```

Parameters:
- `tableName` (String/Array): Table name or array of objects
- `options` (Object): Export options
  - `filePath` (String): Optional file path to save the SQL
  - `tableName` (String): Custom table name for the SQL (uses the original if not provided)
  - `includeSchema` (Boolean): Include a CREATE TABLE statement (default: `false`)
  - `database` (String): Database type: `'sqlite'`, `'mysql'`, `'postgresql'` (default: `'sqlite'`)
Returns: Promise<String> - SQL string or file path message
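Generating INSERT statements mostly comes down to mapping JavaScript values to SQL literals: quoting strings (doubling embedded single quotes), passing numbers through, and emitting NULL for missing values. A dependency-free sketch of that mapping (illustrative only, not shridb's actual code):

```javascript
// Hypothetical sketch of SQL INSERT generation (not shridb's internals).
// Strings are single-quoted with embedded quotes doubled; numbers pass
// through; null/undefined become NULL; nested objects are JSON-stringified.
function toSQLValue(v) {
  if (v === null || v === undefined) return 'NULL';
  if (typeof v === 'number') return String(v);
  if (typeof v === 'boolean') return v ? '1' : '0';
  const s = typeof v === 'object' ? JSON.stringify(v) : String(v);
  return `'${s.replace(/'/g, "''")}'`;
}

function toInsertStatements(table, rows) {
  return rows.map(r => {
    const cols = Object.keys(r);
    const vals = cols.map(c => toSQLValue(r[c])).join(', ');
    return `INSERT INTO ${table} (${cols.join(', ')}) VALUES (${vals});`;
  }).join('\n');
}

console.log(toInsertStatements('users', [{ id: 1, name: "O'Brien" }]));
// INSERT INTO users (id, name) VALUES (1, 'O''Brien');
```

Dialect differences (e.g. identifier quoting, boolean literals) would layer on top of this; the `database` option presumably handles those variations.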
Export to CSV/SQL Example
Here's a complete example demonstrating data export:
```js
const Shridb = require('shridb');

const db = new Shridb('myApp');

async function exportExample() {
  // Insert sample data
  await db.insert('users', [
    { id: 1, name: 'John Doe', email: 'john@example.com', age: 25 },
    { id: 2, name: 'Jane Smith', email: 'jane@example.com', age: 30 },
    { id: 3, name: 'Bob Wilson', email: 'bob@example.com', age: 35 }
  ]);

  // Export to CSV
  const csv = await db.exportToCSV('users');
  console.log('CSV Export:');
  console.log(csv);

  // Export to a CSV file
  await db.exportToCSV('users', { filePath: './exports/users.csv' });

  // Export specific fields to CSV
  const csvPartial = await db.exportToCSV('users', {
    fields: ['id', 'name', 'email']
  });
  console.log('Partial CSV:', csvPartial);

  // Export to SQL with schema
  const sql = await db.exportToSQL('users', {
    includeSchema: true,
    database: 'postgresql'
  });
  console.log('SQL Export:');
  console.log(sql);

  // Export to a SQL file
  await db.exportToSQL('users', {
    filePath: './exports/users.sql',
    database: 'mysql'
  });
}

exportExample();
```

Export Features:
- CSV export with customizable delimiter
- SQL export supporting SQLite, MySQL, and PostgreSQL
- Optional file output
- Field selection for partial exports
- CREATE TABLE schema generation for SQL
- Handles nested objects by JSON stringifying them
- Proper escaping for special characters
In-Memory Caching Example
```js
const Shridb = require('shridb');

// Create database with caching enabled
const db = new Shridb('myApp', {
  cache: true,       // Enable caching
  cacheTTL: 60000,   // Cache data for 1 minute
  cacheMaxSize: 50   // Cache up to 50 tables
});

async function cachingExample() {
  // First read - loads from disk and caches
  const users1 = await db.getTable('users');

  // Second read - returns cached data (faster)
  const users2 = await db.getTable('users');

  // Check cache status
  const stats = await db.getCacheStats();
  console.log('Cache stats:', stats);

  // Insert data - automatically invalidates the cache
  await db.insert('users', { name: 'John', email: 'john@example.com' });

  // Clear the cache manually if needed
  await db.clearCache('users');

  // Disable caching
  await db.disableCache();
}

cachingExample();
```

Cache Features:
- LRU (Least Recently Used) eviction when cache is full
- Automatic cache invalidation on write operations (insert, update, delete)
- Configurable TTL (time-to-live) for cached data
- Thread-safe operations with existing locking mechanism
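The LRU-with-TTL behavior described above can be modeled with a plain `Map`, whose insertion-order iteration makes "least recently used" easy to track. This is an illustrative sketch of the caching semantics, not shridb's actual cache class:

```javascript
// Illustrative LRU cache with TTL, using Map insertion order for recency
// (a model of the behavior described above, not shridb's code).
class LRUCache {
  constructor({ maxSize = 100, ttl = 60000 } = {}) {
    this.maxSize = maxSize;
    this.ttl = ttl;
    this.map = new Map(); // key -> { value, expires }
  }
  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {       // TTL expired: drop the entry
      this.map.delete(key);
      return undefined;
    }
    this.map.delete(key);                   // re-insert to mark as most
    this.map.set(key, entry);               // recently used
    return entry.value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    if (this.map.size >= this.maxSize) {
      // Evict the least recently used entry (first key in the Map)
      this.map.delete(this.map.keys().next().value);
    }
    this.map.set(key, { value, expires: Date.now() + this.ttl });
  }
  invalidate(key) { this.map.delete(key); } // e.g. on insert/update/delete
}

const cache = new LRUCache({ maxSize: 2, ttl: 60000 });
cache.set('users', [{ id: 1 }]);
cache.set('posts', []);
cache.get('users');        // touch 'users', so 'posts' is now least recent
cache.set('comments', []); // exceeds maxSize: evicts 'posts'
console.log(cache.get('posts'));  // undefined
console.log(cache.get('users'));  // [ { id: 1 } ]
```

The same two hooks shown here — `invalidate` on writes and expiry on reads — are what keep a cache like this consistent with the files on disk.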
Examples
Complete CRUD Example
```js
const Shridb = require('shridb');

const db = new Shridb('myApp');

// Create
async function createExample() {
  await db.insert('users', [
    { id: 1, name: 'Alice', age: 25 },
    { id: 2, name: 'Bob', age: 30 }
  ]);
  console.log('Users created');
}

// Read
async function readExample() {
  const allUsers = await db.findAll('users', 'id');
  const alice = await db.findOne('users', { name: 'Alice' });
  console.log('All:', allUsers, 'Alice:', alice);
}

// Update
async function updateExample() {
  await db.updateOne('users', { name: 'Alice' }, { age: 26 });
  const updated = await db.findOne('users', { name: 'Alice' });
  console.log('Updated:', updated);
}

// Delete
async function deleteExample() {
  await db.removeOne('users', { name: 'Bob' });
  console.log('Bob removed');
}

// Aggregate
async function aggregateExample() {
  const maxAge = await db.getMax('users', 'age');
  const minAge = await db.getMin('users', 'age');
  const totalAge = await db.doSum('users', 'age');
  const names = await db.map('users', 'name');
  console.log('Max:', maxAge, 'Min:', minAge, 'Sum:', totalAge, 'Names:', names);
}

// Run all examples
async function main() {
  await createExample();
  await readExample();
  await updateExample();
  await deleteExample();
  await aggregateExample();
}

main();
```

Concurrency Control Example
```js
const Shridb = require('shridb');

// Create database with custom lock settings
const db = new Shridb('myApp', {
  lockTimeout: 10000,     // Wait up to 10 seconds for a lock
  lockRetryInterval: 100  // Retry every 100ms
});

// Manual locking example
async function manualLockExample() {
  // Lock the table before operating on it
  await db.lock('users');
  try {
    // Perform multiple operations atomically
    await db.insert('users', { name: 'John' });
    await db.updateOne('users', { name: 'John' }, { name: 'John Doe' });
    await db.removeOne('users', { name: 'Jane' });
  } finally {
    // Always unlock after the operations
    await db.unlock('users');
  }
}

// Using withLock for atomic transactions
async function transactionExample() {
  const result = await db.withLock('users', async () => {
    // These operations are protected by the lock
    const current = await db.findAll('users');

    // Insert a new user
    await db.insert('users', {
      name: 'New User',
      id: current.length + 1
    });

    return await db.findAll('users');
  });
  console.log('All users after transaction:', result);
}

// Check if a table is locked
async function checkLockExample() {
  const locked = await db.isLocked('users');
  console.log('Is the users table locked?', locked);
}

async function main() {
  await manualLockExample();
  await transactionExample();
  await checkLockExample();
}

main();
```

Testing
The test suite (test.js) has been refactored to use modern async/await syntax for improved readability and maintainability.
Run the existing tests:
```
npm test
```

Run all tests, including the comprehensive ones:

```
npm run test:all
```

Example: Using async/await in your code
```js
const Shridb = require('shridb');

async function main() {
  const db = new Shridb('myDatabase');

  // Insert data
  await db.insert('users', [
    { id: 1, name: 'John Doe', email: 'john@example.com' },
    { id: 2, name: 'Jane Smith', email: 'jane@example.com' }
  ]);

  // Query data
  const user = await db.findOne('users', { name: 'John Doe' });
  console.log('Found user:', user);

  // Update data
  await db.updateOne('users', { name: 'John Doe' }, { email: 'new@example.com' });

  // Delete data
  await db.removeOne('users', { name: 'Jane Smith' });
}

main().catch(err => console.error('Error:', err));
```

Contributing
Contributions are welcome! Please follow these steps:
- Fork the repository
- Clone your fork: `git clone https://github.com/YOUR_USERNAME/shridb.git`
- Create a branch: `git checkout -b my-feature`
- Make changes and test them
- Commit your changes: `git commit -m 'Add new feature'`
- Push to your fork: `git push origin my-feature`
- Submit a Pull Request
Please ensure all tests pass before submitting a PR.
FAQ
What is shridb?
shridb is a simple, promise-based JSON database for Node.js that treats each JSON file as a table, making it easy to store and query data.
How is data stored?
Data is stored as JSON files in a folder named after your database. Each table is a separate .json file.
Is shridb production-ready?
shridb is suitable for small to medium-sized applications. For large-scale applications with high concurrency, consider using databases like MongoDB, PostgreSQL, or Redis.
Does shridb support transactions?
While shridb doesn't support full database transactions (there is no rollback), you can use the withLock() method to run multiple operations under one exclusive lock. The lock is held until every operation in the callback completes, preventing concurrent modifications in the meantime.
Can I use shridb in the browser?
No, shridb is designed for Node.js environments (Node.js, NW.js, Electron). It requires file system access.
How do I handle concurrency?
shridb now includes built-in file locking to prevent data corruption when multiple processes access the same database. All write operations automatically acquire and release locks. For advanced use cases, you can manually lock tables using db.lock(), db.unlock(), or use db.withLock() to wrap multiple operations in a single transaction.
Support
If you find this project helpful, please consider supporting the developer:
- Donate via PayPal
- ⭐ Star the project on GitHub
- 🐛 Report issues on GitHub
Changelog
v1.8.0
- Export to CSV/SQL - Enable data portability and integration with other systems
- `exportToCSV(tableName, options)` - Export table data to CSV format
- `exportToSQL(tableName, options)` - Export table data to SQL INSERT statements
- Supports SQLite, MySQL, and PostgreSQL formats
- Optional CREATE TABLE schema generation
- Customizable delimiters and field selection
- Optional file output
v1.7.0
- Data Compression - Reduce storage size with gzip compression
- `isCompressionEnabled()` - Check if compression is enabled
- `enableCompression()` - Enable gzip compression
- `disableCompression()` - Disable gzip compression
- `setCompressionLevel(level)` - Set compression level (1-9)
- Compressed files saved with the `.json.gz` extension
- Automatic decompression when reading data
- Configurable compression level (default: 6)
- System files (_schemas.json, _sequences.json) remain uncompressed
v1.6.0
- In-Memory Caching - Improve read performance with LRU cache and TTL support
- `isCacheEnabled()` - Check if cache is enabled
- `enableCache()` - Enable in-memory caching
- `disableCache()` - Disable in-memory caching
- `setCacheTTL(ttl)` - Set cache time-to-live
- `clearCache(tableName)` - Clear cache for a table or all tables
- `getCacheStats()` - Get cache statistics
- LRU eviction when cache is full
- Automatic cache invalidation on write operations
- Configurable cache TTL (default: 1 minute)
- Configurable max cache size (default: 100 tables)
v1.5.0
- Query Builder / Pagination - Combine filtering, sorting, and pagination in one operation
- `query(tableName)` - Create a chainable query builder
- `.where(query)` - Filter results
- `.orderBy(prop, order)` - Sort results (asc/desc)
- `.limit(n)` - Limit number of results
- `.offset(n)` - Skip results
- `.exec()` - Execute query
- `.first()` - Get first result
- `.count()` - Count matching records
- `paginate(tableName, options)` - Simple pagination with metadata
v1.4.0
- File Locking / Concurrency Control - Built-in file locking to prevent data corruption
- `lock(tableName, timeout)` - Manually lock a table for exclusive access
- `unlock(tableName)` - Release a lock
- `isLocked(tableName)` - Check if a table is locked
- `withLock(tableName, fn)` - Execute a function with automatic lock management
- All write operations automatically acquire and release locks
- Configurable lock timeout (default: 5000ms)
- Configurable retry interval (default: 50ms)
v1.3.0
- Schema Validation - Define schemas with validation rules
- `defineSchema(tableName, schema)` - Define a schema for a table
- `getSchema(tableName)` - Get schema definition
- `removeSchema(tableName)` - Remove schema definition
- `hasSchema(tableName)` - Check if a schema exists
- Automatic validation on `insert()`, `updateOne()`, and `updateAll()` operations
- Supported validations: type, required, unique, primary, min, max, minLength, maxLength, pattern, enum, minItems, maxItems
v1.2.0
- Auto-Increment Primary Keys - Configure auto-increment for any field
- `setAutoIncrement(table, field, start)` - Configure auto-increment
- `getAutoIncrement(table, field)` - Get current value
- `resetAutoIncrement(table, field)` - Reset sequence
- `removeAutoIncrement(table, field)` - Remove configuration
- Improved insert() to support auto-increment values
v1.1.0
- Enhanced array manipulation methods
- Improved error handling
- Updated dependencies (lodash, fs-extra)
v1.0.x
- Initial release
- Basic CRUD operations
- Query methods (findOne, findAll, etc.)
- Backup and restore functionality
- Utilities
License
MIT License - see LICENSE file for details.
Copyright (c) 2020 Govind Sham Bhumkar
