@dataql/node
v0.9.1
DataQL core SDK for unified data management with MongoDB and GraphQL - Production Multi-Cloud Ready
DataQL Node SDK
Installation
npm install @dataql/node
Quick Start
import { Data } from "@dataql/node";
const data = new Data({
appToken: "app_sometoken",
});
// Data operations
const users = data.collection("users", userSchema);
await users.create({ name: "Alice", email: "[email protected]" });
Features
- 🔌 Plugin Ecosystem: Comprehensive plugin system for extending DataQL functionality
- 🔍 Database Introspection: Automatically analyze existing databases and generate DataQL schemas
- ☁️ One-Click Cloud Migration: Migrate entire databases to DataQL cloud infrastructure instantly
- 📱 Offline-First Architecture: Work offline with automatic sync when reconnected
- ⚡ Real-Time Synchronization: Live updates across all connected clients
- 🔄 Zero-API-Change Migration: Drop-in replacement for existing ORMs and databases
- 🚀 Auto-Scaling Infrastructure: Handles traffic spikes automatically
- 🗄️ Multi-Database Support: MongoDB with expanding support for PostgreSQL, MySQL, SQLite
- 🔒 Type-Safe Operations: Full TypeScript support with compile-time validation
- 🎯 Document-Scoped API: Intuitive subdocument operations without MongoDB operators
🔌 Plugin Ecosystem
NEW: DataQL now features a revolutionary plugin system that transforms it into an extensible platform. Build database adapters, middleware, integrations, and extensions to create a powerful ecosystem around your data layer.
Plugin Types
- 🔗 Database Adapters: Support new database types (PostgreSQL, MySQL, SQLite, etc.)
- ⚙️ Middleware: Process requests and responses (auth, rate limiting, logging)
- 🔧 Extensions: Add new methods and functionality to DataQL
- 🪝 Hooks: React to DataQL events and lifecycle
- 🔌 Integrations: Connect with external services (analytics, notifications, etc.)
- 🎨 UI Components: Build dashboards and management interfaces
Quick Start
import { Data, Plugin } from "@dataql/node";
// Create a simple analytics plugin
class AnalyticsPlugin implements Plugin {
id = "analytics";
name = "Analytics Plugin";
version = "1.0.0";
type = "extension" as const;
methods = {
async trackEvent(eventName: string, properties?: any) {
console.log(`Event: ${eventName}`, properties);
// Send to analytics service
},
};
async initialize(context) {
context.logger.info("Analytics plugin initialized");
}
}
// Register and use plugins
const data = new Data({ appToken: "your-token" });
await data.registerPlugin(new AnalyticsPlugin());
await data.initializePlugins();
await data.applyExtensions();
// Use extended functionality (added by plugin)
await (data as any).trackEvent("user_signup", { userId: "123" });
Database Adapter Example
import { DatabaseAdapterPlugin } from "@dataql/node";
class PostgreSQLAdapter implements DatabaseAdapterPlugin {
id = "postgresql-adapter";
name = "PostgreSQL Adapter";
version = "1.0.0";
type = "adapter" as const;
databaseType = "postgresql";
async initialize(context) {
// Setup PostgreSQL connection pool
}
async createConnection(connectionString: string) {
// Create PostgreSQL connection
}
async introspect(connection, options) {
// Analyze PostgreSQL database structure
}
async executeQuery(connection, query) {
// Execute SQL queries
}
}
// Register PostgreSQL support
await data.registerPlugin(new PostgreSQLAdapter(), {
connectionString: "postgresql://user:pass@localhost:5432/db",
});
Middleware Example
import { MiddlewarePlugin } from "@dataql/node";
class AuthMiddleware implements MiddlewarePlugin {
id = "auth-middleware";
name = "Authentication Middleware";
type = "middleware" as const;
order = 1; // Execute first
async processRequest(request) {
const token = request.headers.authorization;
if (!token) throw new Error("Authentication required");
// Verify JWT and attach the user to the request
request.user = await this.verifyToken(token);
return request;
}
private async verifyToken(token: string) {
// Decode and verify the JWT, returning the user payload
}
}
Integration Example
import { IntegrationPlugin } from "@dataql/node";
class SlackIntegration implements IntegrationPlugin {
id = "slack-integration";
name = "Slack Integration";
type = "integration" as const;
service = "slack";
async syncTo(data, options) {
// Send notification to Slack
await this.sendToSlack({
text: `DataQL Alert: ${data.message}`,
channel: options?.channel || "#general",
});
}
private async sendToSlack(message: { text: string; channel: string }) {
// Post the message to Slack (e.g., via an incoming webhook)
}
}
Hook System
React to DataQL events automatically:
import { HookPlugin } from "@dataql/node";
class CacheInvalidationHook implements HookPlugin {
id = "cache-invalidation";
type = "hook" as const;
hooks = {
afterCreate: async (data, context) => {
await this.invalidateCache(data.collection);
},
afterUpdate: async (data, context) => {
await this.invalidateCache(data.collection);
},
};
private async invalidateCache(collection: string) {
// Evict any cached entries for this collection
}
}
Plugin Management
// Get plugin information
const plugins = data.getPlugins();
console.log(
"Registered plugins:",
plugins.map((p) => p.name)
);
// Plugin statistics
const stats = data.getPluginStats();
console.log("Plugin stats:", stats);
// Check for specific plugins
if (data.hasPlugin("postgresql-adapter")) {
console.log("PostgreSQL support available");
}
// Get plugins by type
const adapters = data.getPluginsByType("adapter");
const middleware = data.getPluginsByType("middleware");
Plugin Discovery & Registry
import { PluginRegistry } from "@dataql/node";
const registry = new PluginRegistry();
// Search for plugins
const plugins = await registry.search("postgresql", "adapter");
// Install plugins
await registry.install("@yourorg/dataql-postgres-adapter");
// Publish your plugins
await registry.publish(myPlugin, manifest);
Available Hooks
DataQL provides comprehensive hooks for plugin integration:
- Request/Response: beforeRequest, afterRequest
- CRUD Operations: beforeCreate, afterCreate, beforeRead, afterRead, beforeUpdate, afterUpdate, beforeDelete, afterDelete
- Transactions: beforeTransaction, afterTransaction
- Database: beforeIntrospection, afterIntrospection, beforeMigration, afterMigration
- Lifecycle: onConnect, onDisconnect, onError
- Schema: onSchemaChange, onDataChange
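As an illustrative sketch of how lifecycle hooks like these can be dispatched (not the SDK's internals — `HookBus` and `createWithHooks` are invented names for this example), a minimal dispatcher runs every registered handler for an event, in registration order:

```typescript
// Minimal hook dispatcher sketch (illustrative; not part of @dataql/node).
type HookName = "beforeCreate" | "afterCreate" | "onError";
type HookFn = (payload: unknown) => Promise<void> | void;

class HookBus {
  private handlers = new Map<HookName, HookFn[]>();

  on(name: HookName, fn: HookFn): void {
    const list = this.handlers.get(name) ?? [];
    list.push(fn);
    this.handlers.set(name, list);
  }

  // Run every handler registered for a hook, in registration order.
  async emit(name: HookName, payload: unknown): Promise<void> {
    for (const fn of this.handlers.get(name) ?? []) {
      await fn(payload);
    }
  }
}

// Usage: wrap a create operation with before/after hooks.
const bus = new HookBus();
const calls: string[] = [];
bus.on("beforeCreate", () => { calls.push("beforeCreate"); });
bus.on("afterCreate", () => { calls.push("afterCreate"); });

async function createWithHooks(doc: object): Promise<object> {
  await bus.emit("beforeCreate", doc);
  // ... perform the actual insert here ...
  await bus.emit("afterCreate", doc);
  return doc;
}
```

The same shape generalizes to the other hook names in the list above: plugins register handlers, and the runtime emits the matching event around each operation.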
Plugin Development
Create powerful plugins with full TypeScript support:
import { Plugin, PluginContext } from "@dataql/node";
export class MyPlugin implements Plugin {
id = "my-plugin";
name = "My Plugin";
version = "1.0.0";
description = "Custom plugin description";
author = "Your Name";
type = "extension" as const;
configSchema = {
apiKey: { type: "string", required: true },
endpoint: { type: "string", required: false },
};
async initialize(context: PluginContext) {
const { config, logger, events, utils } = context;
// Setup plugin functionality
logger.info("Plugin initialized");
// Listen to events
events.on("afterCreate", (data) => {
// React to data creation
});
}
async destroy() {
// Cleanup resources
}
}
Plugin Package Structure
{
"name": "@yourorg/dataql-my-plugin",
"version": "1.0.0",
"dataql": {
"plugin": {
"type": "extension",
"entry": "./dist/index.js",
"name": "My Plugin",
"config": {
"apiKey": { "type": "string", "required": true }
}
}
},
"peerDependencies": {
"@dataql/node": "^0.7.0"
}
}
See the complete Plugin Development Guide for detailed documentation and examples.
🔍 Database Introspection
NEW: DataQL can automatically analyze your existing database and generate schemas, allowing you to migrate without changing your data structure. The introspection follows DataQL's core architecture: SDK → Worker → Lambda → Database (no direct database connections from SDK).
import { Data } from "@dataql/node";
const data = new Data({
appToken: "your-app-token",
env: "dev",
});
// Introspect an existing MongoDB database
const result = await data.introspect("mongodb://localhost:27017/ecommerce", {
databaseName: "ecommerce",
sampleSize: 100,
includeIndexes: true,
excludeCollections: ["logs", "temp"],
});
if (result.success) {
console.log(`Found ${Object.keys(result.schemas).length} collections`);
// Schemas are automatically registered - use them immediately!
const users = data.collection("users", result.schemas.users);
const products = data.collection("products", result.schemas.products);
// All DataQL features now available on your existing data
const allUsers = await users.find();
await products.create({ name: "New Product", price: 99.99 });
}
Introspection Benefits
- 🚀 Zero-API-change migration: Keep your existing database, gain DataQL benefits
- 🤖 Smart type inference: Automatically detects field types, formats, and constraints
- 📊 Index preservation: Maintains your existing database indexes and relationships
- 🔄 Multi-database support: MongoDB (available now); PostgreSQL and MySQL coming soon
- ⚡ Instant migration: From analysis to working DataQL code in seconds
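"Smart type inference" from sampled documents can be sketched as follows (illustrative only — `inferType` is not an SDK export; the real analyzer also handles formats, constraints, and nesting):

```typescript
// Sketch: infer a schema type token from a sample of field values.
type InferredType = "Int" | "Float" | "String" | "Boolean" | "Date" | "Unknown";

function inferType(samples: unknown[]): InferredType {
  // Ignore null/undefined — they indicate optionality, not type.
  const defined = samples.filter((v) => v !== null && v !== undefined);
  if (defined.length === 0) return "Unknown";
  if (defined.every((v) => typeof v === "boolean")) return "Boolean";
  if (defined.every((v) => typeof v === "number")) {
    return defined.every((v) => Number.isInteger(v as number)) ? "Int" : "Float";
  }
  if (defined.every((v) => v instanceof Date)) return "Date";
  if (defined.every((v) => typeof v === "string")) return "String";
  return "Unknown"; // Mixed types fall back to an unconstrained field.
}
```

Sampling a bounded number of documents per collection (the `sampleSize` option) keeps this analysis fast even on large databases.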
Advanced Introspection
// Fine-tune introspection for large databases
const result = await data.introspect(databaseUrl, {
sampleSize: 50, // Documents to sample per collection
maxDepth: 3, // Nested object analysis depth
includeIndexes: true, // Analyze database indexes
excludeCollections: ["logs", "analytics"], // Skip large collections
includeCollections: ["users", "orders"], // Only analyze specific collections
});
// Use directly without Data instance
import { IntrospectionService } from "@dataql/node";
const result = await IntrospectionService.introspect(
{
url: "mongodb://localhost:27017/blog",
name: "blog",
},
{
sampleSize: 100,
includeIndexes: true,
}
);
See the full introspection guide for complete documentation and examples.
☁️ One-Click Cloud Migration
REVOLUTIONARY: Migrate your entire database to DataQL's cloud infrastructure in ONE LINE OF CODE! Get instant access to offline-first capabilities, real-time sync, and auto-scaling.
// Migrate your entire database to DataQL cloud in one line!
const result = await data.migrateToCloud("mongodb://localhost:27017/ecommerce");
if (result.success) {
console.log(`✅ Migration completed!`);
console.log(`☁️ Cloud database: ${result.cloudDatabase.name}`);
console.log(`🔗 New connection: ${result.cloudDatabase.connectionString}`);
console.log(
`📊 Migrated: ${result.migration.collections} collections, ${result.migration.documents} documents`
);
// Your data is now available with DataQL superpowers!
// Schemas are auto-registered and ready to use immediately
const users = data.collection("users", result.schemas.users);
const products = data.collection("products", result.schemas.products);
// Everything works exactly the same, but now with:
// ✅ Offline-first capabilities
// ✅ Real-time synchronization
// ✅ Auto-scaling infrastructure
// ✅ Built-in caching and optimization
}
Migration Features
- One-Click Migration: Entire database migration in a single method call
- Zero Downtime: Migrate without interrupting your application
- Data Validation: Automatic integrity checks during migration
- Rollback Support: 24-hour rollback window for peace of mind
- Instant Benefits: Immediate access to DataQL's powerful features
- Smart Optimization: Automatic performance tuning for your data patterns
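Copying documents in fixed-size batches is the pattern behind a `batchSize`-style option; a minimal, self-contained sketch (assumed, not the SDK's implementation — `migrateInBatches` and `writeBatch` are illustrative names):

```typescript
// Sketch: transfer documents in fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

async function migrateInBatches<T>(
  docs: T[],
  batchSize: number,
  writeBatch: (batch: T[]) => Promise<void>
): Promise<number> {
  let written = 0;
  for (const batch of chunk(docs, batchSize)) {
    await writeBatch(batch); // One round-trip per batch bounds memory and request size.
    written += batch.length;
  }
  return written;
}
```

Smaller batches trade throughput for lower memory pressure and smaller retry units, which is why production migrations often tune this value per collection size.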
Advanced Migration Options
const result = await data.migrateToCloud(
"mongodb://localhost:27017/ecommerce",
{
// Introspection options
sampleSize: 500,
includeIndexes: true,
excludeCollections: ["temp_logs", "cache_data"],
// Migration options
batchSize: 2000,
validateData: true,
preserveIds: true,
createBackup: true,
}
);
// Real-world production migration
const prodResult = await data.migrateToCloud(
"mongodb+srv://user:[email protected]/production",
{
excludeCollections: ["sessions", "temp_data", "audit_logs"],
batchSize: 1000,
validateData: true,
preserveIds: true,
}
);
if (prodResult.success) {
console.log("🎉 Production migration successful!");
console.log(`📊 Migrated ${prodResult.migration.collections} collections`);
console.log(`⏱️ Duration: ${prodResult.migration.duration}ms`);
// Update your environment variables:
console.log(`DATABASE_URL=${prodResult.cloudDatabase.connectionString}`);
console.log(`MIGRATION_ID=${prodResult.migrationId}`);
}
Why Migrate to DataQL Cloud?
- 🚀 Instant Performance: Auto-scaling infrastructure handles any load
- 📱 Offline-First: Apps work perfectly without internet connection
- 🔄 Real-Time Sync: Live updates across all connected clients
- 🔒 Built-in Security: Enterprise-grade security and compliance
- 🌍 Global CDN: Fast access from anywhere in the world
- 🧠 Smart Caching: Intelligent caching reduces database load
- 🔧 Zero Maintenance: No server management or scaling concerns
Example: E-commerce Migration
// Real-world e-commerce platform migration
const result = await data.migrateToCloud(
"mongodb+srv://admin:[email protected]/ecommerce_prod",
{
excludeCollections: ["user_sessions", "audit_logs", "temp_cart_items"],
batchSize: 2000,
includeIndexes: true,
validateData: true,
preserveIds: true,
}
);
if (result.success) {
// Your e-commerce platform now has:
// ✅ Offline-first shopping experience
// ✅ Real-time inventory synchronization
// ✅ Auto-scaling for Black Friday traffic
// ✅ Global CDN for worldwide customers
// ✅ Built-in conflict resolution for concurrent orders
// Immediate usage of migrated data
const products = data.collection("products", result.schemas.products);
const orders = data.collection("orders", result.schemas.orders);
// Create a new product to test the migrated system
await products.create({
name: "DataQL Migration Success T-Shirt",
price: 29.99,
category: "Apparel",
description: "Celebrate your successful database migration!",
});
}
For comprehensive migration examples, see examples/cloud-migration.ts.
Example: Document-Scoped Subdocument Operations
const users = data.collection("users", userSchema);
// Create a user
const newUser = await users.create({
name: "Alice",
email: "[email protected]",
});
// Document-scoped subdocument operations
const user = users({ id: newUser.insertedId });
// Work with subdocuments
await user.addresses.create({
street: "123 Main St",
city: "New York",
zipCode: "10001",
});
await user.addresses.update({ city: "New York" }, { zipCode: "10002" });
const nyAddresses = await user.addresses.find({ city: "New York" });
await user.preferences.update({
theme: "dark",
notifications: true,
});
For more examples, see examples/document-scoped-usage.ts.
Unique Document Creation
DataQL provides a createUnique() method for both collections and subcollections that prevents duplicate documents by comparing comparable fields (excluding ID and subdocument fields).
Collection-level createUnique
const users = data.collection("users", userSchema);
// Create a unique user (will create new document)
const result1 = await users.createUnique({
name: "John Doe",
email: "[email protected]",
age: 30,
preferences: {
theme: "dark",
notifications: true,
},
});
console.log(result1.isExisting); // false
console.log(result1.insertedId); // "uuid-123"
// Try to create the same user again (will return existing)
// Comparison excludes 'id', 'createdAt', 'updatedAt', and subdocuments like 'preferences'
const result2 = await users.createUnique({
name: "John Doe", // Same
email: "[email protected]", // Same
age: 30, // Same
preferences: {
theme: "light", // Different but ignored (subdocument)
notifications: false,
},
});
console.log(result2.isExisting); // true
console.log(result2.insertedId); // "uuid-123" (same as result1)
Subcollection createUnique
const user = users({ id: userId });
// Create unique address
const address1 = await user.addresses.createUnique({
street: "123 Main St",
city: "New York",
zipCode: "10001",
country: "USA",
});
console.log(address1.isExisting); // false
// Try to create the same address (will return existing)
const address2 = await user.addresses.createUnique({
street: "123 Main St",
city: "New York",
zipCode: "10001",
country: "USA",
});
console.log(address2.isExisting); // true
Excluded from Comparison
The createUnique method excludes these fields from uniqueness comparison:
- ID fields: id, _id, and any field containing 'id'
- Auto-generated timestamps: createdAt, updatedAt
- Subdocument objects: nested objects like preferences, profile
- Subdocument arrays: arrays of objects like addresses, reviews
Compared Fields
Only these fields are compared for uniqueness:
- Primitive fields: strings, numbers, booleans
- Enum fields: category selections
- Simple arrays: arrays of primitive values (if any)
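The comparison rules above can be sketched as a pair of helpers (illustrative only — `comparableFields` and `isSameDocument` are not SDK functions):

```typescript
// Sketch of the stated createUnique comparison rules.
function comparableFields(doc: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(doc)) {
    if (key.toLowerCase().includes("id")) continue;            // id, _id, userId, ...
    if (key === "createdAt" || key === "updatedAt") continue;  // auto timestamps
    if (value !== null && typeof value === "object" && !Array.isArray(value)) continue; // subdocument objects
    if (Array.isArray(value) && value.some((v) => typeof v === "object")) continue;     // subdocument arrays
    out[key] = value; // primitives, enums, and simple arrays survive
  }
  return out;
}

function isSameDocument(a: Record<string, unknown>, b: Record<string, unknown>): boolean {
  return JSON.stringify(comparableFields(a)) === JSON.stringify(comparableFields(b));
}
```

Under these rules, the two "John Doe" documents from the example above compare equal even though their preferences subdocuments differ.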
Return Value
type CreateUniqueResult = {
insertedId?: string; // ID of document (new or existing)
result?: any; // Full document data
isExisting?: boolean; // true if existed, false if newly created
};
For complete examples, see examples/unique-document-creation.ts.
Collection References
DataQL supports defining references between collections using the Ref() function, enabling relational data modeling with type safety.
Quick Summary
To define references between collections in DataQL:
- Import Ref: import { Ref } from "@dataql/node"
- Define reference fields: ownerId: Ref(userSchema) or assigneeId: Ref("User")
- Arrays of references: memberIds: [Ref(userSchema)]
- Use in operations: store actual IDs as values: { ownerId: "user-123" }
Defining References
import { Data, ID, String, Boolean, Ref } from "@dataql/node";
// Define schemas
const userSchema = {
id: ID,
name: String,
email: String,
} as const;
const projectSchema = {
id: ID,
name: String,
description: String,
ownerId: Ref(userSchema), // Reference to user schema
teamMemberIds: [Ref(userSchema)], // Array of user references
} as const;
const taskSchema = {
id: ID,
title: String,
completed: Boolean,
userId: Ref(userSchema), // Reference to user
projectId: Ref(projectSchema), // Reference to project
assigneeId: Ref("User"), // Reference by string name
} as const;
Reference Syntax Options
// 1. Schema Object Reference (Recommended)
ownerId: Ref(userSchema), // Direct schema reference
// 2. String Reference
assigneeId: Ref("User"), // Reference by collection name
// 3. Array of References
memberIds: [Ref(userSchema)], // Array of user references
tagIds: [Ref("Tag")], // Array of string references
// 4. Nested Reference Objects
participants: [
{
userId: Ref(userSchema),
role: ["admin", "member", "viewer"],
joinedAt: "Timestamp",
}
],
Working with Referenced Data
const data = new Data({ appToken: "your-token" });
const users = data.collection("users", userSchema);
const projects = data.collection("projects", projectSchema);
const tasks = data.collection("tasks", taskSchema);
// Create referenced documents
const user = await users.create({
name: "Alice Johnson",
email: "[email protected]",
});
const project = await projects.create({
name: "Website Redesign",
description: "Complete redesign of company website",
ownerId: user.insertedId, // Reference the user's ID
teamMemberIds: [user.insertedId], // Array of user IDs
});
const task = await tasks.create({
title: "Design homepage mockup",
completed: false,
userId: user.insertedId, // Reference to user
projectId: project.insertedId, // Reference to project
});
Cross-Collection Queries
// Find tasks for a specific user
const userTasks = await tasks.find({ userId: user.insertedId });
// Find projects owned by a user
const userProjects = await projects.find({ ownerId: user.insertedId });
// Find all team members for a project
const project = await projects.find({ id: projectId });
const teamMembers = await users.find({
id: { $in: project[0].teamMemberIds },
});
Reference Validation
DataQL automatically validates that referenced IDs exist when creating or updating documents:
// This will succeed if the user exists
await tasks.create({
title: "New task",
userId: "existing-user-id",
projectId: "existing-project-id",
});
// This will fail with a validation error
await tasks.create({
title: "Invalid task",
userId: "non-existent-user-id", // Validation error
});
Document-Scoped Operations with References
// Work with referenced data using document-scoped API
const project = projects({ id: projectId });
// Add team members to project
await project.update({
teamMemberIds: [...existingMemberIds, newUserId],
});
// Find tasks for this project
const projectTasks = await tasks.find({ projectId: projectId });
// Update task assignments
const task = tasks({ id: taskId });
await task.update({ userId: newAssigneeId });
Advanced Reference Patterns
// Self-referencing (e.g., user management hierarchy)
const userSchema = {
id: ID,
name: String,
managerId: Ref("User"), // Self-reference
directReports: [Ref("User")], // Array of self-references
} as const;
// Polymorphic references (reference to multiple collection types)
const commentSchema = {
id: ID,
content: String,
authorId: Ref(userSchema),
// Reference different entity types
entityType: ["task", "project", "user"],
entityId: String, // ID of the referenced entity
} as const;
// Many-to-many relationships through junction collections
const userProjectSchema = {
id: ID,
userId: Ref(userSchema),
projectId: Ref(projectSchema),
role: ["owner", "member", "viewer"],
joinedAt: "Timestamp",
} as const;
Reference Benefits
- Type Safety: Full TypeScript inference for referenced fields
- Validation: Automatic validation of reference integrity
- Performance: Optimized queries for related data
- Consistency: Maintains data relationships across collections
- Flexibility: Support for various reference patterns and relationships
For complete examples, see examples/simple-references.ts and examples/collection-references.ts.
Schema Composition Benefits
Schema composition provides several advantages:
- Reusability: Define common structures once, use everywhere
- Maintainability: Update schema in one place, changes propagate
- Type Safety: Full TypeScript inference across composed schemas
- Consistency: Ensure uniform data structures across collections
- Modularity: Build complex schemas from simple building blocks
Using Composed Schemas
// Create collections using composed schemas
const users = data.collection("users", userSchema);
const accounts = data.collection("accounts", accountSchema);
const organizations = data.collection("organizations", organizationSchema);
// Create documents with composed subdocuments
await accounts.create({
accountName: "Enterprise Account",
users: [
{ name: "John Doe", email: "[email protected]" },
{ name: "Jane Smith", email: "[email protected]" },
],
billingAddress: {
street: "123 Business Ave",
city: "Enterprise City",
zipCode: "12345",
country: "USA",
},
shippingAddresses: [
{
street: "456 Shipping St",
city: "Warehouse City",
zipCode: "67890",
country: "USA",
},
],
});
Working with Subdocuments
Creating Documents with Subdocuments
const users = data.collection("users", userSchema);
const newUser = await users.create({
name: "John Doe",
email: "[email protected]",
preferences: {
theme: "dark",
notifications: true,
language: "en",
},
addresses: [
{
street: "123 Main St",
city: "New York",
zipCode: "10001",
country: "USA",
},
{
street: "456 Oak Ave",
city: "San Francisco",
zipCode: "94102",
country: "USA",
},
],
profile: {
bio: "Software developer",
social: {
twitter: "@johndoe",
github: "johndoe",
},
work: {
company: "Tech Corp",
position: "Senior Developer",
address: {
street: "789 Tech Blvd",
city: "Silicon Valley",
country: "USA",
},
},
},
});
Working with Composed Subdocuments
const accounts = data.collection("accounts", accountSchema);
// Document-scoped operations on composed subdocuments
const account = accounts({ id: accountId });
// Add new user to account
await account.users.create({
name: "New User",
email: "[email protected]",
});
// Update billing address (composed addressSchema)
await account.billingAddress.update({
street: "Updated Street",
city: "New City",
});
// Add new shipping address
await account.shippingAddresses.create({
street: "Additional Shipping St",
city: "Shipping City",
zipCode: "99999",
country: "USA",
});
Querying Subdocuments
// Find users by nested field
const darkThemeUsers = await users.find({
"preferences.theme": "dark",
});
// Find users by deeply nested field
const techUsers = await users.find({
"profile.work.company": "Tech Corp",
});
// Find users by array subdocument field
const nyUsers = await users.find({
"addresses.city": "New York",
});
Updating Subdocuments
const users = data.collection("users", userSchema);
const user = users({ id: userId });
// Update nested object fields directly
await user.preferences.update({
theme: "light",
});
await user.profile.update({
bio: "Senior Software Developer",
});
// Update entire nested object
await user.preferences.update({
theme: "light",
notifications: false,
language: "es",
});
// Add to array of subdocuments (document-scoped API)
await user.addresses.create({
street: "789 New St",
city: "Boston",
zipCode: "02101",
country: "USA",
});
// Update specific subdocuments in array
await user.addresses.update({ city: "Boston" }, { zipCode: "02102" });
Finding and Updating Subdocuments by Field
DataQL provides powerful document-scoped operations for working with subdocuments in an intuitive way.
Document-Scoped Subdocument API
const users = data.collection("users", userSchema);
// Document-scoped subdocument operations
const user = users({ id: userId });
// Create new subdocuments
await user.addresses.create({
street: "123 Boston St",
city: "Boston",
zipCode: "02101",
country: "USA",
});
// Find specific addresses within a user's document
const bostonAddresses = await user.addresses.find({ city: "Boston" });
// Update specific subdocuments by field
await user.addresses.update(
{ city: "Boston" }, // Find criteria
{ zipCode: "02102" } // Update data
);
// Delete specific subdocuments
await user.addresses.delete({ city: "Boston" });
// Batch operations on subdocuments
await user.addresses.updateMany({ country: "USA" }, { isActive: true });
await user.addresses.deleteMany({ isActive: false });
// Nested subdocument updates
await user.profile.update({ bio: "Updated bio text" });
// Variable-based approach for multiple operations
const userAddresses = user.addresses;
await userAddresses.create({ street: "456 Main St", city: "NYC" });
await userAddresses.update({ city: "NYC" }, { zipCode: "10001" });
const nycAddresses = await userAddresses.find({ city: "NYC" });
Benefits of Document-Scoped API
- More Intuitive: Natural object-oriented operations instead of MongoDB operators
- Type Safe: Full TypeScript support for subdocument fields
- Document Context: Clear parent-child relationship with automatic scoping
- Cleaner Syntax: No need for $push, $pull, or positional $ operators
- Variable References: Store document/subdocument instances for cleaner multi-operation code
Schema Definition
Basic Schema
const userSchema = {
id: "ID",
name: "String",
email: "String",
age: "Int",
isActive: "Boolean",
createdAt: "Date",
} as const;
Auto-Generated Fields
DataQL automatically adds essential metadata fields to all schemas if they're not explicitly defined:
// You define this minimal schema
const userSchema = {
name: "String",
email: "String",
} as const;
// DataQL automatically expands it to include:
// {
// id: "ID", // Auto-added if not present
// name: "String",
// email: "String",
// createdAt: "Timestamp", // Auto-added if not present
// updatedAt: "Timestamp", // Auto-added if not present
// }
Auto-generated fields:
- id: Unique identifier (UUID v4 string) - added if not specified
- createdAt: Creation timestamp - added if not specified
- updatedAt: Last modification timestamp - added if not specified
This ensures all documents have consistent metadata without requiring boilerplate in every schema definition.
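The expansion described above can be sketched as a plain function (assumed behavior for illustration — `withAutoFields` is not an SDK export):

```typescript
// Sketch: add id/createdAt/updatedAt to a schema unless already defined.
type Schema = Record<string, unknown>;

function withAutoFields(schema: Schema): Schema {
  return {
    ...(schema.id === undefined ? { id: "ID" } : {}),          // auto id first
    ...schema,                                                  // user-defined fields win
    ...(schema.createdAt === undefined ? { createdAt: "Timestamp" } : {}),
    ...(schema.updatedAt === undefined ? { updatedAt: "Timestamp" } : {}),
  };
}
```

Because explicitly defined fields are spread over the defaults, a schema that declares its own createdAt type keeps it, while the missing updatedAt is still filled in.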
When to explicitly define these fields:
- Custom ID format: If you need a specific ID format (e.g., orderId: "String" for custom order numbers)
- Different timestamp types: If you prefer Date over Timestamp for certain use cases
- Field constraints: If you need validation rules or default values for these fields
// Explicit definition example
const customSchema = {
orderId: "String", // Custom ID instead of auto-generated UUID
name: "String",
email: "String",
createdAt: "Date", // Use Date instead of Timestamp
// updatedAt will still be auto-added as Timestamp
} as const;
Enum Field Definitions
DataQL supports multiple syntaxes for defining enum fields:
const userSchema = {
name: "String",
// Simple array syntax (recommended for clean schemas)
role: ["admin", "user", "guest"],
status: ["active", "inactive", "pending"],
// Explicit enum object syntax (for additional options)
priority: { enum: ["low", "medium", "high"] },
theme: { enum: ["light", "dark"], required: true },
// Mixed usage is perfectly valid
department: ["engineering", "sales", "marketing"],
level: { enum: ["junior", "mid", "senior"], required: true },
} as const;
Benefits of simplified enum syntax:
- Cleaner code: ["one", "two", "three"] vs { enum: ["one", "two", "three"] }
- Less verbose: Reduces boilerplate for simple enums
- TypeScript friendly: Full type inference and autocompletion
- Backward compatible: Existing { enum: [...] } syntax still works
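Since both syntaxes are accepted, the shorthand presumably normalizes to the explicit form internally; an illustrative sketch (assumed, not the SDK's code):

```typescript
// Sketch: normalize the array shorthand into the explicit enum object form.
type EnumField = { enum: string[]; required?: boolean };

function normalizeEnum(field: string[] | EnumField): EnumField {
  // ["a", "b"] becomes { enum: ["a", "b"] }; explicit objects pass through.
  return Array.isArray(field) ? { enum: field } : field;
}
```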
Subdocuments (Nested Objects)
DataQL supports multiple ways to define subdocuments and nested data structures:
1. Simple Nested Objects
const userSchema = {
id: "ID",
name: "String",
email: "String",
preferences: {
theme: "String",
notifications: "Boolean",
language: "String",
},
} as const;
2. Arrays of Subdocuments
const userSchema = {
id: "ID",
name: "String",
addresses: [
{
street: "String",
city: "String",
zipCode: "String",
country: "String",
},
],
} as const;
3. Deep Nested Structures
const userSchema = {
id: "ID",
name: "String",
profile: {
bio: "String",
social: {
twitter: "String",
github: "String",
linkedin: "String",
},
work: {
company: "String",
position: "String",
address: {
street: "String",
city: "String",
country: "String",
},
},
},
} as const;
4. Schema Composition and Reusability
// Define reusable schemas (IDs auto-added by DataQL)
const addressSchema = {
street: "String",
city: "String",
zipCode: "String",
country: "String",
} as const;
const userSchema = {
name: "String",
email: "String",
} as const;
// Compose schemas using references (IDs auto-added)
const accountSchema = {
accountName: "String",
users: [userSchema], // Array of user subdocuments
billingAddress: addressSchema, // Single address subdocument
shippingAddresses: [addressSchema], // Array of address subdocuments
} as const;
const organizationSchema = {
name: "String",
accounts: [accountSchema], // Nested composition
headquarters: addressSchema,
employees: [userSchema],
} as const;
5. Simplified Enum Syntax
// DataQL supports multiple enum syntaxes
const productSchema = {
name: "String",
category: ["electronics", "clothing", "books"], // Array shorthand
status: { enum: ["active", "inactive"] }, // Explicit enum object
priority: { enum: ["low", "medium", "high"], required: true }, // With options
} as const;
6. Mixed Complex Schema
const orderSchema = {
customerId: "String",
status: ["pending", "completed", "cancelled"], // Simplified enum syntax
items: [
{
productId: "String",
name: "String",
quantity: "Int",
price: "Decimal",
metadata: {
category: "String",
tags: ["String"],
},
},
],
shipping: {
address: {
street: "String",
city: "String",
zipCode: "String",
},
method: "String",
trackingNumber: "String",
},
payment: {
method: "String",
amount: "Decimal",
currency: "String",
processor: {
name: "String",
transactionId: "String",
metadata: "Object", // For flexible JSON data
},
},
createdAt: "Date",
updatedAt: "Date",
} as const;
Advanced Usage
- You can use Data and BaseDataQLClient classes directly for advanced scenarios.
- Supports transactions, bulk operations, and complex queries.
- Compatible with existing database adapters (Drizzle, Prisma, Mongoose, etc.).
Dual ESM + CommonJS Support
DataQL Node SDK supports both ESM and CommonJS out of the box (Node.js 16+):
- ESM Usage: import { Data } from "@dataql/node"; const data = new Data({ appToken: "your-token" });
- CommonJS Usage: const { Data } = require("@dataql/node"); const data = new Data({ appToken: "your-token" });
The SDK uses the exports field in package.json to provide the correct entrypoint for both module types. No extra configuration is needed.
- ESM entry: ./dist/esm/index.js
- CJS entry: ./dist/cjs/index.js
Note: Do not add type: "module" to your own package.json unless you want your project to be ESM-only.
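An exports field supporting both entry points typically looks like this (illustrative; the package's actual package.json may differ in detail):

```json
{
  "name": "@dataql/node",
  "exports": {
    ".": {
      "import": "./dist/esm/index.js",
      "require": "./dist/cjs/index.js"
    }
  }
}
```

Node resolves the "import" condition for ESM consumers and "require" for CommonJS consumers, which is why no extra configuration is needed on your side.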
Note: The SDK no longer accepts a workerUrl parameter. All requests are routed through the built-in DataQL worker URL (https://edge.dataql.com). If you need to override routing for advanced use cases, use the customConnection option.
