@asaidimu/anansi
v4.0.2
A toolkit for advanced data modelling
# Anansi: A Schema-Driven Data Modeling Toolkit
Anansi is a comprehensive TypeScript toolkit for defining, versioning, migrating, and persisting structured data, enabling schema-driven development with powerful runtime validation and adaptable storage layers.
## Table of Contents
- Overview & Features
- Installation & Setup
- Usage Documentation
- Project Architecture
- Development & Contributing
- Additional Information
## Overview & Features
Anansi is designed to streamline data management in TypeScript applications by providing a robust and flexible framework for defining data models, ensuring data integrity, and managing schema evolution over time. It abstracts away the complexities of various persistence layers, allowing developers to focus on the business logic while maintaining strong type safety and consistency.
Whether you're building a new application from scratch or need to bring order to an existing, evolving data landscape, Anansi offers the tools to define clear, verifiable data structures, automate schema updates, and integrate seamlessly with diverse storage solutions. Its modular design promotes extensibility, allowing for easy integration of new features and custom adapters.
### Key Features
- Declarative Schema Definition: Define complex data models using a rich `SchemaDefinition` interface, supporting primitive types, arrays, sets, enums, objects, discriminated unions, and relationships. Includes support for custom constraints, indexes, and descriptive metadata.
- Comprehensive Versioning & Migration: Manage schema evolution gracefully with a built-in migration engine. Define schema changes and data transformation functions to automatically upgrade or roll back data between versions, ensuring data consistency across deployments.
- Pluggable Persistence Layer: Interact with various data storage backends through a unified `Persistence` API. Comes with an in-memory ephemeral store for rapid prototyping, and is easily extendable to support other databases.
- Runtime Data Validation: Automatically generate TypeScript validators from your `SchemaDefinition` to enforce data integrity at runtime. Integrates with standard validation specifications (`@standard-schema/spec`) and can be adapted for popular form libraries like React Hook Form.
- Git-Powered Schema Registry: Store and manage your `SchemaDefinition` files in a version-controlled, collaborative registry. The Git integration (`isomorphic-git` and `LightningFS`) allows for distributed schema management, branching, tagging, and synchronization with remote repositories such as GitHub or Gitea.
- Eventing & Observability: Hook into persistence operations with a powerful event bus. Register triggers and schedule tasks to automate workflows, react to data changes, or perform maintenance operations. Access comprehensive metadata for monitoring your collections.
- Developer Tooling: Boost productivity with utility functions for generating TypeScript types directly from your schemas, creating human-readable Markdown documentation, and applying JSON Patch operations for granular data updates.
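As a rough illustration of what a JSON Patch-style granular update does, here is a minimal, library-independent sketch. The `applyPatch` helper below is hypothetical and far simpler than Anansi's actual JSON Patch tooling:

```typescript
// Hypothetical, minimal RFC 6902-style applier for "replace"/"add" on object paths.
// Anansi's own JSON Patch utilities are richer than this sketch.
type PatchOp = { op: 'replace' | 'add'; path: string; value: unknown };

function applyPatch<T extends Record<string, any>>(doc: T, ops: PatchOp[]): T {
  // Work on a deep copy so the original document is left untouched.
  const result = structuredClone(doc);
  for (const { path, value } of ops) {
    const keys = path.split('/').filter(Boolean); // "/address/city" -> ["address", "city"]
    let target: any = result;
    for (const key of keys.slice(0, -1)) target = target[key];
    target[keys[keys.length - 1]] = value;
  }
  return result;
}

const user = { id: 'u1', address: { city: 'Nairobi', zipCode: '00100' } };
const patched = applyPatch(user, [{ op: 'replace', path: '/address/city', value: 'Mombasa' }]);
console.log(patched.address.city); // 'Mombasa'; the original `user` is unchanged
```

The point of patch-based updates is that only the touched paths are sent to the persistence layer, rather than the whole record.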
## Installation & Setup

### Prerequisites
To use Anansi, ensure you have the following installed:
- Node.js: v18.x or higher
- Bun (recommended for running scripts and faster dependency installation), or npm/yarn
- TypeScript: v5.x or higher
### Installation Steps
Install Anansi in your project using Bun (recommended) or your preferred package manager:
```bash
# Using Bun
bun add @asaidimu/anansi

# Using npm
npm install @asaidimu/anansi

# Using yarn
yarn add @asaidimu/anansi
```

### Configuration
Anansi is designed to be highly configurable through its API. For internal development, modules are aliased with `@core` pointing to `src/`; when importing from the published package, use `@asaidimu/anansi`.
No global configuration files are strictly required, but specific persistence adapters will require their own configuration (e.g., API URLs, authentication tokens).
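As an illustration only, an adapter's configuration might take a shape like the following. The interface and field names here are hypothetical, so consult the specific adapter's documentation for its real options:

```typescript
// Hypothetical adapter configuration shape, for illustration only.
// Real adapters define their own option types.
interface HypotheticalAdapterConfig {
  url: string;        // base API URL of the backend
  token?: string;     // authentication token, if the backend requires one
  timeoutMs?: number; // request timeout in milliseconds
}

const config: HypotheticalAdapterConfig = {
  url: 'https://api.example.com',
  token: 'YOUR_TOKEN_HERE', // load from a secret store or environment in practice
  timeoutMs: 5_000,
};
```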
### Verification
You can quickly verify the installation by trying a simple import in a TypeScript file:
```typescript
import { createEphemeralPersistence } from '@asaidimu/anansi';
import type { SchemaDefinition } from '@asaidimu/anansi';

// Define a simple schema
const userSchema: SchemaDefinition = {
  name: 'User',
  version: '1.0.0',
  fields: {
    id: { name: 'id', type: 'string', required: true },
    name: { name: 'name', type: 'string' }
  },
  nestedSchemas: {} // Always required, even if empty
};

// Create an in-memory persistence instance.
// Requires a functionMap and a predicateMap (both can be empty for basic usage).
const persistence = createEphemeralPersistence({}, {});

async function verify() {
  try {
    const userCollection = await persistence.create<typeof userSchema>({ schema: userSchema });
    console.log(`Collection '${userCollection.schema().name}' created successfully.`);
    const collections = await persistence.collections();
    console.log('Available collections:', collections);
  } catch (error) {
    console.error('Verification failed:', error);
  }
}

verify();
```

## Usage Documentation
### Core Concepts
Anansi revolves around several core concepts:
- `SchemaDefinition`: The blueprint for your data. It describes fields, their types, constraints, indexes, and nested structures.
- `Persistence`: The high-level interface for interacting with data stores (e.g., creating collections, managing schemas globally).
- `PersistenceCollection`: An instance tied to a specific schema/collection, providing CRUD, validation, and migration operations for that data.
- `Migration`: A record of how a schema has evolved, including both structural `changes` and `DataTransform` functions to convert data between versions.
- `SchemaRegistry`: A system for storing, managing, and retrieving `SchemaDefinition` files and their `Migration` history. This can be in-memory or backed by Git.
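To make the `DataTransform` concept concrete, here is a minimal, self-contained sketch. The `SimpleDataTransform` interface is a simplified stand-in for Anansi's `DataTransform` type, keeping only the `forward`/`backward` shape:

```typescript
// Simplified sketch of a bidirectional data transform: `forward` upgrades a
// record to the new shape, `backward` downgrades it again.
interface SimpleDataTransform<Old, New> {
  forward: (data: Old) => New;
  backward: (data: New) => Old;
}

type UserV1 = { fullName: string };
type UserV2 = { firstName: string; lastName: string };

const splitName: SimpleDataTransform<UserV1, UserV2> = {
  forward: ({ fullName }) => {
    const [firstName, ...rest] = fullName.split(' ');
    return { firstName, lastName: rest.join(' ') };
  },
  backward: ({ firstName, lastName }) => ({ fullName: `${firstName} ${lastName}`.trim() }),
};

const v2 = splitName.forward({ fullName: 'Ada Lovelace' });
console.log(v2);                     // { firstName: 'Ada', lastName: 'Lovelace' }
console.log(splitName.backward(v2)); // { fullName: 'Ada Lovelace' }
```

The migration engine applies such transforms per record when moving data between schema versions, as shown in the Schema Evolution section.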
### Defining Schemas

Schemas are defined using the `SchemaDefinition` interface. Here's an example:
```typescript
import type { SchemaDefinition, FieldType } from '@asaidimu/anansi';

const addressSchema: SchemaDefinition['nestedSchemas']['address'] = {
  name: 'Address', // Name used for referencing this nested schema
  description: 'Represents a physical address',
  fields: {
    street: { name: 'street', type: 'string', required: true, description: 'Street address line' },
    city: { name: 'city', type: 'string', required: true },
    zipCode: { name: 'zipCode', type: 'string', required: true, hint: { input: { type: 'text', placeholder: 'e.g., 90210' } } }
  }
};

const userSchema: SchemaDefinition = {
  name: 'User',
  version: '1.0.0',
  description: 'Defines a user profile',
  fields: {
    id: { name: 'id', type: 'string', required: true, unique: true, description: 'Unique user identifier' },
    firstName: { name: 'firstName', type: 'string', required: true },
    lastName: { name: 'lastName', type: 'string', required: true },
    email: {
      name: 'email',
      type: 'string',
      required: true,
      unique: true,
      constraints: [{ name: 'isEmailFormat', predicate: 'regex', parameters: /^[^\s@]+@[^\s@]+\.[^\s@]+$/, errorMessage: 'Must be a valid email format' }],
      hint: { input: { type: 'email' } }
    },
    status: {
      name: 'status',
      type: 'enum',
      values: ['active', 'inactive', 'suspended'],
      default: 'active',
      description: 'Current status of the user',
      hint: { input: { type: 'select', options: [{ value: 'active', label: 'Active' }, { value: 'inactive', label: 'Inactive' }, { value: 'suspended', label: 'Suspended' }] } }
    },
    address: {
      name: 'address',
      type: 'object',
      schema: { id: 'address' }, // Reference to the nested schema
      required: false,
      description: "User's residential address"
    },
    tags: {
      name: 'tags',
      type: 'set', // Ensures unique items
      itemsType: 'string',
      default: [],
      description: 'A set of keywords associated with the user'
    }
  },
  nestedSchemas: {
    address: addressSchema // Define the nested schema here
  },
  indexes: [
    { name: 'emailIndex', fields: ['email'], type: 'unique', description: 'Ensure email uniqueness' },
    { name: 'nameCompositeIndex', fields: ['lastName', 'firstName'], type: 'composite' }
  ],
  constraints: [
    {
      name: 'fullNameLength',
      operator: 'and',
      rules: [
        { name: 'firstNameMinLength', predicate: 'minLength', field: 'firstName', parameters: 2, errorMessage: 'First name too short' },
        { name: 'lastNameMinLength', predicate: 'minLength', field: 'lastName', parameters: 2, errorMessage: 'Last name too short' }
      ]
    }
  ],
  // Example of a mock data generator (uses @faker-js/faker)
  mock: (faker) => ({
    id: faker.string.uuid(),
    firstName: faker.person.firstName(),
    lastName: faker.person.lastName(),
    email: faker.internet.email(),
    status: faker.helpers.arrayElement(['active', 'inactive', 'suspended']),
    address: {
      street: faker.location.streetAddress(),
      city: faker.location.city(),
      zipCode: faker.location.zipCode()
    },
    tags: Array.from({ length: faker.number.int({ min: 1, max: 3 }) }, () => faker.lorem.word())
  })
};
```

### Schema Registry
The `SchemaRegistry` is responsible for storing and managing your `SchemaDefinition` files and their versions.
#### Core SchemaRegistry (In-Memory)
```typescript
import { SchemaRegistry, type SchemaDefinition } from '@asaidimu/anansi';

const registry = new SchemaRegistry('/my-app-schemas'); // Uses LightningFS in the browser, local files in Node.js

async function demoRegistry() {
  await registry.init(); // Initialize the registry structure

  const productSchema: SchemaDefinition = {
    name: 'Product',
    version: '1.0.0',
    fields: {
      id: { name: 'id', type: 'string', required: true },
      name: { name: 'name', type: 'string' },
      price: { name: 'price', type: 'number' }
    },
    nestedSchemas: {}
  };

  await registry.create({ schema: productSchema });
  console.log('Product schema created.');

  const schemas = await registry.list();
  console.log('Available schemas:', schemas);

  const fetchedSchema = await registry.schema({ name: 'Product' });
  console.log('Fetched Product schema:', fetchedSchema?.version);

  // Update the schema
  const updatedProductSchema = { ...productSchema, version: '1.1.0', description: 'Updated product schema' };
  await registry.update({ schema: updatedProductSchema });
  console.log('Product schema updated to version:', (await registry.schema({ name: 'Product' }))?.version);

  const history = await registry.history({ name: 'Product' });
  console.log('Product schema history:', history.map(s => s.version));

  // Sync the registry (updates internal lockfile hashes)
  await registry.sync();
  console.log('Registry synced.');
}

demoRegistry();
```

#### Git-Enabled SchemaRegistry
For persistent storage, collaboration, and advanced versioning, `createGitSchemaRegistry` wraps the core `SchemaRegistry` with `isomorphic-git`. This allows pushing/pulling schemas to a remote Git repository (e.g., GitHub, Gitea).
```typescript
import { createGitSchemaRegistry, SchemaDefinition } from '@asaidimu/anansi';
import { createGithubRepository } from '@asaidimu/anansi/lib/registry/github';

// IMPORTANT: Replace with your actual GitHub credentials and desired repository details
const githubRemote = await createGithubRepository({
  username: 'your-github-username',
  password: 'your-github-personal-access-token', // PAT with repo scope
  repository: 'anansi-schemas-repo', // The repo name to use on GitHub
  create: true // Will create the repo if it doesn't exist
});

async function demoGitRegistry() {
  const gitRegistry = await createGitSchemaRegistry(
    '/git-registry', // Local directory for the Git clone
    {
      remote: githubRemote,
      mainBranch: 'main',
      createRemote: true, // Auto-create the remote repo if it doesn't exist
      author: { name: 'Anansi Bot', email: '[email protected]' },
      proxy: 'https://cors.isomorphic-git.org' // Optional CORS proxy for browser environments
    }
  );

  await gitRegistry.init(); // Initialize the local Git repository

  const orderSchema: SchemaDefinition = {
    name: 'Order',
    version: '1.0.0',
    fields: {
      orderId: { name: 'orderId', type: 'string', required: true },
      amount: { name: 'amount', type: 'number' }
    },
    nestedSchemas: {}
  };

  // Create the schema and push to the remote
  await gitRegistry.create({ schema: orderSchema });
  console.log('Order schema created and pushed to Git registry.');

  // Update the schema and push to the remote
  const updatedOrderSchema = { ...orderSchema, version: '1.1.0', fields: { ...orderSchema.fields, status: { name: 'status', type: 'string' } } };
  await gitRegistry.update({ schema: updatedOrderSchema });
  console.log('Order schema updated and pushed to Git registry.');

  // Pull the latest changes from the remote (e.g., if another user pushed)
  await gitRegistry.sync();
  console.log('Git registry synced with remote.');

  // Delete the schema and clean up remote tags/branches
  await gitRegistry.delete({ name: 'Order' });
  console.log('Order schema deleted from Git registry and remote.');
}

// In a real application, handle authentication tokens securely
// and avoid hardcoding them.
// demoGitRegistry();
```

### Schema Evolution (Migrations)
Anansi provides a robust `MigrationEngine` to manage schema changes and transform data.
```typescript
import { MigrationEngine, DataTransform, SchemaDefinition } from '@asaidimu/anansi';
import { createSchemaMigrationHelper } from '@asaidimu/anansi';

// Assume an initial schema (e.g., loaded from a SchemaRegistry)
let currentSchema: SchemaDefinition = {
  name: 'LegacyUser',
  version: '1.0.0',
  fields: {
    legacyId: { name: 'legacyId', type: 'string', required: true },
    oldName: { name: 'oldName', type: 'string' }
  },
  nestedSchemas: {}
};

const migrationEngine = new MigrationEngine(currentSchema);

// Helper: a ReadableStream can only be consumed once, so create a fresh one per run
function streamOf(items: any[]) {
  return new ReadableStream({
    start(controller) {
      items.forEach(item => controller.enqueue(item));
      controller.close();
    }
  });
}

async function performMigration() {
  // Define a migration helper for version 1.1.0
  const helper = createSchemaMigrationHelper(currentSchema);

  // Schema changes: rename a field, add a new field
  helper.modifyField('oldName', { name: 'fullName' }); // Renames the field's name, not the property key
  helper.addField('email', { name: 'email', type: 'string', required: true });

  // Data transform: map oldName to fullName, derive a default email
  const transform: DataTransform<any, any> = {
    forward: (data) => ({
      id: data.legacyId, // Assuming 'id' is a new concept mapped from legacyId
      fullName: data.oldName,
      email: data.oldName.toLowerCase().replace(/\s/g, '.') + '@example.com'
    }),
    backward: (data) => ({
      legacyId: data.id,
      oldName: data.fullName
      // Email generation cannot be reliably reversed, so it is dropped here
    })
  };

  const { migrate, rollback } = helper.changes(); // Get both forward and backward changes

  // Add the migration to the engine
  await migrationEngine.add({
    description: 'Rename oldName to fullName and add email',
    changes: migrate,
    rollback: rollback,
    transform: transform
  });

  // Example initial data
  const data = [{ legacyId: 'A1', oldName: 'John Doe' }, { legacyId: 'B2', oldName: 'Jane Smith' }];

  console.log('--- Dry Run Migration ---');
  const { newSchema: dryRunSchema, dataPreview: dryRunPreview } = await migrationEngine.dryRun(streamOf(data), 'forward');
  const previewData = await new Response(dryRunPreview).json(); // Consume the stream
  console.log('Simulated New Schema:', dryRunSchema);
  console.log('Simulated Data Preview:', previewData);

  // Now, apply the actual migration (with a fresh stream, since the first was consumed)
  console.log('\n--- Applying Migration ---');
  const transformedStream = await migrationEngine.migrate(streamOf(data));
  const transformedData = await new Response(transformedStream).json(); // Consume the stream
  console.log('Transformed Data:', transformedData);
  currentSchema = migrationEngine.data().schema; // Update the current schema
  console.log('Actual New Schema Version:', currentSchema.version);

  // Now, roll back the migration (using the updated currentSchema)
  console.log('\n--- Rolling Back Migration ---');
  const rollbackEngine = new MigrationEngine(currentSchema, migrationEngine.data().migrations, migrationEngine.data().history);
  const rolledBackStream = await rollbackEngine.rollback(streamOf(transformedData));
  const rolledBackData = await new Response(rolledBackStream).json();
  console.log('Rolled Back Data:', rolledBackData);
  currentSchema = rollbackEngine.data().schema;
  console.log('Rolled Back Schema Version:', currentSchema.version);
}

performMigration();
```

### Runtime Validation
Anansi generates runtime validators based on your schema definitions.
```typescript
import { createStandardSchemaValidator, type SchemaDefinition } from '@asaidimu/anansi';

const productSchema: SchemaDefinition = {
  name: 'Product',
  version: '1.0.0',
  fields: {
    id: { name: 'id', type: 'string', required: true },
    name: { name: 'name', type: 'string', required: true },
    price: { name: 'price', type: 'number', required: true, constraints: [{ name: 'positivePrice', predicate: 'min', parameters: 0 }] },
    category: { name: 'category', type: 'string', required: false, default: 'General' },
    inStock: { name: 'inStock', type: 'boolean', required: true }
  },
  nestedSchemas: {}
};

// Define custom predicates (validation functions)
const customPredicates = {
  min: ({ data, field, arguments: minValue }: { data: any, field: string, arguments: number }) => {
    return data[field] >= minValue;
  }
  // Add other predicates as needed for your constraints
};

const productValidator = createStandardSchemaValidator(productSchema, customPredicates)['~standard'];

// Valid data
const validProduct = {
  id: 'prod123',
  name: 'Laptop Pro',
  price: 1200.50,
  inStock: true
};
const validationResult1 = productValidator.validate(validProduct);
console.log('Valid product validation:', validationResult1); // { value: {...} }

// Invalid data (missing required field)
const invalidProduct1 = {
  id: 'prod124',
  price: 500,
  inStock: false
};
const validationResult2 = productValidator.validate(invalidProduct1);
console.log('Invalid product (missing name) validation:', validationResult2);
// { issues: [{ message: "Field 'name' is required", path: ["name"] }] }

// Invalid data (price constraint violation)
const invalidProduct2 = {
  id: 'prod125',
  name: 'Headphones',
  price: -10, // Invalid price
  inStock: true
};
const validationResult3 = productValidator.validate(invalidProduct2);
console.log('Invalid product (price constraint) validation:', validationResult3);
// { issues: [{ message: "Constraint 'positivePrice' failed for field 'price' with params 0", path: ["price", "constraints[0]"] }] }
```

### Developer Tools
Anansi includes utilities to assist developers in building and documenting their applications.
#### Generating TypeScript Types

`schemaToTypes` generates TypeScript type definitions from your schema, providing strong typing for your application's data models.
```typescript
import { schemaToTypes, type SchemaDefinition } from '@asaidimu/anansi';

const mySchema: SchemaDefinition = {
  name: 'BlogPost',
  version: '1.0.0',
  description: 'A blog post entry',
  fields: {
    id: { name: 'id', type: 'string', required: true },
    title: { name: 'title', type: 'string', required: true, description: 'The title of the blog post' },
    author: { name: 'author', type: 'string', required: false, deprecated: true },
    content: { name: 'content', type: 'string' },
    status: { name: 'status', type: 'enum', values: ['draft', 'published', 'archived'] },
    metadata: { name: 'metadata', type: 'record', required: false, description: 'Arbitrary key-value metadata' },
    tags: { name: 'tags', type: 'array', itemsType: 'string', default: [] },
    comments: { name: 'comments', type: 'object', schema: { id: 'Comment' }, required: false }
  },
  nestedSchemas: {
    Comment: {
      name: 'Comment',
      fields: {
        commentId: { name: 'commentId', type: 'string', required: true },
        text: { name: 'text', type: 'string', required: true },
        authorEmail: { name: 'authorEmail', type: 'string', required: true }
      }
    }
  },
  indexes: [
    { name: 'titleIndex', fields: ['title'], type: 'normal' }
  ]
};

const generatedTypes = schemaToTypes(mySchema, true, true);
console.log(generatedTypes);
/* Expected output (simplified):
export type Comment = {
  commentId: string;
  text: string;
  authorEmail: string;
};
export type BlogPostStatus = "draft" | "published" | "archived";
export type BlogPost<Metadata extends Record<string, any> = Record<string, any>> = {
  id: string;
  title: string;
  ... rest of fields and types ...
  author?: string;
  metadata?: Metadata;
  tags?: string[];
  comments?: string | Comment;
};
export enum BlogPostIndexNames {
  titleIndex = "titleIndex",
}
*/
```

#### Generating Markdown Documentation
`docgen` creates a human-readable Markdown document describing your schema.
```typescript
import { docgen, type SchemaDefinition } from '@asaidimu/anansi';
import { faker } from '@faker-js/faker';

const docSchema: SchemaDefinition = {
  name: 'Customer',
  version: '1.0.0',
  description: 'Detailed profile for a customer.',
  fields: {
    customerId: { name: 'customerId', type: 'string', required: true, unique: true, description: 'Unique identifier for the customer.' },
    fullName: { name: 'fullName', type: 'string', required: true, description: 'Full name of the customer.' },
    tier: { name: 'tier', type: 'enum', values: ['Bronze', 'Silver', 'Gold'], default: 'Bronze', description: 'Customer loyalty tier.' },
    lastPurchaseDate: { name: 'lastPurchaseDate', type: 'string', required: false, description: 'Date of the last purchase (ISO string).' }
  },
  nestedSchemas: {},
  mock: (fakerInstance) => ({
    customerId: fakerInstance.string.uuid(),
    fullName: fakerInstance.person.fullName(),
    tier: fakerInstance.helpers.arrayElement(['Bronze', 'Silver', 'Gold']),
    lastPurchaseDate: fakerInstance.date.past().toISOString()
  })
};

const markdownDoc = docgen(docSchema, { faker });
console.log(markdownDoc);
/* Expected output (partial):
# Customer Schema (Version 1.0.0)
Detailed profile for a customer.
## Metadata
- **Dependencies:** None
- **Created:** 2024-XX-XXTXX:XX:XXZ
## Fields
| Name | Type | Required | Default | Description | Deprecated | Unique | Constraints |
|------|------|----------|---------|-------------|------------|--------|-------------|
| customerId | string | Yes | `None` | Unique identifier for the customer. | No | Yes | 0 |
| fullName | string | Yes | `None` | Full name of the customer. | No | No | 0 |
| tier | enum | No | `"Bronze"` | Customer loyalty tier. | No | No | 0 |
| lastPurchaseDate | string | No | `None` | Date of the last purchase (ISO string). | No | No | 0 |
## Indexes
...
*/
```

## Project Architecture
Anansi's architecture is modular, promoting separation of concerns and extensibility.
### Core Components

- `src/types`: Defines the foundational data structures and interfaces for Anansi. This is the "language" of your data models and of how the various parts of the system interact.
- `src/lib`: Contains the core implementations of Anansi's main functionalities:
  - Persistence: Manages data operations (CRUD, events, tasks) against a specific backend. `EphemeralCollection` provides an in-memory implementation for quick starts.
  - Registry: Provides a mechanism for storing and retrieving schema definitions and their histories. `SchemaRegistry` handles local filesystem storage, while `createGitSchemaRegistry` integrates with Git for distributed version control.
  - Migration: Orchestrates schema evolution, applying changes and transforming data between versions.
- `src/sdk`: Houses pluggable Software Development Kit components, such as specific persistence adapters (e.g., PocketBase) and code generators (e.g., static TypeScript validators).
- `src/tools`: A collection of cross-cutting utility functions that support various aspects of Anansi, including cryptographic hashing, JSON Patching, type generation, and advanced runtime validation.
### Extension Points
Anansi is designed for extensibility:
- Persistence Adapters: Implement the `Persistence` and `PersistenceCollection` interfaces to connect Anansi to any database or storage solution (e.g., SQL, NoSQL, GraphQL endpoints).
- Custom Predicates: Extend the validation system by defining your own `Predicate` functions and including them in the `PredicateMap` when initializing validators.
- Schema Changes & Transforms: The `SchemaMigrationHelper` and `DataTransform` types provide a powerful DSL to define custom schema evolutions and data transformations for complex migrations.
- Remote Repository Adapters: Implement the `RemoteRepository` interface to integrate the `SchemaRegistry` with other Git hosting services beyond GitHub and Gitea.
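For instance, a custom predicate mirrors the shape of the `min` predicate shown in the Runtime Validation section; a hypothetical `maxLength` predicate might look like this:

```typescript
// Hypothetical custom predicate, mirroring the `min` predicate shape from the
// Runtime Validation example: it receives the record, the field name, and the
// constraint's parameters, and returns a boolean.
const maxLength = ({ data, field, arguments: limit }: { data: any; field: string; arguments: number }) => {
  return String(data[field]).length <= limit;
};

console.log(maxLength({ data: { name: 'Anansi' }, field: 'name', arguments: 10 })); // true
console.log(maxLength({ data: { name: 'Anansi' }, field: 'name', arguments: 3 }));  // false
```

Included in the predicate map passed to the validator factory, it can then be referenced by name from field constraints.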
## Development & Contributing
We welcome contributions to Anansi! Here's how you can get started:
### Local Development Setup
- Clone the repository:
  ```bash
  git clone https://github.com/asaidimu/data-model.git anansi
  cd anansi
  ```
- Install dependencies: `bun install` (or `npm install`/`yarn install`)
- Build the project: `bun run build`
### Available Scripts
- `bun run ci`: Installs dependencies (for CI environments).
- `bun run clean`: Removes the `dist/` directory.
- `bun run prebuild`: Cleans the `dist/` directory and runs `.sync-package.ts` (internal synchronization).
- `bun run build`: Compiles TypeScript source files into `dist/` for CommonJS and ES Modules, generates declaration files (`.d.ts`), and minifies output.
- `bun run build:watch`: Runs the build process in watch mode for continuous compilation during development.
- `bun run postbuild`: Copies `README.md`, `LICENSE.md`, and `dist.package.json` into the `dist/` directory.
- `bun run test`: Runs all unit tests using Vitest.
- `bun run test:ci`: Runs Vitest tests once for CI environments.
- `bun run test:debug`: Runs Vitest in debug mode, useful for debugging tests in an IDE.
- `bun run docs:dev`: Starts the VitePress development server for the documentation.
- `bun run docs:build`: Builds the static VitePress documentation site.
- `bun run ui:dev`: Starts a local Vite development server for the UI (likely a demo or internal tooling).
- `bun run docs:preview`: Previews the built VitePress documentation.
### Testing
Anansi uses Vitest for its test suite.
- To run all tests: `bun run test`
- The tests are configured to run in different environments (Node.js, and the browser via `happy-dom`/`playwright`) to ensure compatibility.
- Test coverage can be generated by running `bun test --coverage`.
### Contributing Guidelines
We welcome contributions! Please follow these guidelines:
- Fork the repository and clone your fork.
- Create a new branch for your feature or bug fix: `git checkout -b feature/my-new-feature` or `bugfix/fix-that-bug`.
- Ensure your code adheres to existing coding styles (ESLint and Prettier are used internally).
- Write tests for your changes.
- Ensure all tests pass (`bun run test`).
- Commit your changes with clear, concise messages following Conventional Commits (e.g., `feat: add new feature`, `fix: resolve bug`).
- Push your branch and create a Pull Request to the `main` branch.
### Issue Reporting
- Bug Reports: If you find a bug, please open an issue on the GitHub Issues page. Provide a clear description, steps to reproduce, expected behavior, and your environment details.
- Feature Requests: For new features or enhancements, open an issue with a detailed explanation of the proposed functionality and its use case.
## Additional Information

### Troubleshooting
- "Buffer is not defined": If you encounter this error in a browser environment, ensure that `window.Buffer = Buffer;` is included in your entry point, as `LightningFS` and `isomorphic-git` may rely on it. Anansi's `registry.ts` already includes this for convenience.
- CORS Issues with a Git Remote: When using `createGitSchemaRegistry` in a browser, you might hit CORS restrictions. Use the `proxy` option with a CORS proxy URL (e.g., `https://cors.isomorphic-git.org`) in the factory function.
- Migration Checksum Mismatch: If you modify a migration file after it has been added to the registry, its checksum will no longer match, leading to an error. Always generate new migrations for changes, or ensure your migration logic is stable.
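The `Buffer` fix above can be applied in your own entry point roughly as follows (guarded so the same file also runs under Node.js, where `Buffer` is already global):

```typescript
import { Buffer } from 'buffer';

// Expose Buffer globally for libraries (LightningFS, isomorphic-git) that
// expect it on the global object in browser builds. The guard avoids
// clobbering an existing global under Node.js.
if (!(globalThis as any).Buffer) {
  (globalThis as any).Buffer = Buffer;
}
```

Bundlers typically resolve the `buffer` import to a browser polyfill; check your bundler's documentation for the exact setup.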
### Changelog & Roadmap
Stay up-to-date with the latest changes and future plans:
- Changelog: Refer to the CHANGELOG.md for a detailed history of releases and breaking changes.
- Roadmap: Future development plans are typically tracked via GitHub issues and project boards.
### License
Anansi is open-source software licensed under the MIT License. You can find the full text in the LICENSE.md file.
### Acknowledgments
Anansi builds upon the shoulders of giants. We'd like to acknowledge the following projects and libraries that make Anansi possible:
- isomorphic-git: For bringing Git to JavaScript environments.
- @isomorphic-git/lightning-fs: A super-fast in-memory filesystem.
- PocketBase: For a delightful backend experience.
- @faker-js/faker: For robust mock data generation.
- @standard-schema/spec: For a standardized schema validation interface.
- Bun: For an incredibly fast JavaScript runtime and toolkit.
- Vitest: For a blazing-fast unit test framework.
- VitePress: For beautiful and fast documentation.
