AutoFlow

A TypeScript library for creating and executing time-based or event-based automations with BullMQ workers.

Features

  • 🕒 Time-based Scheduling: Create automations that run on schedules using RRULE strings
  • 🎯 Event-driven Flows: Trigger automations based on webhooks or in-app events
  • 🔄 Composable Steps: Build multi-step flows with a fluent API
  • 📦 Pluggable Storage: Swap storage backends with the repository pattern
  • 🚀 Scalable Architecture: Stateless workers + Redis queues
  • 📊 Observability: Built-in logging, metrics, and OpenTelemetry integration
  • 💾 Multiple Adapters: PostgreSQL, In-Memory, and ORM adapters (TypeORM, Sequelize, Prisma)
  • ✅ RRULE Validation: Comprehensive validation for schedule strings to prevent runtime errors
  • 🛡️ Input Validation: Built-in sanitization and validation for all automation inputs
  • 🚦 Rate Limiting: Configurable rate limits for automation execution and API polling

Installation

npm install @blureffect/auto-flow
# or
yarn add @blureffect/auto-flow

Examples

AutoFlow comes with ready-to-run examples to help you understand how to use the library.

Running Examples

Use the provided npm scripts to run the examples:

# Show available examples and options
npm run examples

# Run the in-memory example
npm run example:in-memory

# Run the TypeORM example
npm run example:typeorm

# Run the in-memory example and start required Docker containers
npm run example:docker

Alternatively, you can run examples directly with ts-node:

# Run with custom options
ts-node examples/run.ts in-memory --docker --debug

Available Examples

  1. In-Memory Example: Demonstrates time-based and event-based automations using the in-memory repository. Requires Redis for queuing.

  2. TypeORM Example: Shows how to use the TypeORM adapter with PostgreSQL for persistent storage. Requires PostgreSQL and Redis.

Example Configuration

The examples use a shared configuration system:

  1. Default values are provided for all settings
  2. Environment variables can override defaults
  3. Use the --debug flag for more verbose logging
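
The sketch below illustrates that pattern. It is only an illustration: the option names are assumptions, though REDIS_HOST and REDIS_PORT mirror the Configuration section later in this README.

// Rough sketch of a defaults-plus-environment-override configuration.
// Only REDIS_HOST/REDIS_PORT appear elsewhere in this README; the rest
// of the names here are illustrative.
const exampleConfig = {
  redis: {
    host: process.env.REDIS_HOST || 'localhost',
    port: parseInt(process.env.REDIS_PORT || '6379', 10)
  },
  debug: process.argv.includes('--debug')
};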

Quick Start

1. Create a Time-based Automation

import { FlowBuilder } from '@blureffect/auto-flow';

const flow = new FlowBuilder(
  { kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY;INTERVAL=2' },
  'user-123'
)
  .name('Email Summary')
  .description('Fetch and summarize emails every 2 hours')
  .tags('email', 'summary')
  .step('fetch-emails', 'email.fetch', { inbox: 'INBOX' })
  .step('summarize', 'ai.summarize', { model: 'gpt-4' })
  .step('send-slack', 'slack.notify', { channel: '#me' });

await flow.save(repository);
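
For the schedule to actually fire, a running ScheduleManager (covered under Core Components below) is presumably required. A minimal sketch, using the redisConfig shown under Configuration:

// Start the scheduler so saved time-based automations are picked up
// (the same two calls appear in the ScheduleManager section below).
const scheduler = new ScheduleManager(repository, redisConfig);
await scheduler.start();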

2. Create an Event-driven Automation

const flow = new FlowBuilder(
  { kind: 'event', eventName: 'new.signup' },
  'user-123'
)
  .name('Welcome Flow')
  .step('send-email', 'email.send', {
    template: 'welcome',
    delay: '1h'
  })
  .step('create-task', 'crm.task', {
    title: 'Follow up with new signup',
    assignTo: 'sales-team'
  });

await flow.save(repository);
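
Once saved, this flow runs when its trigger event is published. Using the EventBus described under Core Components below, that might look like:

// Publish the event that triggers the 'Welcome Flow' above
// (publish() is shown in the EventBus section below).
await eventBus.publish({
  name: 'new.signup',
  payload: { userId: '123' }
});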

Core Components

FlowBuilder

Fluent API for creating automations:

const flow = new FlowBuilder(trigger, userId)
  .name(name)
  .description(desc)
  .tags(...tags)
  .step(name, jobName, input);

await flow.save(repository);

ScheduleManager

Handles time-based automations:

const scheduler = new ScheduleManager(repository, redisConfig);
await scheduler.start();

EventBus

Manages event-driven automations:

const eventBus = new EventBus(repository, queue);

// Publish events
await eventBus.publish({
  name: 'user.signup',
  payload: { userId: '123' }
});

// Subscribe to events
eventBus.subscribe('user.signup').subscribe(event => {
  console.log('New signup:', event);
});

FlowExecutor

Processes automation steps:

const executor = new FlowExecutor(repository, redisConfig);

// Register job handlers
executor.registerJobHandler('email.send', async (step, context) => {
  // Handle email sending
});

executor.registerJobHandler('slack.notify', async (step, context) => {
  // Handle Slack notifications
});
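
The handler bodies above are stubs. As a rough sketch only (the step.input field and the sendEmail helper below are assumptions for illustration, not documented AutoFlow API), a handler might look like this:

// Hypothetical application-level helper; not part of AutoFlow.
async function sendEmail(template: string, payload: unknown): Promise<void> {
  console.log(`Sending "${template}" email`, payload);
}

executor.registerJobHandler('email.send', async (step, context) => {
  // Assumption: the step exposes the input passed to FlowBuilder.step(),
  // e.g. { template: 'welcome', delay: '1h' } from the Quick Start.
  const input = (step as { input?: { template?: string } }).input ?? {};
  await sendEmail(input.template ?? 'welcome', context);
});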

Storage Adapters

AutoFlow provides several storage adapters for different environments and database systems.

PostgreSQL Adapter

Direct PostgreSQL adapter for production use:

import { PostgresAutomationRepository } from '@blureffect/auto-flow';
import { Pool } from 'pg';

const pool = new Pool({
  host: 'localhost',
  port: 5432,
  user: 'postgres',
  password: 'postgres',
  database: 'auto_flow'
});

const repository = new PostgresAutomationRepository(pool);

In-Memory Adapter

Perfect for testing and development:

import { InMemoryAutomationRepository } from '@blureffect/auto-flow';

const repository = new InMemoryAutomationRepository();

// Use in tests
beforeEach(() => {
  repository.clear(); // Reset the repository between tests
});

ORM Adapters

TypeORM

import { 
  TypeOrmAutomationRepository, 
  AutomationEntity, 
  AutomationRunEntity 
} from '@blureffect/auto-flow';
import { DataSource } from 'typeorm';

// Set up TypeORM data source
const dataSource = new DataSource({
  type: 'postgres',
  host: 'localhost',
  port: 5432,
  username: 'postgres',
  password: 'postgres',
  database: 'auto_flow',
  entities: [AutomationEntity, AutomationRunEntity],
  synchronize: true, // For development only
});

await dataSource.initialize();

// Create the repository
const repository = new TypeOrmAutomationRepository(dataSource);

Creating Custom Adapters

Implement the IAutomationRepository interface:

import { IAutomationRepository } from '@blureffect/auto-flow';

class MyCustomRepository implements IAutomationRepository {
  // Implement all required methods
}

Or extend the ORM base class:

import { OrmAutomationRepositoryBase } from '@blureffect/auto-flow';

class MyCustomOrmRepository extends OrmAutomationRepositoryBase {
  // Implement abstract methods
}

Configuration

Redis Connection

const redisConfig = {
  host: process.env.REDIS_HOST || 'localhost',
  port: parseInt(process.env.REDIS_PORT || '6379')
};
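
The same object is what the components shown earlier accept, for example:

// Reuse the connection config for the scheduler and executor
// (both constructor calls appear earlier in this README).
const scheduler = new ScheduleManager(repository, redisConfig);
const executor = new FlowExecutor(repository, redisConfig, {
  concurrency: 5
});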

Logging

import { logger } from '@blureffect/auto-flow';

// Set log level
logger.level = 'debug';

// Enable OpenTelemetry
process.env.ENABLE_TELEMETRY = 'true';

Best Practices

  1. Error Handling: Add retry configurations for critical steps

    .step('api-call', 'http.post', data, {
      retry: {
        attempts: 3,
        backoff: { type: 'exponential', delay: 1000 }
      }
    })
  2. RRULE Validation: Validate schedule strings before saving

    import { InputValidator } from '@blureffect/auto-flow';
       
    // Validate RRULE strings
    const validation = InputValidator.validateRRule('RRULE:FREQ=DAILY');
    if (!validation.valid) {
      console.error(`Invalid RRULE: ${validation.error}`);
    }
       
    // FlowBuilder automatically validates RRULEs
    const flow = new FlowBuilder(
      { kind: 'schedule', rrule: 'RRULE:FREQ=HOURLY' }, // Will be validated
      'user-123'
    );
  3. Monitoring: Use the built-in metrics

    const stats = await repository.getAutomationStats(automationId);
    console.log('Success rate:', stats.successfulRuns / stats.totalRuns);
  4. Scaling: Run multiple workers for high availability

    new FlowExecutor(repository, redisConfig, {
      concurrency: 5
    });
  5. Testing: Use the in-memory adapter for unit tests

    import { InMemoryAutomationRepository } from '@blureffect/auto-flow';
       
    describe('My automation tests', () => {
      const repository = new InMemoryAutomationRepository();
         
      beforeEach(() => {
        repository.clear();
      });
         
      it('should execute my automation', async () => {
        // Test code here
      });
    });

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

ISC

Support