
@mediagoom/opflow

v0.0.31


opflow is an operation flow bus. It allows you to schedule and run software operations in a reliable way.


opflow

opflow is an operation flow framework for Node.js. Its main aim is to let you describe a series of operations as a JSON flow; opflow takes the described flow and runs it in a reliable way.

Flow Control

opflow currently provides the following flow control operators (a sketch of a flow built from them follows the diagram below):

  • START: should always be the first operation in the flow

  • END: should always be the last operation in the flow

  • IF (coming soon): will split the flow into two branches and run only one of them

  • JOIN: allows you to join different branches of your flow back together. JOIN is the only operation that can have more than one parent.

                  START
                  /   \
                 /     \
                /       \
               /         \
              /           \
            DO            DO
          SOMETHING    SOMETHING 
              \         ELSE IN 
               \        PARALLEL
                \         /
                 \       /
                  \     /
                   \   /
                   JOIN
                    |
                    |
                   END
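
To make the diagram concrete, here is a hypothetical sketch of how such a flow might be written. The { type, name, config, children } shape is taken from the unit-testing examples further down; how parallel branches are declared and how both branches point at the same JOIN is an assumption (here, an array of children and JOIN nodes matched by name), not documented opflow API.

// Hypothetical sketch of the diagrammed flow: START forks into two
// parallel operations that meet at a JOIN, which leads to END.
// Assumptions: children may be an array to express parallel branches,
// and the two JOIN children are merged by name.
const parallel_flow = {
    type : 'START', name : 'start'
    , children : [
        {
            type : 'echo', name : 'do something'
            , config : { }
            , children : { type : 'JOIN', name : 'join'
                , children : { type : 'END', name : 'end' } }
        }
        , {
            type : 'echo', name : 'do something else in parallel'
            , config : { }
            , children : { type : 'JOIN', name : 'join' }
        }
    ]
};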

Why opflow

When programming, you often need to keep several operations together to maintain consistency. In a classic monolithic world you would use an ACID transaction to achieve that consistency. Fast forward several years and you land in a microservices world where you need hyper scalability. To reach that level of scalability you can no longer rely on a central transactional database; instead you have to segregate your data and operations and settle for eventual consistency. opflow lets you describe all your operations and run them with retries in case of failure and storage recovery in case of a fault.

The simplest example is a system that manages orders and clients with different microservices. When an order arrives for a new client, the system needs to create both the client and the order. If your code handles this in memory and is interrupted before completing both operations, the system will be inconsistent forever. The failure could simply come from a hardware fault; it does not have to be a software fault.

In the client and order example above, opflow would try both operations several times (you say how many) and track everything in storage. If the system shuts down, opflow will resume retrying when it comes back up.

How to use opflow

Install

npm i @mediagoom/opflow

Write Code


const opflow = require('@mediagoom/opflow');

// start the opflow engine so it can pick up and run flows
opflow.start();

// add_flow returns a promise, so await it from an async context;
// your_flow is the JSON flow definition (see the sketch below)
(async () => {
    const flow_id = await opflow.add_flow(your_flow);
})();
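
`your_flow` is the JSON description of your operations. As a sketch, here is the client-and-order scenario from the section above written as a sequential flow; the operation shape follows the unit-testing examples below, and create_client / create_order are hypothetical custom operation types standing in for your own.

// Hypothetical flow: START -> create_client -> create_order -> END.
// The { type, name, config, children } shape is assumed from the
// unit-testing examples below; 'create_client' and 'create_order'
// are placeholders for your own operation types, not built-ins.
const your_flow = {
    type : 'START', name : 'start'
    , children : {
        type : 'create_client', name : 'create the client'
        , config : { /* client data */ }
        , children : {
            type : 'create_order', name : 'create the order'
            , config : { /* order data */ }
            , children : { type : 'END', name : 'end' }
        }
    }
};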

Unit Testing

When you use opflow, you may want to clearly separate testing of your flow logic from running the real flow. Since opflow is specifically designed to isolate your code from external problems, you should run the full flow in integration testing.

To unit test the logic and design of your flow, you can write the JSON structure of your flow and assign one of code, echo, or NULL as the operation type.

For instance, let's say you have a read_file operation. When running the application for real, you would define the operation as:

...
  children : { 
    type : 'read_file', name : 'read a file'
    , config : {
        path : '<file path to read>'
        }
    , children : ... 
  }

In a unit test you may define your operation as:

...
  children : { 
    type : 'code', name : 'read a file'
    , config : {
        path : '<file path to read>'
        , code : '"<the mock of your file content>"'
      } 
    , children : ... 
  }

With your read_file operation, mocked as above, you can define your unit test as:

const chai   = require('chai');
const flows  = require('<your unit test flows definitions>');

const expect = chai.expect;
const unitTest = require('@mediagoom/opflow/unit-test');

describe('UNIT-TEST TESTING', () => {
    
    const keys = Object.keys(flows);

    for(let idx = 0; idx < keys.length; idx++)
    {
        const key = keys[idx];

        it('should run unit test for flow ' + key , async () => {
            
            const operations = await unitTest(flows[key]);

            const end = operations.find ( (el) => {return el.type === 'END';} );

            expect(end.completed).to.be.true;
        });
    }
});

In case your unit test fails, you can retrieve the offending operations like this:

// assuming the 'debug' package for logging; any logger works here
const dbg = require('debug')('opflow:test');

try{
........
const operations = await unitTest(flows[key]);
........
}catch(err)
{
    if(undefined !== err.runtime)
    {
        // collect the operation that was attempted but never completed
        const failed = err.runtime.find( (el) => { return (!el.completed && (el.history.length > 0));});
        dbg('failed: %O', failed);
    }

    throw err;
}

Integration Testing

For integration testing, just switch the type of the operation above from code back to read_file.

opflow Operations

Builtin Operations

Adding New Operations

How to scale opflow

Projects using opflow