Flowcore SDK Module - Data Pump Client

A Flowcore SDK module that provides a lightweight client for fetching events from the Flowcore platform.

Installation

Install with npm:

npm install @flowcore/sdk-data-pump-client @flowcore/sdk-oidc-client

or yarn:

yarn add @flowcore/sdk-data-pump-client @flowcore/sdk-oidc-client

Usage

Create a new instance of the Data Pump client:

import {DataPump} from "@flowcore/sdk-data-pump-client";
import {OidcClient} from "@flowcore/sdk-oidc-client";

const client = new OidcClient("your client id", "your client secret", "well known endpoint");

const dataPump = new DataPump("https://graph.api.flowcore.io/graphql", client);

You can configure the page size via the last argument of the constructor; the default is 1000.
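
For example, a page size of 500 might be set like this (a rough sketch; whether the last argument is the page size itself or an options object containing it depends on the actual constructor signature):

const dataPump = new DataPump("https://graph.api.flowcore.io/graphql", client, 500); // pages of 500 events instead of the default 1000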

Then create an RxJS Subject to listen to the events:

import {Subject} from "rxjs";
import {SourceEvent} from "@flowcore/sdk-data-pump-client";

const subject = new Subject<SourceEvent>();

subject.subscribe({
  next: (event) => {
    console.log(event);
  },
  complete: () => {
    console.log("completed");
  },
});

Then you can fetch all events with the fetchAllEvents method:

await dataPump.fetchAllEvents(subject, {
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventTypes: ["your event type"],
});

This will loop through all the events for the specified event types and push them to the observable.

You can specify how many time buckets should be run in parallel with the parallel argument; the default is 1.
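
For example (a sketch, assuming parallel is a field of the same options object rather than a separate argument):

await dataPump.fetchAllEvents(subject, {
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventTypes: ["your event type"],
  parallel: 4, // process four time buckets concurrently instead of the default 1
});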

To fetch events for a specific time bucket you can use the fetchEvents method:

await dataPump.fetchEvents(subject, {
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventTypes: ["your event type"],
  timeBucket: "your time bucket",
});

This will fetch all events for the specified time bucket and push them to the observable.

To close the observable, set the last argument of the fetchEvents method to true.
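
For example, passing true as the last argument completes the subject once the time bucket has been fetched (a minimal sketch):

await dataPump.fetchEvents(
  subject,
  {
    dataCoreId: "your data core id",
    aggregator: "your aggregator",
    eventTypes: ["your event type"],
    timeBucket: "your time bucket",
  },
  true // close the observable when done
);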

Limits

You can specify the from and to event id to fetch events between a specific range:

await dataPump.fetchEvents(subject, {
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventTypes: ["your event type"],
  timeBucket: "your time bucket",
  afterEventId: "your from event id",
  beforeEventId: "your to event id",
});

These are both exclusive, meaning that events with the specified ids will not be included in the result. Either or both can be omitted.

Indexes

You can fetch time buckets for a specific event type with the fetchIndexes method:

await dataPump.fetchIndexes({
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventType: "your event type",
});

This will return a list of time buckets for the specified event type.

You can also specify the from and to time buckets with the from and to arguments, and it will return the time buckets within the specified range.
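
For example (a sketch, assuming from and to are fields of the same options object):

const timeBuckets = await dataPump.fetchIndexes({
  dataCoreId: "your data core id",
  aggregator: "your aggregator",
  eventType: "your event type",
  from: "your from time bucket", // optional lower bound
  to: "your to time bucket",     // optional upper bound
});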

Pumping Events

You can also pump events to a destination with the pumpEvents method:

const abortController = new AbortController();

await dataPump.pumpEvents(
  "cache-key",
  subject, // the Subject<SourceEvent> created earlier
  {
    dataCoreId: "your data core id",
    aggregator: "your aggregator",
    eventTypes: ["your event type"],
  },
  abortController
);

This will pump all events using backfilling, then switch to live mode and pump all new events. The cache-key is used to store the last event id in the cache, so that the client can resume from the last event id if it is restarted. The abort controller can be used to stop the pumping.

Note: The default cache is in memory. You can implement your own cache by extending the SimpleCache class and passing it to the DataPump constructor via the options object.
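
A rough sketch of a custom cache; the export location of SimpleCache, the get/set method names, and the cache option field are all assumptions, so check the actual class for the real interface:

import {DataPump, SimpleCache} from "@flowcore/sdk-data-pump-client"; // assumed export location

// Hypothetical cache backed by a plain Map; method names and signatures are assumptions.
class MapCache extends SimpleCache {
  private store = new Map<string, string>();

  get(key: string): string | undefined {
    return this.store.get(key);
  }

  set(key: string, value: string): void {
    this.store.set(key, value);
  }
}

const dataPump = new DataPump("https://graph.api.flowcore.io/graphql", client, {
  cache: new MapCache(), // assumed option field name
});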

Note: You can also specify a starting time bucket with the from argument and control the backfilling parallelism with the parallel argument.
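
A sketch, assuming from and parallel sit alongside the other fields of the pumpEvents options object:

await dataPump.pumpEvents(
  "cache-key",
  subject,
  {
    dataCoreId: "your data core id",
    aggregator: "your aggregator",
    eventTypes: ["your event type"],
    from: "your from time bucket", // assumed to be an options field: start backfilling here
    parallel: 4,                   // assumed to be an options field: backfill four time buckets concurrently
  },
  abortController
);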

Note: When not passing the abort controller, the data pump will run once until it has fetched all events currently present in the data container.

Resetting the data pump

You can reset the data pump with the reset method:

await dataPump.reset("cache-key");

Note: It is only possible to reset the data pump if it is not currently running. You can check whether the data pump is running with the isRunning method. To stop the data pump, use the abort controller.
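
A small sketch of a guarded reset; whether isRunning takes the cache key as an argument is an assumption:

if (dataPump.isRunning("cache-key")) { // assumed to be keyed by the same cache key
  abortController.abort(); // stop the running pump first; reset only works once it has stopped
} else {
  await dataPump.reset("cache-key"); // clear the stored last event id so pumping starts from scratch
}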

Creating your own pump

You can create your own pump by manually calling the pumpPage method:

let cursor: string | undefined = undefined;
do {
  const result = await dataPump.pumpPage({
    dataCoreId: "your data core id",
    aggregator: "your aggregator",
    eventTypes: ["your event type"],
    timeBucket: "your time bucket",
    afterEventId: "your from event id", // optional
    beforeEventId: "your to event id",  // optional
    cursor,                             // undefined on the first iteration
  });

  for (const event of result.events) {
    console.log(event);
  }
  cursor = result.cursor;
} while (cursor);

This will allow you to control the flow of events and prevent the pump from pumping too many events at once.

Development

Install dependencies with yarn:

yarn install

or with npm:

npm install