
dataminer

v0.1.5

Performs common data mining tasks such as parallel download jobs, importing data, downloading files from S3, and creating new jobs on the fly.

Dataminer

Dataminer is a fault-tolerant, distributed worker queue for data mining, built to make creating job workers simple and quick. Common routines for processing queues, streaming data into other queues, distributing queues, and flushing into other data sources are all part of dataminer's core tasks. Dataminer is backed by Redis and MongoDB.

$ npm install dataminer

Features


  • fault-tolerant queues
  • worker monitoring
  • queue events and progress
  • worker-specific logging
  • powered by Redis & MongoDB
  • RESTful JSON API
  • fault-tolerant streams
  • exponential and linear backoff

Creating Workers


To create a standard queue worker that processes items off of a Redis queue, use dataminer.createQueue.

dataminer.createQueue(queueName, n, options):

Creates a new queue worker that processes queueName. n is the number of jobs processed concurrently (n = 2 will process 2 jobs at a time in a single process); n defaults to 1. options are extended properties for the worker, which include:

  • name: A friendly name for a given queue worker (defaults to the queueName provided in the parameters).
  • redis: Redis related options for processing the queue and optionally reporting on the status of the worker.
    • host: defaults to localhost.
    • port: defaults to 6379.
    • options: redis options to the redis client.
    • auth: optional auth parameters to pass to the redis client.
  • report: defaults to true; reports on the status of the worker once every reportIntervalMs milliseconds.
  • reportIntervalMs: defaults to 1000.
  • progress: defaults to false; when true, updates progress with a traceable job id.
For example, a simple download worker that streams each job's URL to disk and reports progress:

var fs = require('fs'),
    dataminer = require('dataminer'),
    request = require('request');

var downloader = dataminer.createQueue('q-urls');
downloader.process(function (job, done) {

    var contentLength = 0;
    var req = request(job.data.url);
    req.pipe(fs.createWriteStream(job.data.path));
    req.on('response', function (response) {
        // the content length lives on the response headers
        contentLength = response.headers['content-length'];
    });
    req.on('data', function (chunk) {
        job.progress(chunk.length, contentLength);
    });
    req.on('error', function (err) {
        done(err);
    });
    req.on('end', function () {
        done();
    });

});
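Passing options explicitly might look like the sketch below. The option names come from the list above; the specific values (the worker name, the report interval) are illustrative, not defaults.

```javascript
// Sketch of an options object for dataminer.createQueue, using the
// documented option names. Values marked "illustrative" are assumptions,
// not library defaults.
var options = {
    name: 'url-downloader',    // friendly worker name (illustrative)
    redis: {
        host: 'localhost',     // default
        port: 6379,            // default
        options: {},           // extra options passed to the redis client
        auth: null             // optional auth parameters for the client
    },
    report: true,              // report worker status periodically (default)
    reportIntervalMs: 5000,    // report every 5 seconds (illustrative)
    progress: false            // per-job progress reporting (default)
};

// A worker processing 2 jobs at a time with these options:
// var worker = dataminer.createQueue('q-urls', 2, options);
```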

TODO:


  • Add streaming-to-queue support à la dataminer.createStream
    • Add Twitter Sample Stream Example (use request.pipe)
  • Add flush routine support to existing queue workers
    • queueWorker.flush(flushIntervalMs, function () { })
  • Add dataminer worker registration
    • dataminer.register(worker, options) (for extended worker support that includes a registered id and will live on in state even after the process dies). This is especially useful for knowing from a dashboard when workers have died.
  • Add logging support (injection for different types)

LICENSE:


(The MIT License)

Copyright (c) 2012 [email protected]

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the 'Software'), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.