cachegator
A split-able streaming aggregation wrapper around a mongoose/MongoDB cursor.
It limits caching and processing to roughly 16MB per chunk (the BSON document limit) by chunking the aggregation with $facet/$literal.
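For background: $facet collects each sub-pipeline's results into a single output document, so every facet is capped by MongoDB's ~16MB BSON document limit, which is what forces the chunking. A rough illustration of that kind of pipeline (this is not cachegator's internal pipeline; the field names are made up):

```js
// Illustration only: NOT cachegator's internal pipeline.
// Each $facet output must fit in one ~16MB BSON document, so large result
// sets have to be split across several aggregations ("chunks").
const pipeline = [
  { $match: { createdAt: { $gte: startDate, $lt: endDate } } },
  {
    $facet: {
      rows: [{ $limit: 10000 }],   // bounded slice of matching documents
      meta: [{ $count: "total" }], // small summary facet
    },
  },
  { $addFields: { chunkLabel: { $literal: "dataset:chunk-1" } } }, // constant tag via $literal
];
```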
usage (TBD)
```js
// export shape assumed here; adjust if the package exports differently
const CacheGator = require("cachegator");

const ctor = new CacheGator({
  useRedis: false, // in-memory caching
  model: MongooseModel,
  debug: true,
});

ctor.setSplitter(mySplitterFunction);

const opt = {
  startDate,
  endDate,
  applicationId: ["1", "3", "4"],
  type: "dataset",
};

ctor.split(opt);
```
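mySplitterFunction above is user-supplied and its exact contract is not documented here; as a sketch, assume it takes the options passed to split() and returns an array of smaller per-chunk option objects, for example one per day of the date range:

```js
// Hypothetical splitter: breaks [startDate, endDate) into day-sized chunks.
// The real contract expected by setSplitter may differ.
function mySplitterFunction(opt) {
  const DAY = 24 * 60 * 60 * 1000;
  const chunks = [];
  for (let t = opt.startDate.getTime(); t < opt.endDate.getTime(); t += DAY) {
    chunks.push({
      ...opt,
      startDate: new Date(t),
      endDate: new Date(Math.min(t + DAY, opt.endDate.getTime())),
    });
  }
  return chunks;
}
```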
Example debug output:

```
...splitter function set.
pre-loading dataset chunk 1/33...
pre-loading dataset chunk 2/33...
pre-loading dataset chunk 3/33...
pre-loading dataset chunk 4/33...
pre-loading dataset chunk 5/33...
pre-loading dataset chunk 6/33...
pre-loading dataset chunk 7/33...
pre-loading dataset chunk 8/33...
pre-loading dataset chunk 9/33...
pre-loading dataset chunk 10/33...
pre-loading dataset chunk 11/33...
pre-loading dataset chunk 12/33...
pre-loading dataset chunk 13/33...
pre-loading dataset chunk 14/33...
pre-loading dataset chunk 15/33...
pre-loading dataset chunk 16/33...
pre-loading dataset chunk 17/33...
pre-loading dataset chunk 18/33...
pre-loading dataset chunk 19/33...
pre-processing dataset chunk 20/33...
pre-processing dataset chunk 21/33...
pre-processing dataset chunk 22/33...
pre-processing dataset chunk 23/33...
pre-processing dataset chunk 24/33...
pre-processing dataset chunk 25/33...
pre-processing dataset chunk 26/33...
pre-processing dataset chunk 27/33...
pre-processing dataset chunk 28/33...
pre-processing dataset chunk 29/33...
pre-processing dataset chunk 30/33...
pre-processing dataset chunk 31/33...
pre-processing dataset chunk 32/33...
pre-processing dataset chunk 33/33...
processing batch 1 :: 10000 records processed...
processing batch 2 :: 20000 records processed...
processing batch 3 :: 30000 records processed...
processing batch 4 :: 40000 records processed...
processing batch 5 :: 50000 records processed...
processing batch 6 :: 60000 records processed...
processing batch 7 :: 70000 records processed...
processing batch 8 :: 80000 records processed...
processing batch 9 :: 90000 records processed...
processing batch 10 :: 100000 records processed...
processing batch 11 :: 110000 records processed...
processing batch 12 :: 120000 records processed...
processing batch 13 :: 130000 records processed...
38 combined entries processed...
```

See the test folder for usage; you can increase sample generation to 10M records and see how it performs.
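To push the sample size up, a throwaway seeding helper along these lines can be used (field names are illustrative and must match the schema used in the test folder):

```js
// Hypothetical seeding helper: bulk-inserts `count` fake records in batches
// so mongoose never holds the full data set in memory at once.
async function seed(model, count = 10_000_000) {
  let batch = [];
  for (let i = 0; i < count; i++) {
    batch.push({
      applicationId: String(1 + (i % 5)),
      type: "dataset",
      createdAt: new Date(Date.now() - Math.floor(Math.random() * 30 * DAY_MS)),
    });
    if (batch.length === 10000) {
      await model.insertMany(batch); // mongoose bulk insert
      batch = [];
    }
  }
  if (batch.length) await model.insertMany(batch);
}

const DAY_MS = 24 * 60 * 60 * 1000; // hoisted constant used above
```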
