

JSON-Stream-Compression

JSON is a great format for exchanging data, but it isn't very efficient in terms of data volume. This module reduces the size of each message by caching object keys and certain string values. It also utilizes CBOR to convert objects into compact byte buffers.

This solution is a good fit for data with a repetitive structure that you can't predict ahead of time. If your data has a fixed structure, a schema-based format such as Protobuf or Avro may be a better choice.
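As a rough illustration of the key caching, the following sketch encodes two messages of the same shape and compares their sizes; the second should come out smaller because its keys were already cached while encoding the first. (This assumes the package is installed and that encode() returns a byte buffer, as the usage examples below suggest; the exact byte counts depend on the data.)

const Encoder = require('json-stream-compression').Encoder
const Decoder = require('json-stream-compression').Decoder

const encoder = new Encoder()
const decoder = new Decoder()

const first = encoder.encode({location: 'kitchen', type: 'temperature', unit: 'C', value: 25.4})
const second = encoder.encode({location: 'office', type: 'temperature', unit: 'C', value: 21.1})

// the second buffer should be smaller: the object keys (and repeated
// string values) were cached while encoding the first message
console.log(first.length, second.length)

// messages must be decoded in the same order they were encoded
console.log(decoder.decode(first))
console.log(decoder.decode(second))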

Installation

npm install json-stream-compression --save

Requirements

  • Node.js 4.x or higher

Usage

Server

const WebSocketServer = require('ws').Server
const Decoder = require('json-stream-compression').Decoder

const wss = new WebSocketServer({port: 8083}, function() {
  console.log('WS server is up and running')
})

wss.on('connection', function connection(ws) {
  // create a new decoder for each connection
  const decoder = new Decoder()
  ws.on('message', function incoming(message) {
    message = decoder.decode(message)
    console.log(message)
  })
})

Client

const WebSocket = require('ws')
const Encoder = require('json-stream-compression').Encoder

const ws = new WebSocket('ws://localhost:8083')

const sampleData = [
  {device: '878f4dfb-538c-4736-a555-25d7414fcb96', location: 'kitchen', type: 'temperature', unit: 'C', value: 25.4},
  {device: 'cc51a2eb-912d-4233-b78f-d447794947a0', location: 'office', type: 'noise', unit: 'dB', value: 92},
  {device: '878f4dfb-538c-4736-a555-25d7414fcb96', location: 'kitchen', type: 'temperature', unit: 'C', value: 25.4},
]

ws.on('open', function open() {
  // create an encoder once the connection is established
  const encoder = new Encoder()
  sampleData.forEach((d) => ws.send(encoder.encode(d)))
})

Note: The encoder and decoder share synced state! This means messages must be decoded in the same order they were encoded, and no messages may be lost. In the example above the transfer happens over WebSockets, which run on top of TCP, and TCP guarantees reliable, in-order delivery. In any case, if the connection breaks, the encoder and decoder should be re-instantiated, as sketched below.
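A minimal sketch of that re-instantiation, assuming a simple client-side reconnect loop (the retry delay and overall structure are illustrative, not part of this module):

const WebSocket = require('ws')
const Encoder = require('json-stream-compression').Encoder

let ws
let encoder

function connect() {
  ws = new WebSocket('ws://localhost:8083')
  ws.on('open', function open() {
    // a fresh encoder per connection keeps its cache in sync with
    // the fresh decoder the server creates for this connection
    encoder = new Encoder()
  })
  ws.on('error', function error() {
    // ignored here; 'close' follows and triggers the reconnect
  })
  ws.on('close', function close() {
    // the cached state is stale after a disconnect, so reconnect
    // and start over with a new encoder
    setTimeout(connect, 1000)
  })
}

connect()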

Limitations

  • No n-to-1 messaging: a decoder instance can only handle messages from a single encoder instance
  • Only works if message delivery is ensured and messages arrive in order
  • Also keep in mind that there are minimum payload sizes (e.g. 46 bytes for TCP), so it may not be beneficial to make small messages smaller
  • If the data doesn't have to be transferred in real time, it can make more sense to batch messages and use something like GZIP for compression (see the sketch below)
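A minimal sketch of that batching alternative, using only Node's built-in zlib module (nothing here is part of json-stream-compression; the add/flush/unpack helpers are hypothetical):

const zlib = require('zlib')

const batch = []

function add(message) {
  batch.push(message)
}

function flush() {
  // gzip the whole batch at once; repeated keys across messages
  // compress well because they share a single compression window
  const payload = zlib.gzipSync(JSON.stringify(batch))
  batch.length = 0
  return payload // send this buffer over the wire
}

// receiving side
function unpack(payload) {
  return JSON.parse(zlib.gunzipSync(payload).toString())
}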