
protocol-buffers

Protocol Buffers for Node.js

npm install protocol-buffers


Usage

Assuming the following test.proto file exists

enum FOO {
  BAR = 1;
}

message Test {
  required float num  = 1;
  required string payload = 2;
}

message AnotherOne {
  repeated FOO list = 1;
}

Use the above proto file to encode/decode messages by doing

var protobuf = require('protocol-buffers')
var fs = require('fs')

// pass a proto file as a buffer/string or pass a parsed protobuf-schema object
var messages = protobuf(fs.readFileSync('test.proto'))

var buf = messages.Test.encode({
  num: 42,
  payload: 'hello world'
})

console.log(buf) // should print a buffer

To decode a message use Test.decode

var obj = messages.Test.decode(buf)
console.log(obj) // should print an object similar to above

Enums are accessed in the same way as messages

var buf = messages.AnotherOne.encode({
  list: [
    messages.FOO.BAR
  ]
})

Nested enums are accessed as properties on the corresponding message

var buf = messages.SomeMessage.encode({
  list: [
    messages.SomeMessage.NESTED_ENUM.VALUE
  ]
})
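For reference, a .proto definition matching the snippet above could look like the following (SomeMessage and NESTED_ENUM are illustrative names and are not part of test.proto):

message SomeMessage {
  enum NESTED_ENUM {
    VALUE = 1;
  }
  repeated NESTED_ENUM list = 1;
}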

See the Google Protocol Buffers docs for more information about the available types etc.

Compile to a file

Since v4 you can compile your schemas to a JavaScript file that you can require from Node. This means you do not have to parse the schemas at runtime, which is useful when using them in the browser or on embedded devices. It also makes the dependency footprint a lot smaller.

# first install the cli tool
npm install -g protocol-buffers

# compile the schema
protocol-buffers test.proto -o messages.js

# then install the runtime dependency in the project
npm install --save protocol-buffers-encodings

That's it! Then in your application you can simply do

var messages = require('./messages')

var buf = messages.Test.encode({
  num: 42
})

The compilation functionality is also available as a JavaScript API for programmatic use:

var protobuf = require('protocol-buffers')
var fs = require('fs')

// protobuf.toJS() takes the same arguments as protobuf()
var js = protobuf.toJS(fs.readFileSync('test.proto'))
fs.writeFileSync('messages.js', js)

Imports

The CLI tool supports protocol buffer imports by default.

Currently all imports are treated as public, and the public/weak keywords are not supported.
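For example, a schema can pull definitions in from another file; the file and message names below are hypothetical:

// main.proto (hypothetical)
import "common.proto";

message Wrapper {
  required Common inner = 1;
}

Compiling main.proto with the CLI as shown above will resolve and include common.proto as well.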

To use imports programmatically you need to pass in a filename and a resolveImport hook:

var protobuf = require('protocol-buffers')
var messages = protobuf(null, {
  filename: 'initial.proto',
  resolveImport (filename) {
    // can return a Buffer, String or Schema
  }
})
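A fuller sketch of the hook, assuming the initial schema and everything it imports can be read from disk under the name the hook receives:

var fs = require('fs')
var protobuf = require('protocol-buffers')

var messages = protobuf(null, {
  filename: 'initial.proto',
  resolveImport (filename) {
    // read the .proto from disk and return its raw contents
    // (a Buffer or string is fine; protocol-buffers parses it)
    return fs.readFileSync(filename)
  }
})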

Performance

This module is fast.

It uses code generation to build encoders and decoders that are as fast as possible for your protobuf schema. You can run the benchmarks yourself with npm run bench.

On my MacBook Air it gives the following results:

Benchmarking JSON (baseline)
  Running object encoding benchmark...
  Encoded 1000000 objects in 2142 ms (466853 enc/s)

  Running object decoding benchmark...
  Decoded 1000000 objects in 970 ms (1030928 dec/s)

  Running object encoding+decoding benchmark...
  Encoded+decoded 1000000 objects in 3131 ms (319387 enc+dec/s)

Benchmarking protocol-buffers
  Running object encoding benchmark...
  Encoded 1000000 objects in 2089 ms (478698 enc/s)

  Running object decoding benchmark...
  Decoded 1000000 objects in 735 ms (1360544 dec/s)

  Running object encoding+decoding benchmark...
  Encoded+decoded 1000000 objects in 2826 ms (353857 enc+dec/s)

Note that JSON parsing/serialization in Node.js is implemented natively and is really fast.

Leveldb encoding compatibility

Compiled protocol buffers messages are valid levelup encodings. This means you can pass them as valueEncoding and keyEncoding.

var level = require('level')
var db = level('db')

// messages is the object returned by protocol-buffers above; Test requires both num and payload
db.put('hello', {num: 42, payload: 'world'}, {valueEncoding: messages.Test}, function (err) {
  db.get('hello', {valueEncoding: messages.Test}, function (err, message) {
    console.log(message)
  })
})
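If every value in the database uses the same message type, the encoding can also be set once when opening the database instead of per call. A minimal sketch, assuming the compiled message works as a database-wide encoding the same way it does per operation:

var level = require('level')

// use the protobuf message as the default value encoding for the whole database
var db = level('db', {valueEncoding: messages.Test})

db.put('hello', {num: 42, payload: 'world'}, function (err) {
  db.get('hello', function (err, message) {
    console.log(message) // decoded Test message
  })
})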

License

MIT