@procwire/codec-protobuf

Protocol Buffers serialization codec for @procwire/transport.

Provides type-safe binary serialization using protobufjs with schema validation, configurable options, and comprehensive error handling.

Features

  • ✅ Type-safe with full TypeScript generics
  • ✅ Schema validation via protobufjs
  • ✅ Configurable Long/enum/bytes conversion
  • ✅ Message verification before encoding
  • ✅ Zero-copy buffer optimization
  • ✅ Helper functions for .proto file loading
  • ✅ Comprehensive error handling

Installation

npm install @procwire/codec-protobuf protobufjs

Note: protobufjs is a peer dependency and must be installed separately.

Quick Start

Basic Usage

import * as protobuf from "protobufjs";
import { ProtobufCodec } from "@procwire/codec-protobuf";

// Define your schema
const root = protobuf.Root.fromJSON({
  nested: {
    User: {
      fields: {
        id: { type: "int32", id: 1 },
        name: { type: "string", id: 2 },
        email: { type: "string", id: 3, rule: "optional" },
      },
    },
  },
});

const UserType = root.lookupType("User");

// Create typed codec
interface User {
  id: number;
  name: string;
  email?: string;
}

const codec = new ProtobufCodec<User>(UserType);

// Serialize
const user = { id: 123, name: "Alice" };
const buffer = codec.serialize(user);

// Deserialize
const decoded = codec.deserialize(buffer);
console.log(decoded); // { id: 123, name: 'Alice' }

Loading from .proto File

import { createCodecFromProto } from "@procwire/codec-protobuf";

interface User {
  id: number;
  name: string;
}

const codec = await createCodecFromProto<User>("./schemas/user.proto", "myapp.User");

const buffer = codec.serialize({ id: 1, name: "Alice" });
const user = codec.deserialize(buffer);

From JSON Schema

import { createCodecFromJSON } from "@procwire/codec-protobuf";

interface User {
  id: number;
  name: string;
}

const codec = createCodecFromJSON<User>(
  {
    nested: {
      User: {
        fields: {
          id: { type: "int32", id: 1 },
          name: { type: "string", id: 2 },
        },
      },
    },
  },
  "User",
);

Configuration Options

ProtobufCodecOptions

| Option              | Type             | Default     | Description                         |
| ------------------- | ---------------- | ----------- | ----------------------------------- |
| longs               | String \| Number | String      | How to convert int64/uint64 values  |
| enums               | String           | undefined   | Convert enum values to their names  |
| bytes               | String \| Array  | undefined   | How to convert bytes fields         |
| defaults            | boolean          | false       | Include default values in output    |
| oneofs              | boolean          | false       | Include virtual oneof field names   |
| verifyOnSerialize   | boolean          | true        | Verify messages before encoding     |
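
For example, several of these options can be combined when constructing a codec. The sketch below uses a hypothetical Event message, defined inline purely for illustration:

import * as protobuf from "protobufjs";
import { ProtobufCodec } from "@procwire/codec-protobuf";

// Hypothetical schema: "kind" is an enum field, "occurredAt" is an int64 field
const root = protobuf.Root.fromJSON({
  nested: {
    Kind: { values: { UNKNOWN: 0, ACTIVE: 1 } },
    Event: {
      fields: {
        kind: { type: "Kind", id: 1 },
        occurredAt: { type: "int64", id: 2 },
      },
    },
  },
});

interface Event {
  kind: string; // enum name, because of enums: String
  occurredAt: string; // int64 as string, because of longs: String
}

const codec = new ProtobufCodec<Event>(root.lookupType("Event"), {
  longs: String,
  enums: String,
  defaults: true, // decoded objects include fields left at their defaults
});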

Handling Large Integers (int64)

Protocol Buffers int64/uint64 values can exceed JavaScript's Number.MAX_SAFE_INTEGER. By default, they are converted to strings to preserve precision:

const codec = new ProtobufCodec<{ timestamp: string }>(TimestampType, {
  longs: String, // Default - safe for large values
});

const output = codec.deserialize(buffer);
console.log(typeof output.timestamp); // 'string'
console.log(output.timestamp); // '9007199254740993'

For small values where precision isn't a concern:

const codec = new ProtobufCodec<{ timestamp: number }>(TimestampType, {
  longs: Number, // May lose precision for large values
});

Enum Conversion

const codec = new ProtobufCodec<{ status: string }>(MessageType, {
  enums: String, // Convert enum values to their names
});

const output = codec.deserialize(buffer);

// With enums: String
console.log(output.status); // 'ACTIVE'

// Without enums: String (default)
console.log(output.status); // 1

Bytes Field Handling

// Default: Uint8Array
const codec1 = new ProtobufCodec<{ data: Uint8Array }>(MessageType);

// Base64 string
const codec2 = new ProtobufCodec<{ data: string }>(MessageType, {
  bytes: String,
});

// Number array
const codec3 = new ProtobufCodec<{ data: number[] }>(MessageType, {
  bytes: Array,
});

API Reference

ProtobufCodec

Main codec class implementing SerializationCodec<T>.

class ProtobufCodec<T> implements SerializationCodec<T> {
  readonly name: "protobuf";
  readonly contentType: "application/x-protobuf";

  constructor(messageType: Type, options?: ProtobufCodecOptions);

  get type(): Type;

  serialize(value: T): Buffer;
  deserialize(buffer: Buffer): T;
}

Properties

  • name - Always "protobuf"
  • contentType - Always "application/x-protobuf"
  • type - The protobufjs Type instance (getter)
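
For instance, these can be read straight off an instance (a sketch reusing User and UserType from the Quick Start):

const codec = new ProtobufCodec<User>(UserType);

console.log(codec.name); // 'protobuf'
console.log(codec.contentType); // 'application/x-protobuf'
console.log(codec.type.fullName); // protobufjs reflection info, e.g. '.User'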

Methods

serialize(value: T): Buffer

Serializes a value to Protocol Buffers binary format.

  • value - Value to serialize (must match message schema)
  • Returns - Buffer containing protobuf-encoded data
  • Throws - SerializationError if verification fails or encoding errors occur

deserialize(buffer: Buffer): T

Deserializes Protocol Buffers binary data.

  • buffer - Buffer or Uint8Array containing protobuf data
  • Returns - Deserialized plain JavaScript object
  • Throws - SerializationError if input is invalid or decoding fails
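
Taken together, a minimal round trip looks like this (a sketch reusing User and UserType from the Quick Start):

const codec = new ProtobufCodec<User>(UserType);

const buffer = codec.serialize({ id: 1, name: "Alice" });
console.log(Buffer.isBuffer(buffer)); // true

// Per the notes above, a Uint8Array is also accepted at runtime
const decoded = codec.deserialize(new Uint8Array(buffer) as Buffer);
console.log(decoded); // { id: 1, name: 'Alice' }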

createCodecFromProto()

Creates a codec by loading a .proto file.

async function createCodecFromProto<T>(
  protoPath: string,
  messageName: string,
  options?: ProtobufCodecOptions,
): Promise<ProtobufCodec<T>>;

createCodecFromJSON()

Creates a codec from an inline JSON schema.

function createCodecFromJSON<T>(
  schema: INamespace,
  messageName: string,
  options?: ProtobufCodecOptions,
): ProtobufCodec<T>;

Advanced Usage

Nested Messages

const root = protobuf.Root.fromJSON({
  nested: {
    Address: {
      fields: {
        street: { type: "string", id: 1 },
        city: { type: "string", id: 2 },
      },
    },
    Person: {
      fields: {
        name: { type: "string", id: 1 },
        address: { type: "Address", id: 2 },
      },
    },
  },
});

interface Address {
  street: string;
  city: string;
}

interface Person {
  name: string;
  address: Address;
}

const codec = new ProtobufCodec<Person>(root.lookupType("Person"));

Repeated Fields

const root = protobuf.Root.fromJSON({
  nested: {
    Message: {
      fields: {
        id: { type: "int32", id: 1 },
        tags: { type: "string", id: 2, rule: "repeated" },
      },
    },
  },
});

interface Message {
  id: number;
  tags: string[];
}

const codec = new ProtobufCodec<Message>(root.lookupType("Message"));

Oneof Fields

const root = protobuf.Root.fromJSON({
  nested: {
    Message: {
      oneofs: {
        value: { oneof: ["stringValue", "intValue"] },
      },
      fields: {
        stringValue: { type: "string", id: 1 },
        intValue: { type: "int32", id: 2 },
      },
    },
  },
});

interface Message {
  stringValue?: string;
  intValue?: number;
  value?: "stringValue" | "intValue"; // When oneofs: true
}

const codec = new ProtobufCodec<Message>(root.lookupType("Message"), {
  oneofs: true, // Include virtual oneof field
});

Maps

const root = protobuf.Root.fromJSON({
  nested: {
    Message: {
      fields: {
        metadata: { keyType: "string", type: "string", id: 1 },
      },
    },
  },
});

interface Message {
  metadata: Record<string, string>;
}

const codec = new ProtobufCodec<Message>(root.lookupType("Message"));

Schema Evolution

Protocol Buffers supports backward-compatible schema changes:

// Version 1
message User {
  int32 id = 1;
  string name = 2;
}

// Version 2 (backward compatible)
message User {
  int32 id = 1;
  string name = 2;
  string email = 3;  // New optional field
}

Old clients can read messages from new servers and vice versa.
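
As a sketch of what this means in practice, the snippet below encodes a message with the v2 schema and decodes it with a codec built from the v1 schema; the unknown email field (tag 3) is simply skipped by the older decoder. The schemas here are illustrative:

import * as protobuf from "protobufjs";
import { ProtobufCodec } from "@procwire/codec-protobuf";

// v1 schema: only id and name
const v1 = protobuf.Root.fromJSON({
  nested: {
    User: {
      fields: {
        id: { type: "int32", id: 1 },
        name: { type: "string", id: 2 },
      },
    },
  },
});

// v2 schema: adds an email field under a new tag number
const v2 = protobuf.Root.fromJSON({
  nested: {
    User: {
      fields: {
        id: { type: "int32", id: 1 },
        name: { type: "string", id: 2 },
        email: { type: "string", id: 3 },
      },
    },
  },
});

const v1Codec = new ProtobufCodec<{ id: number; name: string }>(v1.lookupType("User"));
const v2Codec = new ProtobufCodec<{ id: number; name: string; email?: string }>(v2.lookupType("User"));

// A v2 producer writes a message that a v1 consumer can still read
const wire = v2Codec.serialize({ id: 1, name: "Alice", email: "alice@example.com" });
const asSeenByV1 = v1Codec.deserialize(wire);
console.log(asSeenByV1); // { id: 1, name: 'Alice' }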

Performance Tuning

For maximum performance in trusted environments:

const fastCodec = new ProtobufCodec<TrustedData>(MessageType, {
  verifyOnSerialize: false, // Skip verification
  defaults: false, // Don't include defaults
  oneofs: false, // Don't include oneof names
});

For maximum compatibility:

const safeCodec = new ProtobufCodec<AnyData>(MessageType, {
  verifyOnSerialize: true, // Verify before encoding
  defaults: true, // Include all fields
  longs: String, // Safe large integer handling
  enums: String, // Human-readable enums
});

Error Handling

The codec throws SerializationError from @procwire/transport for all serialization failures:

import { SerializationError } from "@procwire/transport";

try {
  const decoded = codec.deserialize(invalidBuffer);
} catch (error) {
  if (error instanceof SerializationError) {
    console.error("Serialization failed:", error.message);
    console.error("Original error:", error.cause);
  }
}

Common error scenarios:

  • Invalid input type (not Buffer/Uint8Array)
  • Truncated or corrupted buffer
  • Schema verification failure (when verifyOnSerialize: true)
  • Field type mismatches
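
For example, a value whose runtime type doesn't match the schema should be rejected on the serialize side (a sketch reusing the User codec from the Quick Start; verifyOnSerialize defaults to true):

import { SerializationError } from "@procwire/transport";

try {
  // name has the wrong runtime type, so verification fails before encoding
  codec.serialize({ id: 1, name: 42 as unknown as string });
} catch (error) {
  if (error instanceof SerializationError) {
    console.error("Rejected before encoding:", error.message);
  }
}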

Type Safety Tips

  1. Define interfaces matching your schema:
interface User {
  id: number;
  name: string;
  timestamp: string; // Use string for int64 with longs: String
}

const codec = new ProtobufCodec<User>(UserType, { longs: String });

  2. Match TypeScript types to options:
// With bytes: String
interface Message {
  data: string; // base64-encoded string
}

// With bytes: Array
interface Message {
  data: number[];
}

// Default
interface Message {
  data: Uint8Array;
}

  3. Use the type getter for reflection:
const codec = new ProtobufCodec<User>(UserType);
const fields = codec.type.fields; // Access schema info

Performance

Protocol Buffers provides excellent performance characteristics:

  • Compact Size: Typically 3-10x smaller than JSON
  • Fast Encoding/Decoding: More efficient than JSON parsing
  • Zero-Copy Optimization: Serialize without extra buffer copies
  • Schema Evolution: Forward and backward compatible
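
A quick way to check the size claim for your own schema is to compare encodings of the same value (a sketch reusing the User codec from the Quick Start; actual ratios depend on field types and data):

const user = { id: 123, name: "Alice" };

const protoBytes = codec.serialize(user).byteLength;
const jsonBytes = Buffer.byteLength(JSON.stringify(user), "utf8");

console.log({ protoBytes, jsonBytes }); // the protobuf encoding is typically the smaller of the two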

Ideal for:

  • High-performance microservices communication
  • Large data transfers
  • Long-term data storage
  • Cross-platform/cross-language IPC
  • APIs with versioned schemas

License

MIT