
@meeshkanml/http-types-kafka

v0.0.2

Kafka producer for recording HTTP traffic

Downloads: 6

http-types-kafka


Tools for writing ts-http-types objects to Kafka in Node.js, powered by kafkajs.

This library is a wrapper around kafkajs, which must be installed as a peer dependency.

Installation

kafkajs must be installed as a peer dependency:

$ yarn add kafkajs http-types-kafka
# or
$ npm i kafkajs http-types-kafka

Quick start

First create the topic you're writing to:

$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --create --partitions 3 --replication-factor 1

Note that you may need to change the script name (e.g., kafka-topics instead of kafka-topics.sh) depending on how you installed Kafka.

Create an HttpTypesKafkaProducer and connect to Kafka:

import { CompressionTypes, KafkaConfig, ProducerConfig } from "kafkajs";
import { HttpTypesKafkaProducer } from "http-types-kafka";

// Create a `KafkaConfig` instance (from kafkajs)
const brokers = ["localhost:9092"];
const kafkaConfig: KafkaConfig = {
  clientId: "client-id",
  brokers,
};

const producerConfig: ProducerConfig = { idempotent: false };

// Specify the topic
const kafkaTopic = "express_recordings";

// Create the producer
const producer = HttpTypesKafkaProducer.create({
  compressionType: CompressionTypes.GZIP,
  kafkaConfig,
  producerConfig,
  topic: kafkaTopic,
});

// Connect to Kafka
await producer.connect();

Send a single HttpExchange to Kafka:

const exchange: HttpExchange = ...;
await producer.send(exchange);
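For illustration, an exchange object might look like the sketch below. The exact field names are defined by the http-types spec and the `http-types` typings, so the interface here is an assumption for illustration only; check the real typings before relying on it.

```typescript
// Sketch of an HttpExchange-like object. Field names are assumed from the
// http-types spec, NOT taken from the real `http-types` typings.
interface SketchExchange {
  request: {
    method: string;
    host: string;
    path: string;
    headers: Record<string, string>;
    body?: string;
  };
  response: {
    statusCode: number;
    headers: Record<string, string>;
    body?: string;
  };
}

const exchange: SketchExchange = {
  request: {
    method: "get",
    host: "localhost",
    path: "/api/users",
    headers: { accept: "application/json" },
  },
  response: {
    statusCode: 200,
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ users: [] }),
  },
};

// The producer ultimately serializes exchanges to JSON before writing to Kafka.
console.log(JSON.stringify(exchange));
```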

Send multiple HttpExchanges:

const exchanges: HttpExchange[] = ...;
await producer.sendMany(exchanges);

Send recordings from a JSON lines file, where every line is a JSON-encoded HttpExchange:

await producer.sendFromFile("recordings.jsonl");
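A minimal sketch of producing such a file, assuming one JSON-encoded exchange per line (JSON Lines); the object shape is illustrative, not the exact http-types typing:

```typescript
import { writeFileSync, readFileSync } from "fs";
import { tmpdir } from "os";
import { join } from "path";

// Illustrative exchange objects; real ones follow the http-types shape.
const exchanges = [
  { request: { method: "get", path: "/a" }, response: { statusCode: 200 } },
  { request: { method: "post", path: "/b" }, response: { statusCode: 201 } },
];

// JSON Lines: one JSON-encoded exchange per line.
const file = join(tmpdir(), "recordings.jsonl");
writeFileSync(file, exchanges.map((e) => JSON.stringify(e)).join("\n") + "\n");

// Reading it back: one exchange per non-empty line.
const lines = readFileSync(file, "utf8").split("\n").filter(Boolean);
console.log(lines.length); // → 2
```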

Finally, disconnect:

await producer.disconnect();

Delete the topic if you're done:

$ kafka-topics.sh --bootstrap-server localhost:9092 --topic express_recordings --delete

Command-line interface

See available commands:

$ http-types-kafka

Producer

First create the destination topic in Kafka.

To send recordings from recordings.jsonl to Kafka, run:

$ http-types-kafka producer --file=recordings.jsonl --topic=my_recordings

Development

Install dependencies:

$ yarn

Build the package into lib/:

$ yarn compile

Run tests:

$ ./docker-start.sh  # Start Kafka and ZooKeeper
$ yarn test
$ ./docker-stop.sh  # Once you're done

Package for npm:

$ npm pack

Publish to npm:

$ yarn publish --access public

Push git tags:

$ TAG=v`cat package.json | grep version | awk 'BEGIN { FS = "\"" } { print $4 }'`
# Tagging done by `yarn publish`
# git tag -a $TAG -m $TAG
$ git push origin $TAG
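An alternative sketch for the version-extraction step: let Node's JSON parser read package.json instead of grep/awk, which avoids matching unrelated lines containing "version". This assumes node is on PATH; the throwaway package.json below is only for self-contained illustration.

```shell
# Work in a scratch directory with a sample package.json (illustration only;
# in the real repo you would run this at the project root).
dir=$(mktemp -d)
cd "$dir"
printf '{"name":"demo","version":"1.2.3"}\n' > package.json

# Extract the version with Node's JSON parser rather than grep/awk.
TAG="v$(node -p "require('./package.json').version")"
echo "$TAG"  # → v1.2.3
```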

Working with local Kafka

First start Kafka and ZooKeeper:

# See `docker-compose.yml`
docker-compose up

Create a topic called http_types_kafka_test:

docker exec kafka1 kafka-topics --bootstrap-server kafka1:9092 --topic http_types_kafka_test --create --partitions 3 --replication-factor 1

Check the topic exists:

docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --list

Describe the topic:

docker exec kafka1 kafka-topics --bootstrap-server localhost:9092 --describe --topic http_types_kafka_test

Using kafkacat

List topics:

kafkacat -b localhost:9092 -L

Push data to the topic from a file with snappy compression:

tail -f tests/resources/recordings.jsonl | kafkacat -b localhost:9092 -t http_types_kafka_test -z snappy

Consume messages from the topic to the console:

kafkacat -b localhost:9092 -t http_types_kafka_test -C