@titanpl/kafka (v0.0.1)
Kafka for Titan Planet Framework, backed by Rust.
Install it, import it in your action file, and use the helpers directly.
Install
```sh
npm i @titanpl/kafka
```

Because @titanpl/kafka is a native TitanPL extension, you must also allow it in your tanfig.json.
This is necessary so your TitanPL server can load and use the native extension, just like @titanpl/core.
```json
{
  "name": "test",
  "description": "A powerful TitanPL project",
  "version": "1.0.0",
  "extensions": {
    "allowWasm": true,
    "allowNative": [
      "@titanpl/core",
      "@titanpl/kafka"
    ]
  },
  "build": {
    "purpose": "test",
    "files": [
      "public",
      "static",
      "db",
      "tanfig.json"
    ]
  }
}
```

Basic Use
Import the functions inside your Titan action file:
```js
import { getMetadata, produce, consumeOnce } from "@titanpl/kafka";

export function getuser(req) {
  const brokers = ["127.0.0.1:9092"];

  const metadata = getMetadata({ brokers });

  const written = produce({
    brokers,
    topic: "titan.kafka.test",
    key: "user-1",
    value: JSON.stringify({ hello: "kafka" }),
  });

  const messages = consumeOnce({
    brokers,
    topics: ["titan.kafka.test"],
    groupId: "titan-kafka-group",
    fallbackOffset: "earliest",
  });

  return {
    metadata,
    written,
    messages,
  };
}
```

That is the normal usage:
- npm i @titanpl/kafka
- import in your action file
- call the functions
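Broker addresses throughout this README are plain "host:port" strings. A small guard before calling the helpers can catch typos early; this validator is illustrative and not part of @titanpl/kafka:

```javascript
// Illustrative helper (NOT part of @titanpl/kafka): checks that each
// broker entry looks like "host:port" before it reaches the native layer.
function assertBrokers(brokers) {
  if (!Array.isArray(brokers) || brokers.length === 0) {
    throw new Error("brokers must be a non-empty array");
  }
  for (const b of brokers) {
    // Require "host:port" with a port in the valid TCP range.
    const m = /^[^:]+:(\d{1,5})$/.exec(b);
    if (!m || Number(m[1]) > 65535) {
      throw new Error(`invalid broker address: ${b}`);
    }
  }
  return brokers;
}

// Passes for the local broker used throughout this README.
assertBrokers(["127.0.0.1:9092"]);
```

The returned array can then be passed straight into produce(), consumeOnce(), or getMetadata().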
TitanPL Example
app/actions/getuser.js
```js
import { produce } from "@titanpl/kafka";

export function getuser(req) {
  const result = produce({
    brokers: ["127.0.0.1:9092"],
    topic: "users.created",
    key: "user-42",
    value: JSON.stringify({
      id: 42,
      name: "Titan",
    }),
  });

  return {
    ok: true,
    result,
  };
}
```

app/actions/listusers.js
```js
import { consumeOnce } from "@titanpl/kafka";

export function listusers(req) {
  const result = consumeOnce({
    brokers: ["127.0.0.1:9092"],
    topics: ["users.created"],
    groupId: "users-reader",
    fallbackOffset: "earliest",
  });

  return {
    ok: true,
    result,
  };
}
```

API
validateConnection(options)
Checks broker config and returns broker probe results.
```js
import { validateConnection } from "@titanpl/kafka";

const result = validateConnection({
  brokers: ["127.0.0.1:9092"],
  clientId: "orders-service",
});
```

getMetadata(options)
Loads topic metadata from Kafka.
```js
import { getMetadata } from "@titanpl/kafka";

const metadata = getMetadata({
  brokers: ["127.0.0.1:9092"],
});
```

produce(options)
Writes one UTF-8 record to Kafka.
```js
import { produce } from "@titanpl/kafka";

const result = produce({
  brokers: ["127.0.0.1:9092"],
  topic: "orders.created",
  key: "order-1001",
  value: JSON.stringify({
    id: 1001,
    status: "created",
  }),
  requiredAcks: "all",
});
```

consumeOnce(options)
Reads records with one poll cycle.
```js
import { consumeOnce } from "@titanpl/kafka";

const result = consumeOnce({
  brokers: ["127.0.0.1:9092"],
  topics: ["orders.created"],
  groupId: "orders-reader",
  fallbackOffset: "earliest",
});
```

version()
Returns the native crate version.
```js
import { version } from "@titanpl/kafka";

const v = version();
```

Local Kafka
TitanPL does not start Kafka for you; Kafka must already be running.
This repo includes a Docker setup for local testing on the same broker address used in the examples:
127.0.0.1:9092

Start Kafka:
```sh
docker compose -f docker-compose.kafka.yml down -v --remove-orphans
docker compose -f docker-compose.kafka.yml up -d
```

Check topics:
```sh
docker exec --workdir /opt/kafka/bin -it titan-kafka-local sh -lc "./kafka-topics.sh --bootstrap-server localhost:9092 --list"
```

The bundled local setup is meant to create:
titan.kafka.test

Notes
- This package is for TitanPL and uses a Rust native backend.
- produce() and consumeOnce() use UTF-8 string payloads.
- produce() tries to create the topic automatically in the simple local flow.
- consumeOnce() is a simple one-poll read helper.
- Kafka must be running before getMetadata(), produce(), or consumeOnce() can work.
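Since every helper fails when Kafka is unreachable, a small retry wrapper around a call can absorb transient startup races (for example, right after `docker compose up`). This wrapper is an illustrative sketch, not part of the package:

```javascript
// Illustrative retry helper (NOT part of @titanpl/kafka): calls fn()
// up to `attempts` times and rethrows the last error if all attempts fail.
function withRetry(fn, attempts = 3) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return fn();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// Usage sketch with the package's produce() helper:
// const written = withRetry(() =>
//   produce({ brokers, topic: "titan.kafka.test", key: "k", value: "v" })
// );
```

The same pattern works for getMetadata() and consumeOnce(), since all three throw when no broker answers.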
