# @logdbhq/node

v0.1.1-alpha.0
LogDB SDK for Node.js — native gRPC client for shipping logs, beats, and cache entries to LogDB.
LogDB SDK for Node, Deno, Bun, Cloudflare Workers, and Supabase Edge Functions. gRPC-Web over fetch — runs in any modern JavaScript runtime.
**Status:** v0.1.0-alpha — writer-only. Reader API, OpenTelemetry exporter, encryption, and platform-specific builders ship in later versions.
## Install

```shell
npm install @logdbhq/node
```

Requires Node.js 20.10+, Deno, Bun, or any runtime with native `fetch` and `CompressionStream` (all modern ones).

On Supabase Edge Functions / Deno Deploy, import via the `npm:` specifier:

```typescript
import { LogDBClient, LogLevel } from "npm:@logdbhq/node@^0.1.0-alpha";
```

## Quick start
```typescript
import { LogDBClient, LogLevel } from "@logdbhq/node";

const client = new LogDBClient({
  apiKey: process.env.LOGDB_API_KEY!,
  defaultApplication: "my-service",
  defaultEnvironment: "production",
});

await client.log({
  message: "user logged in",
  level: LogLevel.Info,
  userEmail: "[email protected]",
});

await client.flush();
await client.dispose();
```

Or with `await using` (Node 20.10+):

```typescript
{
  await using client = new LogDBClient({ apiKey: process.env.LOGDB_API_KEY! });
  await client.log({ message: "hello", level: LogLevel.Info });
}
```

## Fluent builder
```typescript
import { LogEventBuilder, LogLevel } from "@logdbhq/node";

await LogEventBuilder.create(client)
  .setMessage("payment processed")
  .setLogLevel(LogLevel.Info)
  .setUserEmail("[email protected]")
  .setCorrelationId(traceId)
  .addAttribute("amount_eur", 199.99)
  .addAttribute("currency", "EUR")
  .addLabel("payment")
  .log();
```

## Heartbeats
```typescript
import { LogBeatBuilder } from "@logdbhq/node";

await LogBeatBuilder.create(client)
  .setMeasurement("cpu")
  .addTag("host", "web-01")
  .addField("usage_percent", 42.7)
  .log();
```

## Cache
```typescript
import { LogCacheBuilder } from "@logdbhq/node";

await LogCacheBuilder.create(client)
  .setKey("user:42:profile")
  .setValue({ name: "Alice", role: "admin" })
  .log();
```

## Configuration

| Option | Default | Description |
|--------|---------|-------------|
| `apiKey` | — (required) | Account-scoped API key |
| `serviceUrl` | auto-discover | Override the gRPC endpoint (e.g. `grpc-logger.logdb.com:443`) |
| `defaultCollection` | `"logs"` | Default `collection` field stamped on logs |
| `defaultApplication` | `undefined` | Default `application` field |
| `defaultEnvironment` | `"production"` | Default `environment` field |
| `enableBatching` | `true` | Buffer entries and flush in batches |
| `batchSize` | `100` | Max entries per batch |
| `flushInterval` | `5000` | Max time (ms) an entry waits before flush |
| `maxRetries` | `3` | Retry attempts on transient failures |
| `retryDelay` | `1000` | Initial retry delay (ms) |
| `retryBackoffMultiplier` | `2.0` | Exponential backoff multiplier |
| `enableCircuitBreaker` | `true` | Trip the circuit on repeated failures |
| `circuitBreakerFailureThreshold` | `0.5` | Failure rate (0..1) that trips the circuit |
| `circuitBreakerSamplingDuration` | `10000` | Sliding window (ms) |
| `circuitBreakerDurationOfBreak` | `30000` | Open-state duration (ms) |
| `enableCompression` | `true` | Gzip-compress outbound payloads |
| `requestTimeout` | `30000` | Per-RPC deadline (ms) |
| `headers` | `{}` | Extra gRPC metadata |
| `onError` | `undefined` | `(err, batch?) => void` callback for failed sends |
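The retry options compose multiplicatively. As a quick illustration (this is not the SDK's internal code, just the arithmetic the documented defaults imply), the delay before each retry grows by `retryBackoffMultiplier`:

```typescript
// Illustrative only. Computes the delay before retry attempt n (0-based),
// using the documented defaults: retryDelay = 1000 ms, retryBackoffMultiplier = 2.0.
function retryDelayMs(
  attempt: number,
  baseDelayMs = 1000,
  multiplier = 2.0,
): number {
  return baseDelayMs * Math.pow(multiplier, attempt);
}

// With maxRetries = 3, the three retries wait 1000, 2000, and 4000 ms.
console.log([0, 1, 2].map((n) => retryDelayMs(n))); // [ 1000, 2000, 4000 ]
```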
## Service discovery

If you don't pass `serviceUrl`, the SDK resolves the endpoint at first use via:

- `https://discovery.logdb.site/resolve/grpc-logger` (with the `X-API-Key` header)
- Fallback: the `LOGDB_GRPC_LOGGER_URL` environment variable

Discovery returns a gRPC-Web URL like `https://<tenant>.logdb.site/grpc-logger`. Results are cached for 5 minutes per process.
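That resolution order can be sketched as a small helper. This is illustrative only: `resolveLoggerEndpoint` and the injected `discover` function are not part of the SDK's public API, they just mirror the precedence described above:

```typescript
// Illustrative sketch of the documented resolution order:
// 1. an explicit serviceUrl wins,
// 2. otherwise ask the discovery service,
// 3. otherwise fall back to the LOGDB_GRPC_LOGGER_URL env var.
async function resolveLoggerEndpoint(
  serviceUrl: string | undefined,
  discover: () => Promise<string | undefined>, // e.g. a fetch to discovery.logdb.site
  env: Record<string, string | undefined> = process.env,
): Promise<string> {
  if (serviceUrl) return serviceUrl;
  const discovered = await discover().catch(() => undefined);
  if (discovered) return discovered;
  const fallback = env["LOGDB_GRPC_LOGGER_URL"];
  if (fallback) return fallback;
  throw new Error("LogDB endpoint could not be resolved");
}
```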
## Transport

- gRPC-Web over HTTP/1.1 or HTTP/2 (whatever your `fetch` supports).
- No dependency on `@grpc/grpc-js` — runs without Node HTTP/2 internals, so it works in Deno / Cloudflare Workers / Supabase Edge.
- Gzip compression via `CompressionStream` for `SendCompressed*` RPCs. Enabled by default; toggle with `enableCompression`.
- The SDK hits `<serviceUrl>/LogGrpcService/<Method>` with `content-type: application/grpc-web+proto`. Status is read from the gRPC trailer frame (or response headers).
## Error handling

The client surfaces errors in three ways:

```typescript
// 1. Returned status (or promise rejection) from the call, when batching is disabled
const status = await client.log({ message: "x" });
// → status === LogResponseStatus.NotAuthorized | Failed | CircuitOpen | Timeout

// 2. EventEmitter
client.on("error", (err) => {
  console.error("LogDB error:", err);
});

// 3. options.onError callback
new LogDBClient({
  apiKey: "...",
  onError: (err, batch) => { /* ... */ },
});
```

Typed error classes are exported for `instanceof` checks:

```typescript
import { LogDBAuthError, LogDBNetworkError, LogDBCircuitOpenError } from "@logdbhq/node";

client.on("error", (err) => {
  if (err instanceof LogDBAuthError) {
    // bad API key
  } else if (err instanceof LogDBNetworkError) {
    // transient transport failure (already retried)
  }
});
```

## Reading / querying
```typescript
import { LogDBReader } from "@logdbhq/node";

const reader = new LogDBReader({ apiKey: process.env.LOGDB_API_KEY! });

const { items, totalCount } = await reader.getLogs({
  application: "my-service",
  level: "Error",
  fromDate: new Date(Date.now() - 24 * 60 * 60 * 1000),
  take: 50,
  sort: { field: "timestamp", ascending: false },
});

const count = await reader.getLogsCount({ application: "my-service" });
const collections = await reader.getCollections();
const status = await reader.getEventLogStatus();

await reader.dispose();
```

Methods: `getLogs`, `getLogCaches`, `getLogBeats`, `getLogsCount`, `getCollections`, `getEventLogStatus`. All return typed objects (JS `Date`, plain `Record` maps — no protobuf types leak). The reader uses a separate discovery service id (`grpc-server`) and opens its own channel lazily on first call.
## Using this in a browser relay

This package is the engine behind `@logdbhq/web`'s relay pattern. When a browser can't (or shouldn't) hold a LogDB API key, it POSTs JSON batches to a relay you deploy:

```typescript
// supabase/functions/logdb-relay/index.ts (Deno)
import { LogDBClient } from "npm:@logdbhq/node@^0.1.0-alpha";

const client = new LogDBClient({
  apiKey: Deno.env.get("LOGDB_API_KEY")!,
  enableBatching: false, // the browser already batches
});

Deno.serve(async (req) => {
  const { type, items } = await req.json();
  if (type === "log") await client.sendLogBatch(items);
  return new Response(null, { status: 204 });
});
```

Full template at `@logdbhq/web/templates/supabase-edge-function/`.
## Documentation

- Examples: `./examples`
- LogDB documentation: https://docs.logdb.dev

## License

MIT
