grafana-logs-to-csv
v1.0.0
grafana-logs-to-csv
Convert Grafana-style log text files (tab-separated: timestamp, ISO date, JSON object per line) to CSV. CSV headers are the JSON keys; each row is one log line’s JSON values.
Install
npm install grafana-logs-to-csv
# or
pnpm add grafana-logs-to-csv
# or
yarn add grafana-logs-to-csv
Usage
Single file
import { convertFile } from "grafana-logs-to-csv";
// Convert sample.txt → sample.csv in the same directory
const csvPath = await convertFile("./sample.txt");
console.log("Written:", csvPath);
// Write to a specific directory
await convertFile("./sample.txt", { outputDir: "./output" });
// Custom output file name (without extension)
await convertFile("./sample.txt", {
outputDir: "./output",
outputFileName: "my-logs",
});
// → output/my-logs.csv
Directory
import { convertDirectory } from "grafana-logs-to-csv";
// Convert all .txt files in a directory; CSVs next to each file
const paths = await convertDirectory("./logs");
console.log("Written:", paths);
// Write all CSVs into one folder
await convertDirectory("./logs", { outputDir: "./csv-output" });
Options
| Option | Description |
| --------------- | ---------------------------------------------------------- |
| outputDir | Directory for CSV output (default: same as input). |
| outputFileName | Override output file name without extension (single file). |
| encoding | File encoding (default: "utf-8"). |
Input format
The text file should have data lines with three tab-separated fields:
- Unix timestamp (ms)
- ISO8601 date
- A JSON object (string keys and values)
Header/metadata lines (e.g. “1000 lines shown…”, “Common labels: …”) are skipped. Only the third field is used for the CSV; headers and columns are taken from the JSON keys.
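The conversion rule described above can be sketched as follows. This is an illustration only, assuming a hypothetical helper named logTextToCsv; it is not the package's internal implementation:

```typescript
// Sketch of the described parsing rule: split each line on tabs, skip
// lines without three fields, parse the third field as JSON, and build
// CSV headers from the union of JSON keys. Hypothetical helper, not
// part of this package's API.
function logTextToCsv(text: string): string {
  const rows: Record<string, unknown>[] = [];
  const headers = new Set<string>();

  for (const line of text.split("\n")) {
    const fields = line.split("\t");
    if (fields.length < 3) continue; // header/metadata lines are skipped
    try {
      // Only the third field (the JSON object) contributes to the CSV
      const obj = JSON.parse(fields.slice(2).join("\t")) as Record<string, unknown>;
      Object.keys(obj).forEach((k) => headers.add(k));
      rows.push(obj);
    } catch {
      continue; // third field was not valid JSON; treat as metadata
    }
  }

  const cols = [...headers];
  const esc = (v: unknown) => `"${String(v ?? "").replace(/"/g, '""')}"`;
  return [
    cols.map(esc).join(","),
    ...rows.map((r) => cols.map((c) => esc(r[c])).join(",")),
  ].join("\n");
}
```

Values are quoted and inner quotes doubled so that commas or quotes inside log messages do not break the CSV.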
Development
- pnpm test – run tests (Vitest)
- pnpm run test:watch – run tests in watch mode
- pnpm lint – run ESLint
- pnpm run lint:fix – fix lint issues
- pnpm format – format with Prettier
- pnpm run format:check – check formatting
Suggested improvements
- CLI: Add a small CLI (e.g. with commander or yargs) so users can run npx grafana-logs-to-csv ./logs.txt --out-dir ./csv without writing a script.
- Streaming: For very large files, read and parse line-by-line with a stream and write CSV rows incrementally to avoid loading the whole file into memory.
- Configurable extension: Allow convertDirectory to accept a custom file extension or glob (e.g. .log) instead of only .txt.
- Validation: Optionally validate that the third column is valid JSON and report line numbers for parse errors instead of silently skipping.
- CI: Add a GitHub Actions workflow to run lint, format:check, and test on push/PR.
- Changelog: Keep a CHANGELOG.md and follow semantic versioning when publishing.
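The streaming idea above could be sketched with Node's readline. This is an illustration, not part of the package; streamToCsv and its cols parameter are assumptions, and the columns must be known up front because a single streaming pass cannot derive headers from lines it has not read yet:

```typescript
import { createReadStream, createWriteStream } from "node:fs";
import { createInterface } from "node:readline";

// Sketch of the streaming suggestion: parse each line as it arrives and
// append CSV rows, without buffering the whole file in memory.
async function streamToCsv(inPath: string, outPath: string, cols: string[]): Promise<void> {
  const out = createWriteStream(outPath);
  const esc = (v: unknown) => `"${String(v ?? "").replace(/"/g, '""')}"`;
  out.write(cols.map(esc).join(",") + "\n");

  const rl = createInterface({ input: createReadStream(inPath) });
  for await (const line of rl) {
    const fields = line.split("\t");
    if (fields.length < 3) continue; // skip header/metadata lines
    try {
      const obj = JSON.parse(fields.slice(2).join("\t")) as Record<string, unknown>;
      out.write(cols.map((c) => esc(obj[c])).join(",") + "\n");
    } catch {
      // not a data line; a real implementation might report the line number
    }
  }
  await new Promise<void>((resolve) => out.end(resolve));
}
```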
Publish to npm
- Set name, author, and repository in package.json.
- Log in: npm login.
- Build and publish: pnpm run build && npm publish (or npm publish --access public for a scoped package).
License
ISC
