# @kopai/app (v0.3.0)

> Local OpenTelemetry backend for testing instrumentation - no Docker, no config, just npx.
Local OpenTelemetry backend with an OTLP/HTTP (JSON) collector, storage for OTel data, and an API to query it.
## Quick Start
```shell
npx @kopai/app start
```

Starts two servers:
- API server on port 8000 (query traces, logs, metrics)
- OTEL collector on port 4318 (receives OTLP/HTTP data)
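For a quick sanity check without an SDK, you can hand-build an OTLP/JSON trace payload and POST it to the collector's standard OTLP/HTTP path (`/v1/traces`). A minimal sketch - the span and trace IDs below are made up for illustration, and the actual POST requires the collector to be running:

```javascript
// Minimal OTLP/JSON trace payload for the OTLP/HTTP endpoint
// at http://localhost:4318/v1/traces.
const payload = {
  resourceSpans: [
    {
      resource: {
        attributes: [
          { key: "service.name", value: { stringValue: "my-app" } },
        ],
      },
      scopeSpans: [
        {
          scope: { name: "manual-test" },
          spans: [
            {
              traceId: "5b8efff798038103d269b633813fc60c", // 16-byte hex, illustrative
              spanId: "eee19b7ec3c1b174",                  // 8-byte hex, illustrative
              name: "test-span",
              kind: 1, // SPAN_KIND_INTERNAL
              startTimeUnixNano: String(BigInt(Date.now()) * 1_000_000n),
              endTimeUnixNano: String((BigInt(Date.now()) + 5n) * 1_000_000n),
            },
          ],
        },
      ],
    },
  ],
};

// With the collector running, send it with:
// fetch("http://localhost:4318/v1/traces", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: JSON.stringify(payload),
// });
console.log(JSON.stringify(payload, null, 2));
```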
## Kopai App server usage

```shell
npx @kopai/app <command>
```

### Commands
| Command | Description |
| ------- | ----------------- |
| start | Start the server |
| help | Show help message |
### Options
| Option | Description |
| --------------- | ----------------- |
| -h, --help | Show help message |
| -v, --version | Show version |
## Global Install (optional)

```shell
npm install -g @kopai/app
kopai-server start
```

## Environment Variables
| Variable | Default | Description |
| --------------------- | ----------- | ---------------------------- |
| SQLITE_DB_FILE_PATH | :memory: | Path to SQLite database file |
| PORT | 8000 | API server port |
| HOST | localhost | Host to bind |
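These variables follow the usual fall-back-to-default pattern. A hypothetical sketch of how a server might resolve them - illustrative only, not the actual @kopai/app source:

```javascript
// Hypothetical config resolution mirroring the table above
// (not the actual @kopai/app implementation).
const config = {
  dbFilePath: process.env.SQLITE_DB_FILE_PATH ?? ":memory:", // in-memory by default
  port: Number(process.env.PORT ?? 8000),                    // API server port
  host: process.env.HOST ?? "localhost",                     // bind address
};

console.log(config);
```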
## Examples

```shell
# In-memory database (default): https://www.sqlite.org/inmemorydb.html
npx @kopai/app start

# Persistent database (SQLite db path): https://nodejs.org/api/sqlite.html
SQLITE_DB_FILE_PATH=./data.db npx @kopai/app start

# Custom port
PORT=3000 npx @kopai/app start
```

## Endpoints

- OTEL Collector - localhost:4318 - OTLP/HTTP endpoints
- API Server - localhost:8000 - see /documentation for available endpoints
## Sending Telemetry

Your application needs an OpenTelemetry SDK for your language.
Configure it to export OTLP/HTTP data to http://localhost:4318:

```shell
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
```

See OTLP Exporter Configuration for more details.
## Example Workflow

1. Start Kopai

   ```shell
   npx @kopai/app start
   ```

2. Run your instrumented app

   ```shell
   export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
   export OTEL_SERVICE_NAME=my-app
   node my-app.js
   ```

3. Query your telemetry data
Search traces:

```shell
curl -X POST http://localhost:8000/signals/traces/search \
  -H "Content-Type: application/json" \
  -d '{"serviceName": "my-app"}'
```

Get a specific trace:

```shell
curl http://localhost:8000/signals/traces/<traceId>
```

Search logs:

```shell
curl -X POST http://localhost:8000/signals/logs/search \
  -H "Content-Type: application/json" \
  -d '{"serviceName": "my-app"}'
```

Discover available metrics:

```shell
curl http://localhost:8000/signals/metrics/discover
```

Search metrics:

```shell
curl -X POST http://localhost:8000/signals/metrics/search \
  -H "Content-Type: application/json" \
  -d '{"metricName": "http.server.duration"}'
```

## Query telemetry data using @kopai/cli (recommended)
@kopai/cli provides a simpler interface for querying data. It's also better suited for LLM agents.
```shell
# Search traces
npx @kopai/cli traces search --service my-app

# Get a specific trace
npx @kopai/cli traces get <traceId>

# Search logs
npx @kopai/cli logs search --service my-app

# Discover metrics
npx @kopai/cli metrics discover

# Search metrics
npx @kopai/cli metrics search --type Gauge --name http.server.duration
```

See the @kopai/cli README for all available options.
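The same searches can also be made programmatically. A sketch using Node's built-in `fetch` (Node 18+), assuming the API server is on its default port; the function returns `null` rather than throwing when the server is unreachable:

```javascript
// Query the Kopai API server for traces from a given service.
// Returns the parsed JSON response, or null if the request fails
// (e.g. the server is not running).
async function searchTraces(serviceName, baseUrl = "http://localhost:8000") {
  try {
    const res = await fetch(`${baseUrl}/signals/traces/search`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ serviceName }),
    });
    if (!res.ok) return null;
    return await res.json();
  } catch {
    return null; // server not reachable
  }
}

searchTraces("my-app").then((result) => {
  console.log(result ?? "Kopai API server is not reachable on localhost:8000");
});
```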
