# clippy-jaeger (v1.0.0)
CLI that translates natural language into Jaeger trace queries using AI (OpenRouter).
Stop memorizing Jaeger API parameters. Just describe what you want in plain English (or Portuguese) and get a ready-to-use query URL.
## Quick Start
```bash
# Install dependencies
bun install

# Set your OpenRouter API key (free at https://openrouter.ai/keys)
export OPENROUTER_API_KEY="sk-or-v1-..."

# Run a query
bun run src/cli.ts "show errors in payment-service in the last hour"
```

## Installation
Requires Bun v1.0+.
```bash
git clone <repo-url>
cd clippy-jaeger
bun install
```

### Global install (optional)
```bash
bun link

# Now available everywhere:
clippy-jaeger "slow requests in api-gateway"
```

## Configuration
| Variable | Flag | Default | Description |
|---|---|---|---|
| `OPENROUTER_API_KEY` | `--api-key` | (required) | Your OpenRouter API key |
| `JAEGER_URL` | `--jaeger-url` | `http://localhost:16686` | Jaeger instance base URL |
| `OPENROUTER_MODEL` | `--model` | `openrouter/free` | OpenRouter model ID |
You can set these as environment variables or pass them as CLI flags. Flags take priority.
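That precedence (flag over environment variable over built-in default) can be sketched as a tiny resolver. `resolveSetting` and its arguments are illustrative names, not part of the clippy-jaeger API:

```typescript
// Resolve one setting: an explicit CLI flag wins, then the environment
// variable, then the built-in default. (Illustrative sketch only.)
function resolveSetting(
  flagValue: string | undefined,
  envValue: string | undefined,
  defaultValue: string,
): string {
  return flagValue ?? envValue ?? defaultValue;
}

// e.g. resolving the Jaeger base URL:
const jaegerUrl = resolveSetting(
  undefined,                // no --jaeger-url flag passed
  process.env.JAEGER_URL,   // possibly set via the environment or .env
  "http://localhost:16686", // documented default
);
```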
### .env file example
Create a .env file in the project root:
```
OPENROUTER_API_KEY=sk-or-v1-your-key-here
JAEGER_URL=http://jaeger.internal:16686
OPENROUTER_MODEL=openrouter/free
```

## Usage
```
clippy-jaeger [options] <query>

Arguments:
  query                Natural language query describing the traces you want

Options:
  -V, --version        output the version number
  --api-key <key>      OpenRouter API key
  --jaeger-url <url>   Jaeger base URL
  --model <model>      OpenRouter model ID
  --api <version>      Jaeger API version: "legacy" or "v3" (default: "legacy")
  --execute            Execute the query and print results as JSON
  --open               Print the Jaeger UI URL to open in browser
  --curl               Print a ready-to-use curl command
  --json               Output only the raw JaegerQuery JSON
  --verbose            Show LLM translation details
  -h, --help           display help for command
```

## Examples
### Basic queries
```bash
# Find errors in a service
bun run src/cli.ts "show errors in payment-service in the last hour"

# Find slow requests
bun run src/cli.ts "slow requests over 2 seconds in api-gateway last 30 minutes"

# Filter by HTTP status code
bun run src/cli.ts "500 errors in user-service on /api/users endpoint"

# Get a specific number of traces
bun run src/cli.ts "last 50 traces from order-service"

# Duration range
bun run src/cli.ts "traces from auth-service between 500ms and 3s in the last 2 hours"
```

### Output formats
```bash
# Get just the raw JSON query (useful for piping)
bun run src/cli.ts "errors in checkout" --json
# Output: {"service":"checkout","tags":{"error":"true"},"lookback":"1h"}

# Get a curl command
bun run src/cli.ts "errors in checkout" --curl
# Output: curl -s 'http://localhost:16686/api/traces?service=checkout&...' | jq .

# Get the Jaeger UI URL
bun run src/cli.ts "errors in checkout" --open

# Use v3 API format
bun run src/cli.ts "errors in checkout" --api v3

# Execute the query directly (requires Jaeger running)
bun run src/cli.ts "errors in checkout" --execute
```

### Combining flags
```bash
# Full output with UI URL, curl, and execution
bun run src/cli.ts "slow requests in payment-service" --open --curl --execute

# Pipe JSON output to another tool
bun run src/cli.ts "errors in auth" --json | jq '.service'
```

## Sample Output
```
📋 Interpreted Query:
  Service:  payment-service
  Tags:     {"error":"true"}
  Limit:    20
  Lookback: 1h

🔗 API URL:
  http://localhost:16686/api/traces?service=payment-service&limit=20&tags=%7B%22error%22%3A%22true%22%7D&start=1738875600000000&end=1738879200000000&lookback=1h
```

## How It Works
1. You type a natural-language query.
2. The CLI sends it to OpenRouter (a free AI model) with a specialized system prompt.
3. The LLM returns structured JSON containing Jaeger query parameters.
4. clippy-jaeger builds the correct API URL for your Jaeger instance.
5. Optionally, it executes the query or prints a ready-to-use curl command.
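The steps above can be sketched end to end. `translate` below is a canned stand-in for the OpenRouter round trip, and `buildUrl` illustrates the URL-builder step; both names (and the exact query type) are illustrative, not the tool's real internals:

```typescript
// Structured query shape the LLM is prompted to return
// (mirrors the parameter table below; illustrative, not the exact type).
interface JaegerQuery {
  service: string;
  operation?: string;
  tags?: Record<string, string>;
  limit?: number;
  lookback?: string;
}

// Steps 2-3: stand-in for the OpenRouter call. The real tool sends the
// natural-language query plus a system prompt and parses the JSON reply.
function translate(naturalLanguage: string): JaegerQuery {
  // Canned result for "errors in checkout":
  return { service: "checkout", tags: { error: "true" }, lookback: "1h" };
}

// Step 4: build a legacy /api/traces URL from the structured query.
function buildUrl(baseUrl: string, q: JaegerQuery): string {
  const params = new URLSearchParams({ service: q.service });
  if (q.operation) params.set("operation", q.operation);
  if (q.tags) params.set("tags", JSON.stringify(q.tags));
  if (q.limit !== undefined) params.set("limit", String(q.limit));
  if (q.lookback) params.set("lookback", q.lookback);
  return `${baseUrl}/api/traces?${params.toString()}`;
}

const url = buildUrl("http://localhost:16686", translate("errors in checkout"));
```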
## Supported Jaeger Query Parameters
| Parameter | Description | Example |
|---|---|---|
| `service` | Service name (required) | `payment-service` |
| `operation` | Operation/endpoint name | `/api/checkout` |
| `tags` | Key-value tag filters | `{"error":"true","http.status_code":"500"}` |
| `minDuration` | Minimum span duration | `500ms`, `1s`, `2m` |
| `maxDuration` | Maximum span duration | `5s`, `10s` |
| `limit` | Max traces to return | `20`, `50`, `100` |
| `lookback` | Time window from now | `15m`, `1h`, `6h`, `1d`, `2d` |
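The `tags` value travels as percent-encoded JSON in the query string, which is why the sample URL earlier contains `tags=%7B%22error%22%3A%22true%22%7D`. A minimal sketch of that encoding:

```typescript
// Tag filters are a JSON object, serialized and percent-encoded
// into the query string.
const tags: Record<string, string> = { error: "true" };
const encoded = encodeURIComponent(JSON.stringify(tags));
// encoded === "%7B%22error%22%3A%22true%22%7D"
```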
## Jaeger API Versions

- **Legacy** (`--api legacy`, default): uses `/api/traces`, the internal API used by the Jaeger UI
- **V3** (`--api v3`): uses `/api/v3/traces`, the newer OTLP-based API
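As far as the endpoint goes, the version switch only changes the path prefix. A sketch (the function name is illustrative, not clippy-jaeger's API):

```typescript
// Pick the traces endpoint for the chosen Jaeger API version:
//   "legacy" -> /api/traces     (the Jaeger UI's internal API)
//   "v3"     -> /api/v3/traces  (the newer OTLP-based API)
function tracesEndpoint(baseUrl: string, api: "legacy" | "v3"): string {
  return api === "v3" ? `${baseUrl}/api/v3/traces` : `${baseUrl}/api/traces`;
}
```

Note that the two APIs also differ in parameter naming and response shape, so a real implementation branches on more than the path.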
## Development
```bash
# Run tests
bun test

# Typecheck
bun run typecheck

# Run in watch mode
bun run dev "your query here"
```

## Project Structure
```
clippy-jaeger/
├── src/
│   ├── cli.ts          # CLI entry point (commander)
│   ├── config.ts       # Configuration management
│   ├── llm.ts          # OpenRouter integration
│   ├── jaeger.ts       # Jaeger URL builder & query execution
│   ├── prompt.ts       # LLM system prompt
│   ├── config.test.ts  # Config tests
│   └── jaeger.test.ts  # Jaeger URL builder tests
├── index.ts            # Programmatic API exports
├── package.json
├── tsconfig.json
└── README.md
```

## Programmatic Usage
You can also use clippy-jaeger as a library:
```typescript
import { loadConfig, translateToQuery, buildLegacyUrl } from "clippy-jaeger";

const config = loadConfig({ openrouterApiKey: "sk-or-v1-..." });
const query = await translateToQuery("errors in payment last hour", config);
const url = buildLegacyUrl("http://localhost:16686", query);

console.log(url);
```

## Free Models
The default model is `openrouter/free`, which automatically routes to whatever free models are available. You can also pin a particular free model:
```bash
# Use a specific free model
bun run src/cli.ts "errors in payment" --model "nvidia/nemotron-3-nano-30b-a3b:free"
```

## License
MIT
