@niebaopeng/ocusage
v1.0.6
ocusage
Usage analysis tool for Kimi CLI and OpenCode sessions. Inspired by ccusage.
ocusage reads your local session data, aggregates token usage, and estimates costs — all locally, with zero data leaving your machine.
Supported Data Sources
- Kimi CLI: `~/.kimi/sessions/<md5(cwd)>/<uuid>/wire.jsonl`
- OpenCode: `~/.local/share/opencode/opencode.db` (SQLite)
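The Kimi CLI path above embeds an MD5 hash of the project's working directory. A minimal Python sketch of how such a directory could be located, assuming the hash input is the absolute path string exactly as-is (the real tool may normalize the path differently):

```python
import hashlib
from pathlib import Path

def kimi_session_dir(project_path: str) -> Path:
    """Return the Kimi CLI session directory for a project.

    Assumption: the directory name is the MD5 hex digest of the
    project's absolute cwd string, per the layout described above.
    """
    digest = hashlib.md5(project_path.encode("utf-8")).hexdigest()
    return Path.home() / ".kimi" / "sessions" / digest

# Each <uuid> subdirectory under this path would hold a wire.jsonl
print(kimi_session_dir("/home/alice/myapp"))
```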
Features
- 📊 Multiple Reports — daily, monthly, session, and project-level breakdowns (day × CLI × model)
- 💰 Cost Estimation — built-in pricing for Kimi, DeepSeek, MiniMax, GLM, etc.
- 🌐 External Pricing — fetch live prices from OpenRouter API or LiteLLM local cache
- 📍 Project Mapping — automatically resolves session directory hashes to actual project paths
- 🖥️ Statusline Integration — compact one-line output for shell prompts
- 🔒 100% Local — your usage data never leaves your machine (the optional external pricing mode only downloads price lists)
- 📦 Zero Runtime Dependencies — single-file executable (~17 KB)
Installation

```bash
# Run directly without installing
npx ocusage

# Or install globally
npm install -g ocusage
```

Usage

```bash
# Daily report (default)
ocusage daily

# Monthly report
ocusage monthly

# Per-session breakdown
ocusage session

# Per-project breakdown
ocusage project

# Compact one-line summary (great for shell prompts)
ocusage statusline
```

Options
| Option | Description |
|--------|-------------|
| `--json` | Output raw JSON |
| `--since <YYYY-MM-DD>` | Filter start date |
| `--until <YYYY-MM-DD>` | Filter end date |
| `--project <name>` | Filter by project path substring |
| `--compact` | Use compact table mode |
Examples

```bash
ocusage daily --since 2026-01-01 --json
ocusage session --project myapp --compact
ocusage monthly --project ocusage
```

Configuration
Create `~/.ocusage.json` to customize the pricing source or override specific model prices:

```json
{
  "pricingProvider": "openrouter",
  "litellmPath": "~/.venv/lib/python3.12/site-packages/litellm/model_prices_and_context_window_backup.json"
}
```

Pricing Providers
| Provider | Description |
|----------|-------------|
| `"openrouter"` | Fetch live prices from `openrouter.ai/api/v1/models` (default) |
| `"litellm"` | Read from a local LiteLLM pricing JSON (auto-detected) |
| `"default"` | Use built-in hardcoded prices (zero network) |
Pricing Priority
1. `pricing` field in `~/.ocusage.json` (manual user override)
2. External provider (`openrouter` or `litellm`)
3. Built-in `DEFAULT_PRICING`
4. Special rules (e.g., MiniMax highspeed = 3× standard)
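The lookup order above can be sketched as a simple fallback chain. This is an illustrative Python sketch, not the package's actual implementation; the function and field names are assumptions:

```python
# Built-in fallback table (values here mirror the example prices in this README)
DEFAULT_PRICING = {"kimi-k2.5": {"input": 0.50, "output": 2.00}}

def resolve_price(model, user_pricing, provider_pricing):
    """Resolve a model's price following the priority order:
    user override -> external provider -> built-in defaults."""
    if model in user_pricing:          # 1. "pricing" field in ~/.ocusage.json
        return user_pricing[model]
    if model in provider_pricing:      # 2. openrouter / litellm data
        return provider_pricing[model]
    return DEFAULT_PRICING.get(model)  # 3. built-in DEFAULT_PRICING

def apply_special_rules(model, price):
    """4. Special rules, e.g. a MiniMax highspeed variant billed at 3x standard."""
    if price and "minimax" in model and "highspeed" in model:
        return {k: v * 3 for k, v in price.items()}
    return price

print(resolve_price("kimi-k2.5", {}, {}))  # {'input': 0.5, 'output': 2.0}
```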
Custom Model Prices

```json
{
  "pricing": {
    "kimi-k2.5": {
      "input": 0.50,
      "output": 2.00,
      "cacheRead": 0.10,
      "cacheCreation": 0.50
    }
  }
}
```

How It Works
Kimi CLI
- Scans `~/.kimi/sessions/<md5(cwd)>/<conversation>/wire.jsonl`
- Extracts `StatusUpdate` events containing `token_usage`
- Defaults to model `k2.5` when none is explicitly recorded
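A minimal sketch of extracting token usage from a `wire.jsonl` stream. The event shape (`type` discriminator, `input_tokens`/`output_tokens` fields inside `token_usage`) is an assumption for illustration; the real wire format may name these differently:

```python
import json

def sum_kimi_tokens(lines):
    """Sum token usage from JSONL lines, keeping only StatusUpdate
    events that carry a token_usage object (field names assumed)."""
    totals = {"input": 0, "output": 0}
    for line in lines:
        event = json.loads(line)
        usage = event.get("token_usage")
        if event.get("type") == "StatusUpdate" and usage:
            totals["input"] += usage.get("input_tokens", 0)
            totals["output"] += usage.get("output_tokens", 0)
    return totals

sample = [
    '{"type": "StatusUpdate", "token_usage": {"input_tokens": 120, "output_tokens": 40}}',
    '{"type": "Other"}',
]
print(sum_kimi_tokens(sample))  # {'input': 120, 'output': 40}
```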
OpenCode
- Reads `~/.local/share/opencode/opencode.db` via built-in `node:sqlite`
- Queries the `message` table for assistant responses with token data
- Maps sessions to projects via the `session` and `project` tables
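The message-to-project join described above can be sketched against a toy in-memory database. Note the column names below are assumptions for illustration; only the table names (`message`, `session`, `project`) come from this README, and the real `opencode.db` schema may differ:

```python
import sqlite3

# Toy stand-in for opencode.db with an assumed schema
db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE project (id TEXT PRIMARY KEY, path TEXT);
CREATE TABLE session (id TEXT PRIMARY KEY, project_id TEXT);
CREATE TABLE message (id TEXT, session_id TEXT, role TEXT,
                      tokens_input INT, tokens_output INT);
INSERT INTO project VALUES ('p1', '/home/alice/myapp');
INSERT INTO session VALUES ('s1', 'p1');
INSERT INTO message VALUES ('m1', 's1', 'assistant', 100, 30);
""")

# Aggregate assistant-message tokens per project, via session -> project
row = db.execute("""
    SELECT p.path, SUM(m.tokens_input), SUM(m.tokens_output)
    FROM message m
    JOIN session s ON s.id = m.session_id
    JOIN project p ON p.id = s.project_id
    WHERE m.role = 'assistant'
    GROUP BY p.path
""").fetchone()
print(row)  # ('/home/alice/myapp', 100, 30)
```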
License
MIT
