# dawnlog

v0.2.0

A CLI tool that generates daily developer standup reports by reading your git commits across multiple repos, asking for your plan for today, and using an LLM to fill in a markdown template.
## Installation

```sh
npm install -g dawnlog
```

Requires Node.js 20+.

### From source

```sh
git clone https://github.com/yyalim/dawnlog.git
cd dawnlog
npm install
npm run build
npm link
```

## Quick Start

```sh
dawnlog
```

On first run, the setup wizard will guide you through:
- Adding your git repo paths
- Choosing an LLM provider (Ollama, Anthropic, or OpenAI)
- Setting your API key (if using a cloud provider)
- Configuring output location
Then every morning:
```sh
dawnlog
# Type your plan for today, press Ctrl+D
# Your standup report is saved to ~/dawnlogs/dawnlog-YYYY-MM-DD.md
```

## Commands
```sh
dawnlog                               # Generate today's standup (interactive)
dawnlog --today "Review PRs"          # Skip the interactive prompt
dawnlog --provider ollama             # Override provider for this run
dawnlog --dry-run                     # Print prompts without calling the LLM or saving a file
dawnlog --since 2025-01-22            # Query commits from a specific date (e.g. after a bank holiday)
dawnlog --since 2025-01-22 --dry-run  # Combine flags freely

dawnlog config                        # Show config help
dawnlog config --show                 # Print current configuration as JSON
dawnlog config --edit                 # Re-run the full setup wizard
dawnlog config --set <key>=<value>    # Set a single config value
dawnlog config --add-repo <path>      # Add a repo to the list
dawnlog config --remove-repo <path>   # Remove a repo from the list

dawnlog provider                      # Interactively switch the default LLM provider
```

## Configuration
Config is stored at `~/.dawnlog/config.json`. You can edit it directly or use `dawnlog config`.

```json
{
  "repos": ["/absolute/path/to/repo1", "/absolute/path/to/repo2"],
  "llm": {
    "provider": "ollama",
    "model": "gemma4"
  },
  "outputDir": "~/dawnlogs",
  "templatePath": "/path/to/templates/standup.md",
  "author": "[email protected]",
  "ticketBaseUrl": "https://yourco.atlassian.net/browse"
}
```

| Field | Description | Default |
|-------|-------------|---------|
| `repos` | Absolute paths to git repositories to scan | `[]` |
| `llm.provider` | `ollama`, `anthropic`, or `openai` | `ollama` |
| `llm.model` | Model to use (provider-specific) | Provider default |
| `llm.apiKey` | API key (falls back to env var) | — |
| `llm.baseUrl` | Custom base URL (Ollama or OpenAI-compatible) | — |
| `outputDir` | Directory where reports are saved | `~/dawnlogs` |
| `templatePath` | Path to the output markdown template | Bundled `templates/standup.md` |
| `systemPromptPath` | Path to the LLM system prompt template | Bundled `templates/system-prompt.md` |
| `author` | Filter git commits by author name or email | All authors |
| `ticketBaseUrl` | Base URL for ticket ID linkification | Disabled |
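To illustrate what `ticketBaseUrl` does: a hypothetical linkifier (a sketch, not dawnlog's actual implementation) would rewrite ticket IDs such as BWD-8682 in the report into markdown links under that base URL:

```typescript
// Hypothetical sketch of ticket linkification, not dawnlog's actual code:
// given a base URL, rewrite ticket IDs like "BWD-8682" as markdown links.
function linkifyTickets(text: string, ticketBaseUrl: string): string {
  return text.replace(
    /\b([A-Z][A-Z0-9]+-\d+)\b/g,
    (_match, id) => `[${id}](${ticketBaseUrl}/${id})`
  );
}
```

With `ticketBaseUrl` unset, this step is skipped and IDs stay as plain text.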
### Editing config values
```sh
# Switch provider and model
dawnlog config --set llm.provider=ollama
dawnlog config --set llm.model=gemma4

# Update API key
dawnlog config --set llm.apiKey=sk-ant-...

# Set Ollama base URL (if not running on default port)
dawnlog config --set llm.baseUrl=http://localhost:11434

# Manage repos
dawnlog config --add-repo ~/projects/my-api
dawnlog config --remove-repo ~/old-project

# Other fields
dawnlog config --set author="Jane Doe"
dawnlog config --set ticketBaseUrl=https://myco.atlassian.net/browse
dawnlog config --set outputDir=~/Documents/standups
dawnlog config --set systemPromptPath=~/my-system-prompt.md
```

Settable keys: `llm.provider`, `llm.model`, `llm.apiKey`, `llm.baseUrl`, `outputDir`, `templatePath`, `systemPromptPath`, `author`, `ticketBaseUrl`
## API Key Environment Variables

API keys can be set via environment variables instead of storing them in the config file:

- `ANTHROPIC_API_KEY` for Anthropic
- `OPENAI_API_KEY` for OpenAI
## LLM Providers

### Ollama (default)

Runs models locally — no API key or internet connection needed.

Setup:

```sh
# Install Ollama from https://ollama.com, then:
ollama serve
ollama pull gemma4   # default — or mistral, llama3.1:8b, qwen2.5:7b, etc.
```

Default model: `gemma4`. Default base URL: `http://localhost:11434`. Override with `--set llm.baseUrl=...` if needed.
### Anthropic

Uses Claude models. Set `ANTHROPIC_API_KEY` or add it via the setup wizard.
Default model: `claude-haiku-4-5`.

```sh
dawnlog config --set llm.provider=anthropic
dawnlog config --set llm.model=claude-haiku-4-5
```

### OpenAI
Uses GPT models. Set `OPENAI_API_KEY` or add it via the setup wizard.
Supports a custom `baseUrl` for OpenAI-compatible APIs (Groq, Together, etc.).

```sh
dawnlog config --set llm.provider=openai
dawnlog config --set llm.model=gpt-4o
```

### Adding a Custom Provider
- Create `src/llm/myprovider.ts` implementing the `LLMProvider` interface
- Add a `case "myprovider"` to `src/llm/index.ts`
- Add `"myprovider"` to the union type in `src/config.ts`
## Customizing Templates
There are two template files you can customize independently:
### Output template (`templates/standup.md`)
Defines the structure of the generated report. Copy and edit it, then point dawnlog at it:
```sh
cp /path/to/dawnlog/templates/standup.md ~/my-standup.md
# Edit ~/my-standup.md to your liking
dawnlog config --set templatePath=~/my-standup.md
```

The LLM fills in the placeholders — you can add, remove, or rename sections freely.
Placeholders:
| Placeholder | Description |
|-------------|-------------|
| {{YESTERDAY_DATE}} | Last working day's date (e.g. "Thu 02 Apr 2026") |
| {{TODAY_DATE}} | Today's date |
| {{YESTERDAY_SUMMARY}} | Synthesized summary of yesterday's commits, grouped by repo |
| {{TODAY_PLAN}} | Your plan for today |
| {{BLOCKERS}} | Blockers, or "None" |
| {{WORKLOAD}} | Inferred workload: Full / Half day / Morning only / etc. |
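As an illustration, a pared-down custom template (not the bundled `templates/standup.md`) could use only the placeholders you care about:

```markdown
Yesterday ({{YESTERDAY_DATE}}):
{{YESTERDAY_SUMMARY}}

Today ({{TODAY_DATE}}):
{{TODAY_PLAN}}

Blockers: {{BLOCKERS}}
Workload: {{WORKLOAD}}
```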
### System prompt (`templates/system-prompt.md`)
Controls the instructions sent to the LLM. Useful for tuning output style or improving results with a specific local model:
```sh
cp /path/to/dawnlog/templates/system-prompt.md ~/my-system-prompt.md
# Edit ~/my-system-prompt.md to your liking
dawnlog config --set systemPromptPath=~/my-system-prompt.md
```

The output template is passed to the LLM via the user message at runtime, so the system prompt is focused purely on instructions and examples.
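For instance, a custom system prompt (illustrative only, not the bundled `templates/system-prompt.md`) could push a small local model toward terser output:

```markdown
You write developer standup reports. Fill in the template given in the user
message. Keep each bullet under 15 words, group work by repository, and do
not invent work that is not present in the commit log.
```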
## Output

Reports are saved to `~/dawnlogs/dawnlog-YYYY-MM-DD.md`. Example:
```
Yesterday (Thu 02 Apr 2026):

[my-api]
BWD-8682 — RBAC & permissions system
Backend:
- Defined 17 permissions and 4 roles with resolvePermissionsFromRoles
- Added requirePermissions middleware for payment admin routes
- Created GET /v1/internal/me/permissions endpoint

[dashboard]
BWD-9100 — Dashboard scaffold & dark mode
Frontend:
- Scaffolded Next.js app with Tailwind and shadcn/ui
- Added transactions table with pagination and date range filter
- Added dark mode toggle to dashboard settings

Today (Fri 03 Apr 2026):
Review open PRs and address feedback on BWD-8682

Blockers: None
Workload: Full
```

### Weekend & bank holidays

If you're generating a report after a long weekend or bank holiday, use `--since` to specify the start date manually:

```sh
dawnlog --since 2025-01-20   # picks up commits from Jan 20 onwards
```

## Development
```sh
npm run dev          # Run with tsx (no build needed)
npm run build        # Compile TypeScript → dist/
npm run lint         # Type-check only
npm test             # Run tests with Vitest
npm run test:watch   # Watch mode
```

## Planned Features
- [ ] `dawnlog post` — post output to Slack via webhook
- [ ] `dawnlog week` — weekly summary across all logs in `outputDir`
- [ ] `npx dawnlog` support (publish to npm)
## License
MIT
