dawnlog v0.2.0
dawnlog

A CLI tool that generates daily developer standup reports by reading your git commits across multiple repos, asking for your plan for today, and using an LLM to fill in a markdown template.

Installation

npm install -g dawnlog

Requires Node.js 20+.

From source

git clone https://github.com/yyalim/dawnlog.git
cd dawnlog
npm install
npm run build
npm link

Quick Start

dawnlog

On first run, the setup wizard will guide you through:

  1. Adding your git repo paths
  2. Choosing an LLM provider (Ollama, Anthropic, or OpenAI)
  3. Setting your API key (if using a cloud provider)
  4. Configuring output location

Then every morning:

dawnlog
# Type your plan for today, press Ctrl+D
# Your standup report is saved to ~/dawnlogs/dawnlog-YYYY-MM-DD.md

Commands

dawnlog                            # Generate today's standup (interactive)
dawnlog --today "Review PRs"       # Skip the interactive prompt
dawnlog --provider ollama          # Override provider for this run
dawnlog --dry-run                  # Print prompts without calling the LLM or saving a file
dawnlog --since 2025-01-22         # Query commits from a specific date (e.g. after a bank holiday)
dawnlog --since 2025-01-22 --dry-run  # Combine flags freely

dawnlog config                     # Show config help
dawnlog config --show              # Print current configuration as JSON
dawnlog config --edit              # Re-run the full setup wizard
dawnlog config --set <key>=<value> # Set a single config value
dawnlog config --add-repo <path>   # Add a repo to the list
dawnlog config --remove-repo <path> # Remove a repo from the list

dawnlog provider                   # Interactively switch the default LLM provider

Configuration

Config is stored at ~/.dawnlog/config.json. You can edit it directly or use dawnlog config.

{
  "repos": ["/absolute/path/to/repo1", "/absolute/path/to/repo2"],
  "llm": {
    "provider": "ollama",
    "model": "gemma4"
  },
  "outputDir": "~/dawnlogs",
  "templatePath": "/path/to/templates/standup.md",
  "author": "[email protected]",
  "ticketBaseUrl": "https://yourco.atlassian.net/browse"
}

| Field | Description | Default |
|-------|-------------|---------|
| repos | Absolute paths to git repositories to scan | [] |
| llm.provider | ollama, anthropic, or openai | ollama |
| llm.model | Model to use (provider-specific) | Provider default |
| llm.apiKey | API key (falls back to env var) | — |
| llm.baseUrl | Custom base URL (Ollama or OpenAI-compatible) | — |
| outputDir | Directory where reports are saved | ~/dawnlogs |
| templatePath | Path to the output markdown template | Bundled templates/standup.md |
| systemPromptPath | Path to the LLM system prompt template | Bundled templates/system-prompt.md |
| author | Filter git commits by author name or email | All authors |
| ticketBaseUrl | Base URL for ticket ID linkification | Disabled |

Editing config values

# Switch provider and model
dawnlog config --set llm.provider=ollama
dawnlog config --set llm.model=gemma4

# Update API key
dawnlog config --set llm.apiKey=sk-ant-...

# Set Ollama base URL (if not running on default port)
dawnlog config --set llm.baseUrl=http://localhost:11434

# Manage repos
dawnlog config --add-repo ~/projects/my-api
dawnlog config --remove-repo ~/old-project

# Other fields
dawnlog config --set author="Jane Doe"
dawnlog config --set ticketBaseUrl=https://myco.atlassian.net/browse
dawnlog config --set outputDir=~/Documents/standups
dawnlog config --set systemPromptPath=~/my-system-prompt.md

Settable keys: llm.provider, llm.model, llm.apiKey, llm.baseUrl, outputDir, templatePath, systemPromptPath, author, ticketBaseUrl

API Key Environment Variables

API keys can be set via environment variables instead of storing them in the config file:

  • ANTHROPIC_API_KEY for Anthropic
  • OPENAI_API_KEY for OpenAI
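The fallback order described above (an explicit llm.apiKey in config.json wins, otherwise the provider's environment variable is read) can be sketched as follows. This is an illustration only; resolveApiKey and ENV_VARS are hypothetical names, not dawnlog's actual internals:

```typescript
// Maps each cloud provider to its documented environment variable.
const ENV_VARS: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
};

// An explicit key from config.json takes precedence; otherwise fall
// back to the provider-specific env var. Ollama needs no key at all.
function resolveApiKey(
  provider: string,
  configKey: string | undefined,
  env: Record<string, string | undefined> = process.env
): string | undefined {
  if (configKey) return configKey;
  const envVar = ENV_VARS[provider];
  return envVar ? env[envVar] : undefined;
}
```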

LLM Providers

Ollama (default)

Runs models locally — no API key or internet connection needed.

Setup:

# Install Ollama from https://ollama.com, then:
ollama serve
ollama pull gemma4   # default — or mistral, llama3.1:8b, qwen2.5:7b, etc.

Default model: gemma4. Default base URL: http://localhost:11434. Override with dawnlog config --set llm.baseUrl=... if needed.

Anthropic

Uses Claude models. Set ANTHROPIC_API_KEY or add it via the setup wizard. Default model: claude-haiku-4-5.

dawnlog config --set llm.provider=anthropic
dawnlog config --set llm.model=claude-haiku-4-5

OpenAI

Uses GPT models. Set OPENAI_API_KEY or add it via the setup wizard. Supports a custom baseUrl for OpenAI-compatible APIs (Groq, Together, etc.).

dawnlog config --set llm.provider=openai
dawnlog config --set llm.model=gpt-4o

Adding a Custom Provider

  1. Create src/llm/myprovider.ts implementing the LLMProvider interface
  2. Add a case "myprovider" to src/llm/index.ts
  3. Add "myprovider" to the union type in src/config.ts
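A new provider might look like the sketch below. The actual LLMProvider interface lives in the dawnlog source; the method name, its signature, and the HTTP endpoint here are assumptions for illustration, not the real API:

```typescript
// Assumed shape of the LLMProvider interface (check src/llm for the
// real definition before implementing).
interface LLMProvider {
  complete(systemPrompt: string, userMessage: string): Promise<string>;
}

// Example: a provider that calls a hypothetical OpenAI-compatible
// chat-completions endpoint at a local base URL.
class MyProvider implements LLMProvider {
  constructor(private baseUrl: string, private model: string) {}

  async complete(systemPrompt: string, userMessage: string): Promise<string> {
    const res = await fetch(`${this.baseUrl}/v1/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: this.model,
        messages: [
          { role: "system", content: systemPrompt },
          { role: "user", content: userMessage },
        ],
      }),
    });
    const data = await res.json();
    return data.choices[0].message.content;
  }
}
```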

Customizing Templates

There are two template files you can customize independently:

Output template (templates/standup.md)

Defines the structure of the generated report. Copy and edit it, then point dawnlog at it:

cp /path/to/dawnlog/templates/standup.md ~/my-standup.md
# Edit ~/my-standup.md to your liking
dawnlog config --set templatePath=~/my-standup.md

The LLM fills in the placeholders — you can add, remove, or rename sections freely.

Placeholders:

| Placeholder | Description |
|-------------|-------------|
| {{YESTERDAY_DATE}} | Last working day's date (e.g. "Thu 02 Apr 2026") |
| {{TODAY_DATE}} | Today's date |
| {{YESTERDAY_SUMMARY}} | Synthesized summary of yesterday's commits, grouped by repo |
| {{TODAY_PLAN}} | Your plan for today |
| {{BLOCKERS}} | Blockers, or "None" |
| {{WORKLOAD}} | Inferred workload: Full / Half day / Morning only / etc. |
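As one possible example (not the bundled template, whose exact layout may differ), a minimal custom template using these placeholders could look like:

```markdown
# Standup for {{TODAY_DATE}}

## Yesterday ({{YESTERDAY_DATE}})
{{YESTERDAY_SUMMARY}}

## Today
{{TODAY_PLAN}}

## Blockers
{{BLOCKERS}}

Workload: {{WORKLOAD}}
```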

System prompt (templates/system-prompt.md)

Controls the instructions sent to the LLM. Useful for tuning output style or improving results with a specific local model:

cp /path/to/dawnlog/templates/system-prompt.md ~/my-system-prompt.md
# Edit ~/my-system-prompt.md to your liking
dawnlog config --set systemPromptPath=~/my-system-prompt.md

The output template is passed to the LLM via the user message at runtime, so the system prompt is focused purely on instructions and examples.

Output

Reports are saved to ~/dawnlogs/dawnlog-YYYY-MM-DD.md. Example:

---
Yesterday (Thu 02 Apr 2026):

[my-api]
BWD-8682 — RBAC & permissions system
Backend:
- Defined 17 permissions and 4 roles with resolvePermissionsFromRoles
- Added requirePermissions middleware for payment admin routes
- Created GET /v1/internal/me/permissions endpoint

[dashboard]
BWD-9100 — Dashboard scaffold & dark mode
Frontend:
- Scaffolded Next.js app with Tailwind and shadcn/ui
- Added transactions table with pagination and date range filter
- Added dark mode toggle to dashboard settings

Today (Fri 03 Apr 2026):

Review open PRs and address feedback on BWD-8682

Blockers: None
Workload: Full
---

Weekend & bank holidays

If you're generating a report after a long weekend or bank holiday, use --since to specify the start date manually:

dawnlog --since 2025-01-20   # picks up commits from Jan 20 onwards

Development

npm run dev         # Run with tsx (no build needed)
npm run build       # Compile TypeScript → dist/
npm run lint        # Type-check only
npm test            # Run tests with Vitest
npm run test:watch  # Watch mode

Planned Features

  • [ ] dawnlog post — post output to Slack via webhook
  • [ ] dawnlog week — weekly summary across all logs in outputDir
  • [ ] npx dawnlog support (publish to npm)

License

MIT