quackscore
v0.1.9
Analyze GitHub activity and generate a Duck RPG-style developer card
🦆 quackscore
"Every PR merged is a feather in your cap. Or your tail. Ducks aren't picky."
quackscore turns your GitHub pull request history into a Duck RPG-style developer card. It fetches your merged PRs, scores them, classifies them with an LLM, and generates a slick local HTML report — because your contributions deserve more than a green square.
```
npx quackscore create <your-github-username>
```
That's it. Quack.
What it does
- Fetches your merged pull requests from GitHub
- Uses an LLM to classify each PR (complexity, area, type, technologies)
- Awards points and levels based on your contribution history
- Generates a local HTML developer card you can be proud of
- Stores history locally so incremental updates are fast
- Separates PR analysis from profile summary regeneration for fine-grained control
Requirements
- Node.js 18+
- `QUACKSCORE_GH_TOKEN`: a GitHub personal access token. See GitHub fine-grained token setup.
- `QUACKSCORE_LLM_API_KEY`: API key for your chosen LLM provider. See LLM provider setup. Not required for Ollama or unauthenticated LiteLLM proxies.
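Before the first run, it can help to confirm both variables are actually exported. A quick shell check (this snippet is just a convenience sketch, not part of quackscore itself):

```shell
# Verify the two environment variables quackscore reads are set
for v in QUACKSCORE_GH_TOKEN QUACKSCORE_LLM_API_KEY; do
  if [ -n "$(printenv "$v")" ]; then
    echo "ok: $v"
  else
    echo "missing: $v"
  fi
done
```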
Install
```
npm install -g quackscore
```
Or run without installing:
```
npx quackscore --help
```
Supported LLM providers
| Provider | Value | Notes |
|---|---|---|
| Anthropic | anthropic | |
| OpenAI | openai | |
| Google | google | |
| OpenRouter | openrouter | |
| MiniMax | minimax | |
| Ollama (local) | ollama | No API key needed |
| LiteLLM proxy | litellm | Routes to Azure OpenAI, Bedrock, and more |
All remote providers use QUACKSCORE_LLM_API_KEY. Ollama needs no key. LiteLLM's key is optional (only needed if your proxy requires auth).
Need setup help? See LLM provider setup.
Quick start
1. Set environment variables
Need a GitHub token first? Follow GitHub fine-grained token setup.
```
export QUACKSCORE_GH_TOKEN=<your-github-token>
export QUACKSCORE_LLM_API_KEY=<your-llm-api-key>
```
2. Pick your LLM provider
Need help choosing or configuring a provider? See LLM provider setup.
```
quackscore init --provider anthropic --model claude-haiku-4-5-20251001
```
Config is saved to `~/.quackscore/config.json`.
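The config file's schema isn't documented here; as a rough sketch, after an init it would plausibly hold the provider and model you passed. The field names below are assumptions for illustration, not the actual schema:

```json
{
  "provider": "anthropic",
  "model": "claude-haiku-4-5-20251001"
}
```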
3. Generate your card
```
quackscore create <github-username>
```
This always rebuilds from scratch: it fetches all merged PRs, analyzes each one, generates your profile summary, and produces a fresh card. Any existing local profile is overwritten.
Filter by org or repo:
```
quackscore create <github-username> --organisation <org> --repository <repo>
```
4. Update later (only new PRs)
```
quackscore update <github-username>
```
Fetches only PRs merged since the last run, analyzes them, and updates your stats, level, and charts. Your existing profile summary is left untouched; use `update-summary` when you want to regenerate it.
5. Refresh your profile summary
```
quackscore update-summary <github-username>
```
Regenerates the RPG title and written summary by sending your full PR history to the LLM. It does not re-fetch or re-analyze any PRs. Useful after a batch of `update` runs, or whenever you want a fresh take on your narrative.
6. Re-open a saved card
```
quackscore show <github-username>
```
7. Leaderboard
```
quackscore leaderboard
```
8. Try it without credentials
```
quackscore mock-report
```
Generates a demo card from 20 hardcoded PRs. No GitHub or LLM key needed.
Other provider examples
```
# OpenAI
quackscore init --provider openai --model gpt-4o-mini

# Google
quackscore init --provider google --model gemini-2.0-flash

# OpenRouter
quackscore init --provider openrouter --model anthropic/claude-haiku-4

# MiniMax
quackscore init --provider minimax --model MiniMax-M2.7

# Ollama (local)
quackscore init --provider ollama --model llama3.2

# LiteLLM proxy: Azure OpenAI
quackscore init --provider litellm --model azure/gpt-4o-mini --base-url http://localhost:4000

# LiteLLM proxy: AWS Bedrock
quackscore init --provider litellm --model bedrock/claude-3-5-sonnet --base-url http://localhost:4000
```
LiteLLM lets you run a single proxy that routes to Azure OpenAI, AWS Bedrock, Hugging Face, and many other backends. Start it with:
```
pip install litellm
litellm --model azure/gpt-4o-mini   # or whichever backend you want
```
Then set your key if the proxy requires auth:
```
export QUACKSCORE_LLM_API_KEY=<your-litellm-master-key>
```
Data storage
Everything lives in `~/.quackscore/`:
| File | Contents |
|---|---|
| config.json | Provider and model settings |
| <username>.json | Analyzed PR data |
| <username>.html | Your generated developer card |
| leaderboard.json | Local leaderboard cache |
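As a concrete sketch of that layout, the snippet below recreates it in a throwaway temp directory ("octoduck" is a placeholder username, not a real profile), so it is safe to run anywhere:

```shell
# Rebuild the ~/.quackscore/ layout from the table above in a temp dir
dir="$(mktemp -d)/.quackscore"
mkdir -p "$dir"
touch "$dir/config.json" "$dir/octoduck.json" "$dir/octoduck.html" "$dir/leaderboard.json"
ls "$dir" | sort
```

On a real install you would simply list `~/.quackscore/` itself; the temp dir here just avoids touching your home directory.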
Notes
- The GitHub token is required regardless of LLM setup.
- API keys are read from environment variables only — never written to disk.
- `create` always rebuilds from scratch; `update` only fetches and analyzes new PRs without touching the existing summary; `update-summary` regenerates the summary without touching PRs.
- Add `--diagnostics` to any command for detailed internal tracing.
