pm-assistant
v1.1.1
PM assistant CLI: natural language to GitHub issues or Jira tickets via OpenAI or Anthropic Claude
A small PM assistant CLI: you describe work in natural language, an LLM (OpenAI or Anthropic Claude) turns it into structured tasks, and the tool creates GitHub Issues or Jira tickets (or previews them with `--dry-run`).
Requirements
- Node.js 22 or later (install the current Active LTS — Node.js 24 as of early 2026)
- Yarn (Classic v1 is fine)
- OpenAI API key or Anthropic API key (one required)
- GitHub: personal access token with `repo` scope (only when not using `--dry-run`)
- Jira Cloud: API token from id.atlassian.com (only when not using `--dry-run`)
Setup
Clone or copy this project and install dependencies:

```sh
cd pm-assistant
yarn install
```

Build TypeScript:

```sh
yarn build
```

Configure your environment interactively:

```sh
yarn pm-assistant init
```

This asks which issue tracker (`github` or `jira`) and which LLM provider (`openai` or `anthropic`) you want, then prompts for the relevant credentials and writes a `.env` file in the current directory. If a `.env` already exists, only the values you provide are updated; other entries are preserved.

OpenAI + GitHub (default):

```sh
yarn pm-assistant init --target=github --llm-provider=openai --openai-api-key=sk-... --github-token=ghp_... --owner=acme --repo=app
```

Anthropic + Jira:

```sh
yarn pm-assistant init --target=jira --llm-provider=anthropic --anthropic-api-key=sk-ant-... --jira-host=your-domain.atlassian.net [email protected] --jira-api-token=... --jira-project-key=PROJ
```

Or provide some flags and answer prompts for the rest (hybrid mode).
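As a reference point, a `.env` produced by an OpenAI + GitHub `init` run contains entries like the following (values are placeholders; the variable names match the table in the next section):

```ini
TARGET=github
LLM_PROVIDER=openai
OPENAI_API_KEY=sk-...
GITHUB_TOKEN=ghp_...
GITHUB_OWNER=acme
GITHUB_REPO=app
```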
Environment variables
| Variable | Required | Description |
|----------|----------|-------------|
| TARGET | No (default: github) | Issue tracker: github or jira |
| LLM_PROVIDER | No (default: openai) | LLM backend: openai or anthropic |
| LLM keys | | |
| OPENAI_API_KEY | When provider=openai | OpenAI API key |
| ANTHROPIC_API_KEY | When provider=anthropic | Anthropic API key |
| GitHub | | |
| GITHUB_TOKEN | When target=github, unless --dry-run | GitHub PAT with access to the repo |
| GITHUB_OWNER | When target=github, unless --dry-run* | Default owner when --owner is not passed |
| GITHUB_REPO | When target=github, unless --dry-run* | Default repo name when --repo is not passed |
| Jira Cloud | | |
| JIRA_HOST | When target=jira, unless --dry-run | e.g. your-domain.atlassian.net |
| JIRA_EMAIL | When target=jira, unless --dry-run | Atlassian account email |
| JIRA_API_TOKEN | When target=jira, unless --dry-run | API token from id.atlassian.com |
| JIRA_PROJECT_KEY | When target=jira, unless --dry-run | e.g. PROJ |
*For each of owner and repo, you may pass --owner=<name> / --repo=<name> on the command line instead of (or to override) the matching env var. Both values must be resolved before creating issues: if either is missing from flags and env, the CLI exits with a clear error.
With --dry-run, only your LLM provider's API key is required.
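The owner/repo resolution rule in the footnote above can be sketched in a few lines. This is an illustrative helper (assumed name and shape, not the actual CLI source): flags win over environment variables, and a missing value is a hard error before any issue is created.

```typescript
// Sketch of the flag-over-env resolution described above (assumed helper,
// not the real CLI code): --owner/--repo win over GITHUB_OWNER/GITHUB_REPO.
function resolveRepo(
  flags: { owner?: string; repo?: string },
  env: Record<string, string | undefined>,
): { owner: string; repo: string } {
  const owner = flags.owner ?? env.GITHUB_OWNER;
  const repo = flags.repo ?? env.GITHUB_REPO;
  if (!owner || !repo) {
    // Mirrors the documented behavior: exit with a clear error when either
    // value cannot be resolved from flags or environment.
    throw new Error(
      "Missing GitHub target: pass --owner/--repo or set GITHUB_OWNER/GITHUB_REPO",
    );
  }
  return { owner, repo };
}
```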
How to run
After `yarn build`, use the `pm-assistant` script:

```sh
yarn pm-assistant "create onboarding system with auth and dashboard"
```

Target selection

By default the CLI uses the `TARGET` value from `.env` (default: `github`). Override at runtime with `--target`:

```sh
yarn pm-assistant --target=jira "create onboarding system with auth and dashboard"
```

GitHub-specific flags

Use a specific repository (overrides `GITHUB_OWNER` / `GITHUB_REPO` for this run):

```sh
yarn pm-assistant --owner=my-org --repo=my-repo "create onboarding"
```

You can mix flags and env: e.g. set `GITHUB_OWNER` in `.env` and pass only `--repo=other-repo`.

LLM provider selection

By default the CLI uses `LLM_PROVIDER` from `.env` (default: `openai`). Override at runtime:

```sh
yarn pm-assistant --llm-provider=anthropic "create onboarding system"
```

Model selection: the alias maps depend on the provider:
| Alias | OpenAI (default) | Anthropic |
|-------|------------------|-----------|
| --model=smart | gpt-4o | Claude Sonnet |
| --model=fast | gpt-4o-mini | Claude Haiku |
You can also pass full model IDs directly (e.g. --model=gpt-4o-mini or --model=claude-sonnet-4-20250514).
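The alias lookup above amounts to a small per-provider map with a pass-through for full model IDs. A sketch (structure assumed, not the real CLI code; the Anthropic `fast` ID is omitted because the table only names it as Claude Haiku):

```typescript
// Illustrative alias resolution (assumed shape, not the actual source).
// Unknown values pass through unchanged so full model IDs keep working.
const MODEL_ALIASES: Record<string, Record<string, string>> = {
  openai: { smart: "gpt-4o", fast: "gpt-4o-mini" },
  // Only the Sonnet ID appears in this README; the Haiku ID is left out.
  anthropic: { smart: "claude-sonnet-4-20250514" },
};

function resolveModel(provider: string, value: string): string {
  return MODEL_ALIASES[provider]?.[value] ?? value;
}
```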
Preview generated tasks without creating issues:
```sh
yarn pm-assistant --dry-run "add password reset flow and email templates"
```

With `--dry-run`, if both owner and repo are available (from flags or env), the log includes the target repository for context.
Global CLI (pm-assistant)
After building, link the package so the `pm-assistant` binary is on your PATH:

```sh
yarn build
yarn link
pm-assistant init
pm-assistant "your feature description here"
pm-assistant --owner=acme --repo=product "your feature description here"
pm-assistant --dry-run "your feature description here"
```

Alternatively, use `npm link` from the project root.
Development (no build)
`yarn dev` runs the CLI with tsx (TypeScript execute), so you do not need `yarn build` first. This matches Node's ESM rules, including `.js` extensions in imports that resolve to `.ts` sources.

```sh
yarn dev -- --dry-run "describe the work"
```

Pass `--` so Yarn forwards flags and the prompt to the script.
What it does
- Reads your request from the CLI.
- Calls the configured LLM (OpenAI or Anthropic, JSON mode) to produce a `tasks` array: `title`, `description` (Markdown: Context, Goal, Scope, Technical Notes, Acceptance Criteria; optional Out of Scope only when useful), and `labels`.
- Validates structure, then applies quality rules (single primary label, no FE/BE mix in scope, concrete acceptance criteria, sane scope/size).
- Creates one GitHub issue or Jira ticket per task, or logs them when `--dry-run` is set.
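A `tasks` payload with the fields listed above might look like this (contents invented for illustration; only the field names and description sections come from this README):

```json
{
  "tasks": [
    {
      "title": "Add password reset flow",
      "description": "## Context\n...\n\n## Goal\n...\n\n## Scope\n...\n\n## Technical Notes\n...\n\n## Acceptance Criteria\n- ...",
      "labels": ["backend"]
    }
  ]
}
```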
Labels
Each issue gets exactly one primary label (never frontend and backend together):
- `frontend` – UI, components, pages, styling, client state
- `backend` – APIs, server-side logic, persistence, data stores
- `infra` – setup, configuration, CI/CD
- `tech-debt` – refactoring or improvements
Ensure these labels exist in your GitHub repository (or Jira project), or the API may reject unknown labels (depending on repo/project settings).
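The "exactly one primary label" rule can be checked in a few lines. This is an illustrative sketch (assumed helper name), not the project's actual validator:

```typescript
// Illustrative check for the "exactly one primary label" rule above.
const PRIMARY_LABELS = ["frontend", "backend", "infra", "tech-debt"];

function hasSinglePrimaryLabel(labels: string[]): boolean {
  const primaries = labels.filter((l) => PRIMARY_LABELS.includes(l));
  return primaries.length === 1;
}
```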
Project layout
- `src/cli.ts` – entrypoint, subcommand routing (`init` / run), env checks, target + provider dispatch
- `src/envFile.ts` – `.env` file read/merge/write helpers for `init`
- `src/llm/` – LLM abstraction layer
  - `types.ts` – `LlmProvider`, `ChatMessage`, `LlmJsonClient` interfaces
  - `openaiAdapter.ts` – OpenAI SDK adapter (JSON object mode)
  - `anthropicAdapter.ts` – Anthropic SDK adapter (Messages API)
  - `factory.ts` – `createLlmJsonClient(provider, apiKey)` factory
  - `index.ts` – barrel re-exports
- `src/generateTasks.ts` – task generation prompt, validation, repair pipeline
- `src/generateQuestions.ts` – clarifying questions via LLM
- `src/createIssues.ts` – GitHub issue creation / dry-run logging
- `src/createJiraIssues.ts` – Jira ticket creation / dry-run logging
- `src/github.ts` – Octokit client factory
- `src/jira.ts` – Jira Cloud REST API client (native fetch + Basic auth)
- `src/types.ts` – shared types and validation
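Based on the file and symbol names above, the `src/llm/` abstraction plausibly looks like the following sketch. The method name and signatures are assumptions; only the type and factory names appear in this README, and the adapters are stubbed rather than calling the real SDKs:

```typescript
// Sketch of the src/llm layer inferred from the project layout (assumed
// shapes; not the real source).
type LlmProvider = "openai" | "anthropic";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// A client that always returns parsed JSON, whichever backend is active.
interface LlmJsonClient {
  completeJson(messages: ChatMessage[], model: string): Promise<unknown>;
}

// factory.ts would pick the SDK adapter for the provider; stubbed here.
function createLlmJsonClient(provider: LlmProvider, apiKey: string): LlmJsonClient {
  if (!apiKey) {
    throw new Error(`Missing API key for provider "${provider}"`);
  }
  return {
    async completeJson() {
      throw new Error("stub: the real adapters call the OpenAI/Anthropic SDKs");
    },
  };
}
```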
