`@adddog/jira-github-manager-cli` (v0.0.3)

CLI tool for syncing GitHub pull requests with Jira issues, supporting both Jira v2 and v3 APIs with LLM-powered issue generation.
# Jira GitHub Manager CLI
CLI tool for managing Jira and GitHub integration with AI assistance via Gemini.
## Prerequisites

- `llm` CLI tool installed (`pip install llm` or `brew install llm`)
- Jira API token (stored in `.env` as `JIRA_API_TOKEN`)
- GitHub personal access token (stored in `.env` as `GH_SNIPPETS_TOKEN`)
- Google Gemini API key (stored in `.env` as `GEMINI_KEY`)
## Setup

### 1. Environment Variables

Create a `.env` file with:

```
JIRA_API_TOKEN=your_jira_token
GH_SNIPPETS_TOKEN=your_github_token
GEMINI_KEY=your_gemini_api_key
```

Get your keys:

- Jira: https://id.atlassian.com/manage-profile/security/api-tokens
- GitHub: https://github.com/settings/tokens
- Gemini: https://ai.google.dev/
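If you are scripting against these variables yourself, a minimal sketch of loading the `.env` file with only the Python standard library (the `python-dotenv` package would also work; `load_dotenv` here is a hypothetical helper, not part of this CLI):

```python
import os

def load_dotenv(path=".env"):
    """Parse simple KEY=value lines from a .env file.

    Returns the parsed values as a dict and also exports them into
    os.environ (without overwriting variables that are already set).
    """
    values = {}
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):  # skip blanks and comments
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
            os.environ.setdefault(key.strip(), value.strip())
    return values
```

Call it once at startup, after which `os.environ["JIRA_API_TOKEN"]` and the other keys are available.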
`config.yml`:

```yaml
jira:
  server:
    address: https://company.atlassian.net
    type: cloud
  auth:
    type: basic
    user: [email protected]
  defaults:
    project: XXXX
    board: ""
    orgId: 45a1b0f9-50a8-4def-8936-c77c86e7cfac

github:
  auth:
    type: pat
  defaults:
    owner: "username"
    repo: "repo_name"

ai:
  provider: gemini
  model: gemini-2.0-flash
  options:
    codeExecution: false
    temperature: 0.1

automation:
  createPrOnProgress: false
  autoLinkIssues: true
  branchTemplate: "{project}-{issue-number}-{summary}"
### 2. Install LLM and Gemini Plugin

```bash
# llm is already installed if you have it globally
# Install Gemini plugin
llm install llm-gemini
```

### 3. Configure Gemini

```bash
# One-time setup
./setup-gemini.sh
```

This script:

- Reads your `GEMINI_KEY` from `.env`
- Stores it in LLM's secure configuration
- Tests the connection to Gemini API
## Usage

### Test Gemini Connection

```bash
llm -m gemini-2.0-flash "Say hello"
```

### Interactive Chat with Gemini

```bash
llm chat -m gemini-2.0-flash
```

### Use Different Gemini Models

```bash
# Fastest, cheapest
llm -m gemini-1.5-flash-8b-latest "Your prompt"

# Balanced (default)
llm -m gemini-2.0-flash "Your prompt"

# Most capable
llm -m gemini-2.5-pro-preview-05-06 "Your prompt"
```

### Analyze Images

```bash
llm -m gemini-2.0-flash 'extract text' -a screenshot.png
llm -m gemini-2.0-flash 'describe' -a https://example.com/image.jpg
```

### Code Execution

```bash
llm -m gemini-2.0-flash -o code_execution 1 \
  'use python to analyze this: [data]'
```

### With Claude Code Integration

```bash
# In Claude Code, use Context7 to fetch docs while leveraging Gemini
llm -m gemini-2.0-flash 'Based on this API docs: [context], write code to...'
```

## Configuration Files

- `.env` - API credentials (not tracked in git)
- `.llmrc` - LLM configuration (model defaults)
- `setup-gemini.sh` - Setup script for Gemini
## Documentation

- Context7 Setup - Using Context7 with Claude
- Gemini Setup - Complete Gemini guide
- [llm-gemini](https://github.com/simonw/llm-gemini) - Official plugin docs
- [LLM](https://llm.datasette.io/) - Main LLM tool docs
## Models Available

| Model | Speed | Cost | Use Case |
|-------|-------|------|----------|
| gemini-1.5-flash-8b-latest | ⚡⚡⚡ | $ | Quick, lightweight tasks |
| gemini-2.0-flash | ⚡⚡ | $$ | General purpose (default) |
| gemini-2.5-pro-preview | ⚡ | $$$ | Complex reasoning |
## Troubleshooting

### "Key not found" error

```bash
# Re-run setup
./setup-gemini.sh

# Or manually verify key is set
llm keys list
```

### API errors

```bash
# Check if model is available
llm models list | grep gemini

# Test with explicit key
export LLM_GEMINI_KEY=$(grep GEMINI_KEY .env | cut -d= -f2)
llm -m gemini-2.0-flash "test"
```

### Rate limiting

Use a cheaper model:

```bash
llm -m gemini-1.5-flash-8b-latest "Your prompt"
```

## Next Steps

- Implement Jira integration functions
- Add GitHub integration
- Create CLI commands using Click or Typer
- Use Gemini for AI-powered features
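A hypothetical sketch of one such CLI command that shells out to the already-configured `llm` CLI. It uses the standard library's `argparse` for brevity, though Click or Typer would follow the same shape; the `jira-gh` program name and both function names are illustrative, not part of this package:

```python
import argparse
import subprocess

def build_llm_command(prompt, model="gemini-2.0-flash"):
    """Build the argv list for shelling out to the llm CLI."""
    return ["llm", "-m", model, prompt]

def main(argv=None):
    parser = argparse.ArgumentParser(
        prog="jira-gh", description="Ask Gemini via the llm CLI"
    )
    parser.add_argument("prompt", help="Prompt to send to the model")
    parser.add_argument("--model", default="gemini-2.0-flash")
    args = parser.parse_args(argv)
    # Delegate to llm, which already holds the Gemini key from setup-gemini.sh
    result = subprocess.run(
        build_llm_command(args.prompt, args.model),
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)

if __name__ == "__main__":
    main()
```

Keeping the argv construction in a separate function makes the command easy to unit-test without actually invoking `llm`.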
## Development Notes

- Never commit the `.env` file - keep API credentials secure
- Test changes with the `llm` CLI before building the main app
- Use Context7 for documentation-aware AI features
## License
MIT
