gyst-ai
v0.1.0
Get your sh*t together — AI-powered developer toolkit for your terminal
What is gyst?
gyst is an all-in-one AI developer CLI that lives in your terminal. It translates natural language to git commands, reviews your code with auto-fix, roasts bad code, explains cryptic errors, generates standups from git history, writes READMEs, and explains codebases — all powered by the AI provider of your choice.
# Translate English to git
$ gyst git "undo my last commit but keep the changes"
🔧 git reset --soft HEAD~1
# AI code review with one-click fixes
$ gyst review src/ --fix
🔴 SQL Injection in auth.ts:42 → auto-fixed ✓
# Get roasted
$ gyst roast index.js --severity brutal
💀 Using var in 2026? This code is a time capsule.
# Explain any error
$ npm run build 2>&1 | gyst wtf
💡 Your PostCSS config references a plugin that isn't installed...
# Generate standup from git history
$ gyst standup --format slack
📋 *What I did:* Implemented auth flow, fixed 3 bugs...
Installation
# npm
npm install -g gyst-ai
# pnpm
pnpm add -g gyst-ai
# yarn
yarn global add gyst-ai
Requirements: Node.js >= 20
Quick Start
# 1. Set your API key (pick any provider)
export ANTHROPIC_API_KEY=sk-ant-...
# or
gyst config set key anthropic sk-ant-...
# 2. Start using it
gyst git "show me what changed this week"
gyst review src/
gyst roast app.js
gyst wtf "ECONNREFUSED 127.0.0.1:5432"
gyst standup
gyst readme --dry-run
gyst explain src/core/
Commands
gyst git — Natural Language Git
Translates plain English instructions into the exact git command you need. Understands your repo context (current branch, status, remotes, stash) and warns you before running anything destructive.
$ gyst git "squash the last 3 commits"
┌─ 🔧 Git Command ───────────────────────┐
│ git rebase -i HEAD~3 │
│ Interactively squash the last 3 commits │
└─────────────────────────────────────────┘
Run this command? (Y/n):
Features:
- Reads your repo state (branch, status, log, remotes, stash count)
- Warns about destructive commands (requires typing `yes` instead of `Y`)
- Shows alternative commands when applicable
- Strips markdown fences from AI response for reliable parsing
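The fence-stripping step can be sketched as follows. This is a hypothetical helper, not gyst's actual implementation — it just shows why the step matters: AI replies often wrap the command in a markdown code fence, which would break naive parsing.

```typescript
// Hypothetical sketch: strip a markdown code fence from an AI reply so
// "```bash\ngit reset --soft HEAD~1\n```" parses as a plain command.
function stripFences(reply: string): string {
  return reply
    .replace(/^\s*```[a-zA-Z]*\s*\n?/, "") // opening fence, optional language tag
    .replace(/\n?```\s*$/, "")             // closing fence
    .trim();
}
```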
gyst review — AI Code Review with Auto-Fix
Professional code review that finds real bugs, security issues, and anti-patterns — then provides exact code replacements you can auto-apply.
# Review a file
$ gyst review src/api/auth.ts
# Review an entire directory
$ gyst review src/
# Auto-apply all fixes
$ gyst review src/ --fix
# Only show critical and warning severity
$ gyst review src/ --severity warning
Features:
- 4 severity levels: `critical`, `warning`, `suggestion`, `nitpick`
- Exact `currentCode` → `suggestedCode` replacements for auto-fix
- Fixes applied in reverse line order to preserve positions
- Confirms before writing files, suggests `git diff` after
- `--json` output for CI integration
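The reverse-line-order detail matters because applying an earlier fix can shift the line numbers of every fix below it. A minimal sketch of the idea, using a hypothetical `Fix` shape rather than gyst's actual types:

```typescript
// Hypothetical fix record; gyst's real types may differ.
interface Fix {
  line: number;         // 1-based line where currentCode sits
  currentCode: string;
  suggestedCode: string;
}

// Apply fixes from the bottom of the file up, so the line numbers of
// earlier fixes stay valid while later lines are being rewritten.
function applyFixes(source: string, fixes: Fix[]): string {
  const lines = source.split("\n");
  const bottomUp = [...fixes].sort((a, b) => b.line - a.line);
  for (const fix of bottomUp) {
    const i = fix.line - 1;
    if (lines[i] === fix.currentCode) {
      lines[i] = fix.suggestedCode; // only replace when the code still matches
    }
  }
  return lines.join("\n");
}
```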
gyst roast — Code Roasting
Get a brutally honest (and funny) review of your code. Three severity levels from gentle encouragement to full Gordon Ramsay.
$ gyst roast src/utils.ts --severity brutal
💀 Roast mode: BRUTAL
🔥 Code Roast: utils.ts
Line 1: `var express = require('express')` — Using var in 2026?
This code is a time capsule. Someone call the archaeology department.
📊 Overall: 3/10
🎤 Final Verdict: This code doesn't just have bugs — it IS the bug.
Severity levels:
- `gentle` — Constructive with light humor
- `medium` — Comedy roast meets code review (default)
- `brutal` — Full Gordon Ramsay, no mercy
gyst wtf — Error Explanation
Explain any error message and get copy-paste-ready fix commands. Supports piping directly from your build tools.
# Pass an error directly
$ gyst wtf "ENOENT: no such file or directory, open './config.json'"
# Pipe from any command
$ npm run build 2>&1 | gyst wtf
$ cargo build 2>&1 | gyst wtf
$ python app.py 2>&1 | gyst wtf
Features:
- Detects your project type for framework-specific advice
- Streaming output — see the explanation as it generates
- Supports stdin piping (`!process.stdin.isTTY` detection)
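The `!process.stdin.isTTY` detection can be sketched like this — a hypothetical helper showing how a CLI distinguishes a piped error from an interactive terminal; gyst's actual implementation may differ:

```typescript
// Sketch: take the error from an argument if given, otherwise read piped
// stdin. When stdin is a TTY (interactive terminal), nothing was piped.
async function readErrorInput(arg?: string): Promise<string> {
  if (arg) return arg;
  if (process.stdin.isTTY) {
    throw new Error("No error provided: pass one as an argument or pipe it in.");
  }
  const chunks: Buffer[] = [];
  for await (const chunk of process.stdin) {
    chunks.push(Buffer.from(chunk)); // collect the piped stream
  }
  return Buffer.concat(chunks).toString("utf8");
}
```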
gyst standup — Standup Generation
Generate standup updates from your git history. Supports multiple output formats.
# Default: beautiful terminal output
$ gyst standup
# Slack-formatted for pasting
$ gyst standup --format slack
# JSON for automation
$ gyst standup --format json
# Look back 3 days
$ gyst standup --days 3
# Custom time range
$ gyst standup --since "last monday"
Output includes:
- What you did (grouped by logical work items)
- What's next (inferred from patterns)
- Blockers (if any)
- Stats: commits, files changed, insertions, deletions
gyst readme — README Generation
Analyzes your entire codebase and generates a comprehensive README.
# Generate (saves as README.generated.md if README.md exists)
$ gyst readme
# Preview without writing
$ gyst readme --dry-run
# Different styles
$ gyst readme --style minimal # Just the essentials
$ gyst readme --style detailed # Full documentation (default)
$ gyst readme --style startup # Pitch-style, sells the project
# Custom output path
$ gyst readme --output DOCS.md
gyst explain — Code Explanation
Explain what code does in plain English. Works with single files or entire directories.
# Explain a file
$ gyst explain src/core/ai.ts
# Explain a directory (reads up to 8 files)
$ gyst explain src/core/
Covers:
- Purpose and problem solved
- Key logic and data flow
- Design patterns recognized
- Dependencies and gotchas
gyst config — Configuration
Manage API keys, default models, and preferences.
# Show current config
$ gyst config show
# Set API keys
$ gyst config set key anthropic sk-ant-api03-...
$ gyst config set key openai sk-...
$ gyst config set key google AIza...
$ gyst config set key groq gsk-...
# Set default model
$ gyst config set model gpt-4o
$ gyst config set model claude-opus
# Set preferences
$ gyst config set preference roastSeverity brutal
$ gyst config set preference standupDays 3
# Reset everything
$ gyst config reset
Providers
gyst supports 6 AI providers out of the box. Set your preferred provider's API key and go.
| Provider | Models | Env Variable | Best For |
|----------|--------|-------------|----------|
| Anthropic | claude-sonnet, claude-haiku, claude-opus | ANTHROPIC_API_KEY | Best overall quality |
| OpenAI | gpt-4o, gpt-4o-mini, gpt-4.1, gpt-4.1-mini | OPENAI_API_KEY | Fastest responses |
| Google | gemini-pro, gemini-flash | GOOGLE_GENERATIVE_AI_API_KEY | Free tier available |
| Groq | llama3 (Llama 3.1 70B) | GROQ_API_KEY | Fastest open model |
| DeepSeek | deepseek | DEEPSEEK_API_KEY | Budget-friendly |
| Ollama | local:<model-name> | — | Fully offline / private |
Using a Specific Provider
# Use a specific model
$ gyst git "status" --model gpt-4o
# Use local Ollama
$ gyst git "status" --local
# Override provider
$ gyst git "status" --provider openai
Provider Key Resolution
Keys are resolved in order:
- Environment variable (e.g., `ANTHROPIC_API_KEY`)
- Config file (`~/.gyst/config.json` via `gyst config set key`)
- Helpful error message pointing to `gyst config set key`
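That lookup order can be sketched as below. The function and table names are hypothetical (the env-var names are taken from the Providers table above); gyst's actual resolution code may differ:

```typescript
// Assumed env-var names per provider, mirroring the Providers table.
const ENV_VARS: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
  groq: "GROQ_API_KEY",
  deepseek: "DEEPSEEK_API_KEY",
};

// Hypothetical sketch of the resolution order described above.
function resolveKey(
  provider: string,
  env: Record<string, string | undefined>,
  configKeys: Record<string, string>,
): string {
  const fromEnv = env[ENV_VARS[provider] ?? ""];
  if (fromEnv) return fromEnv;        // 1. environment variable wins
  const fromConfig = configKeys[provider];
  if (fromConfig) return fromConfig;  // 2. "keys" entry in ~/.gyst/config.json
  throw new Error(                    // 3. helpful pointer to the fix
    `No API key for ${provider}. Run: gyst config set key ${provider} <key>`,
  );
}
```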
Configuration
Config is stored at ~/.gyst/config.json. You can edit it directly or use gyst config set.
{
"defaultModel": "claude-sonnet",
"defaultProvider": "anthropic",
"keys": {
"anthropic": "sk-ant-..."
},
"ollamaBaseUrl": "http://localhost:11434",
"preferences": {
"roastSeverity": "medium",
"standupDays": 1,
"gitAutoExecute": false
}
}
Global Flags
These flags work with every command:
| Flag | Description |
|------|-------------|
| -m, --model <model> | AI model to use (e.g., claude-sonnet, gpt-4o, local:llama3.2) |
| -p, --provider <provider> | Override provider (anthropic, openai, google, groq, ollama, deepseek) |
| --local | Use local Ollama model |
| --temperature <n> | Override temperature (0-1) |
| --max-tokens <n> | Override max output tokens |
| --json | Output as JSON (where supported) |
| --no-color | Disable colored output |
| -v, --verbose | Verbose output |
Project Architecture
src/
├── cli/
│ ├── index.ts # Commander.js setup, all commands
│ └── commands/ # One file per command
│ ├── git.ts # Natural language → git
│ ├── review.ts # Code review + auto-fix
│ ├── roast.ts # Code roasting
│ ├── wtf.ts # Error explanation
│ ├── standup.ts # Standup generation
│ ├── readme.ts # README generation
│ ├── explain.ts # Code explanation
│ └── config.ts # Config management
├── core/
│ ├── types.ts # All TypeScript interfaces
│ ├── ai.ts # Unified AI client (Vercel AI SDK)
│ ├── models.ts # Model registry + aliases
│ ├── context.ts # Project context detection
│ └── git-context.ts # Git repo state gathering
├── prompts/ # System prompts for each command
├── ui/
│ ├── theme.ts # Chalk color theme
│ ├── spinner.ts # Ora spinner wrapper
│ └── box.ts # Boxen box wrapper
├── utils/
│ ├── exec.ts # Shell execution
│ ├── detect-project.ts # Framework detection
│ └── file-tree.ts # Directory tree generator
└── storage/
    └── config.ts # ~/.gyst/config.json management
Tech Stack
- Runtime: Node.js >= 20, TypeScript (strict mode)
- Build: tsup (ESM output, Node 20 target)
- CLI Framework: Commander.js
- Terminal UI: chalk 5 + boxen 8 + ora 8
- AI: Vercel AI SDK (`ai` + provider packages)
- Testing: Vitest with v8 coverage (182 tests, 96%+ coverage)
- Linting: Biome
Development
# Clone
git clone https://github.com/stanleycyang/gyst.git
cd gyst
# Install dependencies
pnpm install
# Build
pnpm build
# Run locally
node dist/cli/index.js --help
# Development (watch mode)
pnpm dev
# Run tests
pnpm test
# Run tests with coverage
pnpm vitest run --coverage
# Lint
pnpm lint
# Type check
pnpm typecheck
Landing Page
The landing page lives in website/ and is a Next.js 16 app with Tailwind CSS.
cd website
pnpm install
pnpm dev # http://localhost:3000
pnpm build   # Production build
Testing
182 tests across 26 test suites with 96%+ statement coverage.
Test Files 26 passed (26)
Tests 182 passed (182)
% Coverage report from v8
All files | 96.07% Stmts | 86.37% Branch | 100% Funcs | 96.07% Lines
Tests cover:
- All 7 system prompts
- All 8 CLI commands (with mocked AI calls)
- Core modules: model resolution, AI client, project context, git context
- Utils: shell exec, project detection, file tree generation
- Storage: config read/write/update, API key management
- UI: theme, spinner, box rendering
- Error handling and edge cases
# Run all tests
pnpm test
# Run with coverage report
pnpm vitest run --coverage
# Run specific test file
pnpm vitest run tests/core/models.test.ts
FAQ
Q: Which AI provider should I use?
A: Claude Sonnet (default) offers the best balance of quality and speed. GPT-4o is faster. Gemini has a generous free tier. Llama via Groq is the fastest open model. For privacy, use Ollama locally.
Q: Does it send my code to the cloud?
A: Yes, unless you use --local with Ollama. Your code is sent to whichever AI provider you configure. Use Ollama for fully offline, private operation.
Q: Can I use it in CI/CD?
A: Yes. Use --json for machine-readable output and set your API key via environment variable. Example: gyst review src/ --severity warning --json
Q: How does --fix work?
A: The AI returns exact currentCode → suggestedCode pairs. gyst does a String.replace(currentCode, suggestedCode) on your source files. It always confirms before writing and suggests git diff after.
License
MIT
