skill-distill
v1.0.6
Distill AI agent conversations into reusable Skills
# Skill Distill 🪄
Skill Distill is a CLI tool that transforms your AI agent conversation histories into reusable, high-quality "Skills". By analyzing successful interactions, it extracts the essence of complex tasks and generates structured documentation and configurations that can be instantly reused by Claude Code, Codex CLI, and Cursor.
## Why Skill Distill?
AI agents are excellent at solving complex problems, but the solutions often remain trapped within single chat sessions. When you encounter a similar problem later, you frequently have to start from scratch.
Skill Distill closes this loop by:
- Preserving Expertise: Captures the "how-to" from successful agent runs.
- Ensuring Consistency: Standardizes multi-step processes across your projects.
- Multi-Platform Portability: Converts skills between different AI agent formats effortlessly.
## Features
- 🔄 Multi-Platform Support: Seamlessly work with Claude Code, Codex CLI, and Cursor.
- 🧠 Smart Extraction: Uses Claude API to analyze sessions and extract parameters, steps, and requirements.
- 🛠️ Automated Installation: Directly install generated skills into your local agent's project directory.
- 💬 Interactive Mode: Easily select sessions and provide additional context for the distillation process.
- 📂 Multiple Formats: Export skills to `claude`, `codex`, or `cursor` formats.
- 🛡️ Validation: Built-in validation ensures generated skills meet platform requirements.
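To make the extraction idea concrete, here is a heavily simplified, hypothetical sketch of the distill flow: a transcript goes in, the LLM analysis step (which the real tool delegates to the Claude API) is stubbed out with a trivial heuristic, and a skill document comes out. The type and function names here are illustrative and do not come from skill-distill's actual source.

```typescript
// Hypothetical sketch of the distill flow; the real tool uses the Claude API
// for the analysis step, which is stubbed with a trivial heuristic here.
interface SessionMessage {
  role: "user" | "assistant";
  content: string;
}

interface Skill {
  name: string;
  steps: string[];
}

// Stand-in for the LLM call that extracts parameters, steps, and requirements.
function analyzeSession(name: string, messages: SessionMessage[]): Skill {
  const steps = messages
    .filter((m) => m.role === "assistant")
    .map((m) => m.content);
  return { name, steps };
}

// Render the extracted skill as a Claude-style markdown document.
function renderSkillMarkdown(skill: Skill): string {
  return [`# ${skill.name}`, ...skill.steps.map((s) => `- ${s}`)].join("\n");
}
```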
## Installation
Install Skill Distill globally via npm:
```bash
npm install -g skill-distill
```
Note: Requires Node.js >= 18.0.0
## Quick Start
Set your Anthropic API Key (required for LLM analysis):
```bash
export ANTHROPIC_API_KEY=sk-ant-api03-xxxxx
```
List your recent agent sessions:
```bash
skill-distill list
```
Distill your latest session:
```bash
skill-distill distill --last
```
Or initialize configuration for persistent settings:
```bash
skill-distill init
```
## CLI Commands Reference
### `skill-distill distill [session]`
The main command to extract a Skill from a session history.
| Option | Shorthand | Description |
| :--- | :--- | :--- |
| [session] | - | Optional session ID to distill |
| --last | -l | Use the most recent session from the default platform |
| --session <id> | -s | Specify a specific session ID |
| --prompt <text> | -p | Add custom user instructions (can be used multiple times) |
| --format <type> | -f | Output format: claude, codex, cursor, or all (Default: claude) |
| --output <dir> | -o | Target directory for generated files (Default: ~/.skills) |
| --install | - | Automatically install the skill to the local agent's path |
| --interactive | -i | Enable interactive session selection and prompt collection |
| --verbose | -v | Show detailed processing logs |
### `skill-distill list`
List available sessions from your AI agent history.
| Option | Shorthand | Description |
| :--- | :--- | :--- |
| --limit <n> | -n | Number of sessions to display (Default: 20) |
| --platform <t> | - | Filter sessions by platform: claude, codex, cursor |
### `skill-distill init`
Interactive setup to configure your environment, including:
- Default AI Agent platform
- Claude API Key (for LLM-powered distillation)
- Default output directory
- Auto-installation preferences
## Configuration
### Environment Variable (Recommended)
Set your Anthropic API key as an environment variable:
```bash
export ANTHROPIC_API_KEY=sk-ant-api03-xxxxx
```
Add this to your ~/.bashrc or ~/.zshrc for persistence.
### Config File
Alternatively, run skill-distill init to create ~/.skill-distill/config.json:
```json
{
  "defaultPlatform": "claude",
  "apiKey": "sk-ant-...",
  "outputDir": "~/.skills",
  "autoInstall": true
}
```
- apiKey: Your Anthropic API key (can also use the ANTHROPIC_API_KEY env var)
- defaultPlatform: The agent platform you use most frequently
- outputDir: Where generated skill files are saved (default: ~/.skills)
## Session Sources
Skill Distill automatically reads Claude sessions from:
- ~/.claude/transcripts/ - OpenCode / Claude Code transcript files
- ~/.claude/projects/*/ - Claude Code project-specific sessions
## Examples
### Distill with extra context
If you want the distilled skill to focus specifically on certain aspects (e.g., error handling), you can provide extra prompts:
```bash
skill-distill distill --last -p "Ensure we include detailed error handling for API failures"
```
### Batch format export
Export a session as a skill compatible with all supported platforms:
```bash
skill-distill distill <session-id> --format all --output ./my-team-skills
```
### Interactive selection
Don't remember the session ID? Use interactive mode:
```bash
skill-distill distill --interactive
```
## Output Formats
| Format | Extension | Target |
| :--- | :--- | :--- |
| Claude | .md | Claude Code Skills (Custom Instructions) |
| Codex | .yaml | Codex CLI Commands/Workflows |
| Cursor | .cursorrules | Cursor project-specific rules |
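The table above maps directly to a filename rule. A minimal sketch: the format-to-extension mapping is taken from the table, while `SkillFormat` and `outputFilename` are illustrative names, not the tool's API.

```typescript
// Extension per output format, taken from the table above.
type SkillFormat = "claude" | "codex" | "cursor";

const EXTENSIONS: Record<SkillFormat, string> = {
  claude: ".md",          // Claude Code Skills (custom instructions)
  codex: ".yaml",         // Codex CLI commands/workflows
  cursor: ".cursorrules", // Cursor project-specific rules
};

// Illustrative helper: the filename a skill would get for a given format.
function outputFilename(skillName: string, format: SkillFormat): string {
  return `${skillName}${EXTENSIONS[format]}`;
}
```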
## Development
We use pnpm for package management and tsup for building.
```bash
# Install dependencies
pnpm install

# Build the project
pnpm build

# Run in development mode
pnpm dev

# Run tests
pnpm test
```
## Contributing
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
## License
Distributed under the MIT License. See LICENSE for more information.
