@crewx/cli
v0.8.0-rc.66
SowonAI CrewX CLI
Bring Your Own AI (BYOA) team in Slack/IDE with your existing subscriptions. Transform Claude, Gemini, Codex, and Copilot into a collaborative development team.
Overview
SowonAI CrewX CLI is the full-featured command-line interface for SowonAI CrewX, providing:
- CLI Mode: Direct terminal usage with crewx query and crewx execute
- Slack Mode: Team collaboration with AI agents in Slack channels
- MCP Server Mode: IDE integration (VS Code, Claude Desktop, Cursor)
- Remote Agents: Distributed AI teams across projects
- Plugin System: Transform any CLI tool into an AI agent
Installation
npm install -g crewx

Quick Start
# Initialize
crewx init
# Check system
crewx doctor
# Try it out
crewx query "@claude analyze my code"
crewx execute "@claude create a login component"

Commands
query
Read-only analysis and information retrieval:
crewx query "@claude explain this function"
crewx query "@gemini search for latest news"
crewx query "@claude @gemini compare approaches"

Options:
- -t, --thread <id>: Continue conversation in thread
- -m, --model <model>: Override agent's default model
- --timeout <ms>: Set timeout in milliseconds
- --log: Enable debug logging
execute
Create or modify files and run operations:
crewx execute "@claude implement user authentication"
crewx execute "@gemini optimize performance"

Options:
- -t, --thread <id>: Continue conversation in thread
- -m, --model <model>: Override agent's default model
- --timeout <ms>: Set timeout in milliseconds
- --log: Enable debug logging
chat
Interactive conversation mode:
crewx chat
crewx chat --thread "my-session"

Options:
- -t, --thread <id>: Thread ID for conversation
- --agent <id>: Default agent to use
- --log: Enable debug logging
agent
Manage agents:
# List all agents (default behavior)
crewx agent
# List all agents (explicit)
crewx agent ls
crewx agent list

init
Initialize CrewX configuration:
crewx init
crewx init --template development

Creates crewx.yaml with default agents.
doctor
Check system configuration:
crewx doctor

Verifies:
- AI CLI tools installation
- Configuration file
- Agent availability
- API keys setup
log
View task execution logs:
# List all task logs (default behavior)
crewx log
# List all task logs (explicit)
crewx log ls
# View specific task log
crewx log task_1234567890_abcdef

Task logs include:
- Execution status and duration
- Provider and agent information
- Full command output
- Error messages (if any)
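The fields above can be modeled roughly as follows. The TaskLog shape and the summarize helper are illustrative assumptions based only on this list; the actual log format used by crewx log may differ.

```typescript
// Hypothetical record shape for one task log, based only on the fields
// listed above. Not CrewX's actual on-disk format.
interface TaskLog {
  id: string;                  // e.g. "task_1234567890_abcdef"
  status: "success" | "error"; // execution status
  durationMs: number;          // execution duration
  provider: string;            // e.g. "cli/claude"
  agent: string;               // agent id
  output: string;              // full command output
  error?: string;              // present only on failure
}

// Illustrative one-line summary, similar to what a log listing might show.
function summarize(log: TaskLog): string {
  return `${log.id} [${log.status}] ${log.durationMs}ms via ${log.provider}`;
}
```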
mcp
Start MCP server for IDE integration:
crewx mcp
crewx mcp --log

slack
Start Slack bot:
# Read-only mode
crewx slack
# Allow file modifications
crewx slack --mode execute
# With debug logging
crewx slack --log

See Slack Setup Guide for configuration.
templates
Manage knowledge templates:
crewx templates list
crewx templates info <template-name>

Configuration
crewx.yaml
Create crewx.yaml in your project root:
agents:
  - id: "frontend_dev"
    name: "React Expert"
    provider: "cli/claude"
    working_directory: "./src"
    inline:
      type: "agent"
      system_prompt: |
        You are a senior React developer.
        Provide detailed examples and best practices.
  - id: "backend_api"
    name: "API Specialist"
    provider: "cli/gemini"
    inline:
      type: "agent"
      system_prompt: |
        You are an expert in REST API design.

Provider Configuration
Built-in providers:
# Claude Code
provider: "cli/claude"
# Gemini CLI
provider: "cli/gemini"
# GitHub Copilot CLI
provider: "cli/copilot"
# Codex CLI
provider: "cli/codex"

Plugin providers:
providers:
  - id: "ollama"
    type: "plugin"
    cli_command: "ollama"
    default_model: "llama3"
    query_args: ["run", "{model}"]
    prompt_in_args: false

agents:
  - id: "local_llama"
    provider: "plugin/ollama"

Remote Agents
Connect to other CrewX instances:
providers:
  - id: "backend_server"
    type: "remote"
    location: "http://api.example.com:3000"
    external_agent_id: "backend_team"

agents:
  - id: "remote_backend"
    provider: "remote/backend_server"

Usage Examples
Basic Queries
# Single agent
crewx query "@claude what is this code doing?"
# Multiple agents
crewx query "@claude @gemini compare these approaches"
# With model override
crewx query "@claude:opus analyze in detail"

Execution
# Create files
crewx execute "@claude create a React component"
# Modify code
crewx execute "@gemini optimize this function"
# Multiple tasks
crewx execute "@claude create tests" "@gemini write docs"

Pipeline Workflows
# Design then implement
crewx query "@architect design API" | \
crewx execute "@backend implement it"
# Multi-stage processing
cat requirements.txt | \
crewx query "@analyst prioritize" | \
crewx execute "@dev implement top 3"

Thread-based Conversations
# Start conversation
crewx query "@claude design login" --thread "auth-feature"
# Continue conversation
crewx execute "@claude add password reset" --thread "auth-feature"
# Review conversation
crewx chat --thread "auth-feature"

Parallel Execution
# Multiple agents simultaneously
crewx execute \
"@frontend create UI" \
"@backend create API" \
"@devops setup CI"

Environment Variables
Slack Configuration
| Variable | Default | Description |
|----------|---------|-------------|
| SLACK_BOT_TOKEN | - | Slack Bot Token (required for Slack integration) |
| SLACK_SIGNING_SECRET | - | Slack Signing Secret (required for Slack integration) |
| SLACK_APP_TOKEN | - | Slack App Token (required for Socket Mode) |
| SLACK_MAX_RESPONSE_LENGTH | 400000 | Maximum total response length (characters) |
| SLACK_MAX_BLOCK_SIZE | 2900 | Maximum characters per block (max: 3000) |
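As a sketch of how the two size limits above might interact, the snippet below caps a response at SLACK_MAX_RESPONSE_LENGTH and then chunks it into blocks of at most SLACK_MAX_BLOCK_SIZE characters. The function name and the simple fixed-width chunking strategy are assumptions for illustration, not CrewX's actual formatter.

```typescript
// Illustrative sketch: cap a long response at SLACK_MAX_RESPONSE_LENGTH,
// then split it into chunks no larger than SLACK_MAX_BLOCK_SIZE.
// Names and strategy are assumptions, not CrewX's implementation.
const MAX_RESPONSE_LENGTH = Number(process.env.SLACK_MAX_RESPONSE_LENGTH ?? 400000);
const MAX_BLOCK_SIZE = Number(process.env.SLACK_MAX_BLOCK_SIZE ?? 2900);

function splitIntoBlocks(text: string): string[] {
  const capped = text.slice(0, MAX_RESPONSE_LENGTH);
  const blocks: string[] = [];
  for (let i = 0; i < capped.length; i += MAX_BLOCK_SIZE) {
    blocks.push(capped.slice(i, i + MAX_BLOCK_SIZE));
  }
  return blocks;
}
```

Note that SLACK_MAX_BLOCK_SIZE defaults to 2900, slightly under Slack's hard 3000-character block limit, which leaves headroom for any decoration added per block.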
CrewX Configuration
| Variable | Default | Description |
|----------|---------|-------------|
| CREWX_DEBUG | - | Enable debug logging |
| CREWX_CONFIG | - | Custom path to crewx.yaml configuration file |
| CREWX_PROVIDER | - | Default AI provider |
| CREWX_SLACK_LOG_CONVERSATIONS | false | Enable Slack conversation logging |
| CREWX_ENABLE_REMOTE_TEMPLATES | false | Enable remote template repository |
| CREWX_TEMPLATE_REPO | - | Remote template repository URL |
Provider Timeout Configuration (milliseconds)
| Variable | Default | Description |
|----------|---------|-------------|
| CREWCODE_TIMEOUT_CLAUDE_EXECUTE | - | Claude execute mode timeout |
| CREWCODE_TIMEOUT_CLAUDE_QUERY | - | Claude query mode timeout |
| CREWCODE_TIMEOUT_GEMINI_EXECUTE | - | Gemini execute mode timeout |
| CREWCODE_TIMEOUT_GEMINI_QUERY | - | Gemini query mode timeout |
| CREWCODE_TIMEOUT_COPILOT_EXECUTE | - | Copilot execute mode timeout |
| CREWCODE_TIMEOUT_COPILOT_QUERY | - | Copilot query mode timeout |
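A per-provider timeout lookup against these variables might look like the sketch below. The environment variable names come from the table above; the helper itself and the fallback default are illustrative assumptions.

```typescript
// Sketch of resolving a per-provider, per-mode timeout from the
// environment. Variable names match the table above; the helper and
// its fallback default are illustrative, not CrewX's actual code.
function resolveTimeout(
  provider: "CLAUDE" | "GEMINI" | "COPILOT",
  mode: "QUERY" | "EXECUTE",
  fallbackMs = 120000, // assumed default, not documented by CrewX
): number {
  const raw = process.env[`CREWCODE_TIMEOUT_${provider}_${mode}`];
  const parsed = raw !== undefined ? Number(raw) : NaN;
  // Ignore unset, non-numeric, or non-positive values.
  return Number.isFinite(parsed) && parsed > 0 ? parsed : fallbackMs;
}
```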
Other Configuration
| Variable | Default | Description |
|----------|---------|-------------|
| PORT | 3000 | MCP server port |
| DEBUG | - | Debug namespace (e.g., crewx:*) |
Usage Examples
Slack Bot with Custom Limits:
export SLACK_BOT_TOKEN=xoxb-your-token
export SLACK_SIGNING_SECRET=your-secret
export SLACK_APP_TOKEN=xapp-your-app-token
export SLACK_MAX_RESPONSE_LENGTH=500000
export SLACK_MAX_BLOCK_SIZE=2800
crewx slack --log

Custom Configuration Path:
export CREWX_CONFIG=/path/to/custom/crewx.yaml
crewx query "@claude analyze this"

Provider Timeout Configuration:
# Set longer timeout for complex execute tasks
export CREWCODE_TIMEOUT_CLAUDE_EXECUTE=600000 # 10 minutes
export CREWCODE_TIMEOUT_CLAUDE_QUERY=180000 # 3 minutes
crewx execute "@claude complex task"

Debug Logging:
# Enable all CrewX debug logs
export DEBUG=crewx:*
export CREWX_DEBUG=true
crewx query "@claude test"

Using .env File:
Create a .env.slack file:
SLACK_BOT_TOKEN=xoxb-your-token
SLACK_SIGNING_SECRET=your-secret
SLACK_APP_TOKEN=xapp-your-app-token
SLACK_MAX_RESPONSE_LENGTH=400000
SLACK_MAX_BLOCK_SIZE=2900
CREWX_SLACK_LOG_CONVERSATIONS=true

Then load it:
source .env.slack
npm run start:slack

Architecture
The CLI is built on top of @crewx/sdk:
packages/cli/
├── src/
│ ├── cli/ # Command handlers
│ │ ├── query.handler.ts
│ │ ├── execute.handler.ts
│ │ ├── chat.handler.ts
│ │ └── ...
│ ├── providers/ # CLI-specific provider utilities
│ │ ├── dynamic-provider.factory.ts # Security wrapper over SDK dynamic providers
│ │ └── logger.adapter.ts # Nest logger adapter for SDK providers
│ ├── services/ # Business logic
│ │ ├── ai.service.ts
│ │ ├── ai-provider.service.ts
│ │ ├── remote-agent.service.ts # Uses SDK RemoteAgentManager
│ │ └── ...
│ ├── slack/ # Slack integration
│ │ ├── slack-bot.ts
│ │ └── formatters/
│ │ └── message.formatter.ts # Extends SDK BaseMessageFormatter
│ ├── conversation/ # Conversation management
│ ├── guards/ # Security
│ └── utils/ # Utilities
└── tests/ # Tests

SDK Integration
The CLI uses SDK components as a foundation, adding NestJS integration and platform-specific features:
Message Formatting
- SDK: BaseMessageFormatter provides core formatting logic
- CLI: SlackMessageFormatter extends the SDK base with Slack-specific features (emoji, blocks, markdown)
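The base/extension split can be sketched as below. The class bodies are illustrative stand-ins for the pattern only; the real BaseMessageFormatter API in @crewx/sdk is not shown here and may differ.

```typescript
// Minimal sketch of the extension pattern: shared core logic in a base
// class, platform specifics in a subclass. These method bodies are
// illustrative, not the actual @crewx/sdk implementation.
abstract class BaseMessageFormatter {
  // Shared core: normalize line endings and trim whitespace.
  format(text: string): string {
    return text.replace(/\r\n/g, "\n").trim();
  }
}

class SlackMessageFormatter extends BaseMessageFormatter {
  // Platform layer: convert markdown **bold** to Slack's *bold* syntax
  // on top of the base normalization.
  format(text: string): string {
    return super.format(text).replace(/\*\*(.+?)\*\*/g, "*$1*");
  }
}
```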
AI Providers
- SDK: BaseAIProvider, ClaudeProvider, GeminiProvider, CopilotProvider, CodexProvider
- CLI: NestJS @Injectable() wrappers that:
  - Inject the NestJS logger via LoggerAdapter
  - Integrate with AIService and ConfigService
  - Add platform-specific tool execution
Remote Agents
- SDK: RemoteAgentManager handles remote communication
- CLI: RemoteAgentService wraps the SDK manager with:
  - NestJS dependency injection
  - Agent configuration loading
  - Tool name mapping
  - MCP protocol integration
Benefits
- Reusable Core: SDK components work in any Node.js environment
- Clean Separation: Platform logic (NestJS, Slack) stays in CLI
- Testability: SDK tests verify core logic, CLI tests verify integration
- Extensibility: Custom integrations can use SDK directly
Development
Setup
# Install dependencies
npm install
# Build
npm run build
# Run CLI locally
npm run start

Testing
# Run tests
npm test
# Watch mode
npm run test:watch
# With coverage
npm run test:coverage

Debugging
# Debug mode
npm run debug
# With environment variables
dotenv -e .env.test -- npm run dev

API Integration
The CLI can be used as a library:
import { AIService, ConfigService } from 'crewx';
// Use in your Node.js application
const aiService = new AIService(/* ... */);
const result = await aiService.queryAI('prompt', 'cli/claude');

Plugins
Creating a Plugin Provider
Add to crewx.yaml:
providers:
  - id: "my_tool"
    type: "plugin"
    cli_command: "my-cli"
    default_model: "default"
    query_args: ["query"]
    execute_args: ["execute"]
    prompt_in_args: true
    stdin: true

agents:
  - id: "my_agent"
    provider: "plugin/my_tool"

Plugin Options
- cli_command: Command to execute
- default_model: Default model name
- query_args: Arguments for query mode
- execute_args: Arguments for execute mode
- prompt_in_args: Pass prompt as argument
- stdin: Pass prompt via stdin
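The options above could combine into a process invocation roughly as sketched below. The PluginConfig interface mirrors the YAML keys; buildInvocation is a hypothetical helper for illustration, not a CrewX API.

```typescript
// Illustrative sketch of turning a plugin provider's options into a
// command invocation. The config shape mirrors the YAML keys above;
// buildInvocation is hypothetical, not CrewX's actual code.
interface PluginConfig {
  cli_command: string;
  default_model: string;
  query_args: string[];
  execute_args?: string[];
  prompt_in_args: boolean;
  stdin: boolean;
}

function buildInvocation(
  cfg: PluginConfig,
  mode: "query" | "execute",
  prompt: string,
  model = cfg.default_model,
): { command: string; args: string[]; stdin?: string } {
  const template = mode === "execute" && cfg.execute_args ? cfg.execute_args : cfg.query_args;
  // Substitute the {model} placeholder, as in query_args: ["run", "{model}"].
  const args = template.map((a) => a.replace("{model}", model));
  if (cfg.prompt_in_args) args.push(prompt);
  return { command: cfg.cli_command, args, stdin: cfg.stdin ? prompt : undefined };
}
```

With the earlier ollama example (prompt_in_args: false, stdin implied), the prompt would be delivered on stdin while the argv stays ["run", "llama3"].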
Slack Integration
Full Slack integration with:
- Thread-based conversations
- Agent mentions (@claude, @gemini, etc.)
- Team collaboration
- Read-only or execute mode
See Slack Setup Guide for details.
MCP Server
CrewX can run as an MCP server for IDE integration:
crewx mcp

Add to your IDE's MCP configuration:
{
  "mcpServers": {
    "crewx": {
      "command": "crewx",
      "args": ["mcp"]
    }
  }
}

Troubleshooting
Common Issues
Command not found
npm install -g crewx

Configuration file not found
crewx init

Provider not available
crewx doctor

See Troubleshooting Guide for more.
Performance
The CLI is optimized for:
- Parallel agent execution
- Efficient context management
- Minimal memory footprint
- Fast startup time
Security
- Bearer token authentication for remote agents
- Sandbox mode for plugin providers
- Security levels (low, medium, high)
- Execution mode guards
Documentation
- CLI Guide - Complete reference
- Agent Configuration - Configuration details
- Remote Agents - Distributed setup
- Template System - Knowledge management
- Template Variables - Dynamic variables and TemplateContext usage
- Context Integration Standard - TemplateContext pipeline and layout responsibilities
- Context Migration Guide - Upgrade steps for custom agents
- Layout DSL Reference - Layout fields, props, and helpers
- MCP Integration - IDE setup
Contributing
Contributions are welcome! Please follow these steps:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Run npm test and npm run build
- Submit a pull request
See Contributing Guide for details.
License
MIT License - See LICENSE for details.
Support
Related Packages
@crewx/sdk - Core SDK for building custom integrations
Built by SowonLabs
