@baryodev/cli
v0.2.0
CLI interface for BaryoDev agent
Your autonomous AI coding assistant powered by local LLMs 🤖
Interactive command-line interface for BaryoDev - a local-first, privacy-focused AI development agent that runs entirely on your machine using Ollama.
Status
- Version: 0.1.0
- Stability: ✅ Production Ready
- Test Coverage: 480+ unit tests passing
- Build Status: ✅ Clean (zero TypeScript errors)
- Last Updated: 2026-01-03
Implementation Status
| Feature Category | Status | Notes |
|-----------------|--------|-------|
| ✅ Core CLI | Complete | Version, help, error handling |
| ✅ Model Management | Complete | List, info, pull, remove, default |
| ✅ Session Management | Complete | List, show, rename, delete, export |
| ✅ Tools Management | Complete | Info, test, enable, disable |
| ✅ Configuration | Complete | Show, set, unset, edit, reset |
| ✅ BARYO.md Init | Complete | Framework detection, protection |
| 🚧 Interactive Chat | Beta | Core working, advanced features pending |
| 📋 Multi-Model Support | Planned | Phase 6 (Q2 2025) |
| 📋 Plugin System | Planned | Phase 8 (Q3 2025) |
Quick Start
# 1. Install dependencies
pnpm install
# 2. Build the CLI
pnpm --filter @baryodev/cli build
# 3. Test Ollama connection
node packages/cli/dist/index.js connect
# 4. Start coding with AI
node packages/cli/dist/index.js chat
Prerequisites
Required
- Node.js 20.0.0 or higher
- pnpm 8.0.0 or higher
- Ollama installed and running locally
Installing Ollama
# macOS
brew install ollama
# Linux
curl -fsSL https://ollama.com/install.sh | sh
# Start Ollama
ollama serve
Pull Recommended Models
# Fast, efficient coder (recommended)
ollama pull qwen2.5-coder:7b
# Alternative models
ollama pull deepseek-coder:6.7b
ollama pull codellama:13b
ollama pull llama3.2:3b
Architecture
System Overview
graph TB
User[User]
CLI[CLI Package]
Core[Core Package]
Ollama[Ollama Server]
User -->|Commands| CLI
CLI -->|ReAct Loop| Core
Core -->|API Calls| Ollama
Ollama -->|Streaming Response| Core
Core -->|Tool Results| CLI
CLI -->|Display| User
style CLI fill:#e1f5ff
style Core fill:#fff3e0
style Ollama fill:#f3e5f5
Package Dependencies
graph LR
CLI[cli]
Core[core]
Config[config]
Parser[parser]
Shared[shared]
Tools[tools]
CLI --> Core
CLI --> Config
Core --> Parser
Core --> Config
Core --> Tools
Core --> Shared
Config --> Shared
Parser --> Shared
Tools --> Shared
style CLI fill:#4CAF50,color:#fff
style Core fill:#2196F3,color:#fff
style Shared fill:#FF9800,color:#fff
Command Flow
sequenceDiagram
participant User
participant CLI
participant Core
participant Ollama
User->>CLI: baryodev chat "Create a REST API"
CLI->>Core: Initialize ReAct engine
Core->>Ollama: Stream LLM request
Ollama-->>Core: Streaming response
Core->>Core: Parse tool calls
Core->>Core: Execute tools
Core->>Ollama: Send tool results
Ollama-->>Core: Continue response
Core-->>CLI: Display results
CLI-->>User: Show output
Features
🎯 Core Capabilities
1. Interactive AI Chat
Real-time conversations with local LLMs
- Streaming responses for immediate feedback
- Context-aware multi-turn conversations
- Automatic context management (7-tier priority system)
- Token usage tracking and visualization
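The automatic context management above can be pictured as threshold-based eviction. The sketch below is a simplified illustration, not the actual implementation: the `Message` type and `estimateTokens` helper are hypothetical, the 7-tier priority system is not modeled, and only the numeric defaults (`maxTokens: 32768`, `evictionThreshold: 0.8`, `preserveRecentExchanges: 3`) come from the configuration reference later in this README.

```typescript
// Simplified sketch of threshold-based context eviction (hypothetical types;
// only the numeric defaults mirror the documented config).
interface Message { role: "system" | "user" | "assistant"; content: string; }

const MAX_TOKENS = 32768;        // context.maxTokens default
const EVICTION_THRESHOLD = 0.8;  // context.evictionThreshold default
const PRESERVE_RECENT = 3;       // context.preserveRecentExchanges default

// Crude token estimate (~4 characters per token); placeholder only.
const estimateTokens = (msgs: Message[]): number =>
  msgs.reduce((sum, m) => sum + Math.ceil(m.content.length / 4), 0);

// Drop the oldest non-system messages until usage falls below the threshold,
// always keeping the most recent exchanges (user + assistant pairs) intact.
function evictIfNeeded(history: Message[]): Message[] {
  const result = [...history];
  const protectedCount = PRESERVE_RECENT * 2; // one exchange = user + assistant
  while (
    estimateTokens(result) > MAX_TOKENS * EVICTION_THRESHOLD &&
    result.filter((m) => m.role !== "system").length > protectedCount
  ) {
    const idx = result.findIndex((m) => m.role !== "system");
    result.splice(idx, 1); // evict the oldest evictable message
  }
  return result;
}
```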
2. Model Management
Complete control over your LLM models
- List all available models
- View detailed model information
- Set default model
- Guided model pull/remove
3. Session Persistence
Never lose your work
- Save conversations with custom names
- Resume previous sessions
- Export to JSON, Markdown, or plain text
- Session metadata tracking
4. Project Configuration (BARYO.md)
AI understands your project
- Automatic framework detection
- Custom coding conventions
- Protected file patterns
- Project-specific context
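A generated BARYO.md combines these elements in one file. The snippet below is a hypothetical illustration of the overall shape (section names and contents are assumptions, not the exact generated output; only "Important Files" is mentioned elsewhere in this README):

```markdown
# Project: my-app

## Framework
react

## Conventions
- Use TypeScript strict mode
- Prefer functional components

## Protected Files
- .env
- package-lock.json

## Important Files
- src/index.tsx
```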
5. Tool Ecosystem
Extensible tool system
- File operations (read, write, list)
- Command execution
- Pattern searching (grep)
- Custom tool development
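A custom tool can be thought of as an object carrying the metadata that `baryodev tools info` prints (name, description, parameters, risk level, approval flag) plus an execution function. The interface below is a hedged sketch of that shape, not the package's actual API:

```typescript
// Hypothetical tool shape, mirroring the fields shown by `baryodev tools info`.
interface ToolParameter {
  name: string;
  type: "string" | "number" | "boolean";
  required: boolean;
  description: string;
}

interface ToolDefinition {
  name: string;
  description: string;
  parameters: ToolParameter[];
  riskLevel: "read-only" | "write" | "execute";
  requiresApproval: boolean;
  run(args: Record<string, unknown>): Promise<string>;
}

// Example: a read-only tool matching the documented `read_file` metadata.
const readFileTool: ToolDefinition = {
  name: "read_file",
  description: "Read contents of a file",
  parameters: [
    { name: "path", type: "string", required: true, description: "File path to read" },
  ],
  riskLevel: "read-only",
  requiresApproval: false,
  async run(args) {
    const { readFile } = await import("node:fs/promises");
    return readFile(String(args.path), "utf8");
  },
};
```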
6. Smart Configuration
Flexible config management
- Global and local configuration
- Hierarchical overrides
- Dot notation access
- Safety guardrails
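Dot notation access, as used by `baryodev config show ollama.defaultModel`, amounts to walking a nested object one key at a time. An illustrative helper (not the CLI's internal code):

```typescript
// Resolve a dot-notation key like "ollama.defaultModel" against a nested config object.
function getByPath(obj: Record<string, unknown>, path: string): unknown {
  return path.split(".").reduce<unknown>(
    (node, key) =>
      node && typeof node === "object" ? (node as Record<string, unknown>)[key] : undefined,
    obj,
  );
}

const config = {
  ollama: { baseUrl: "http://localhost:11434", defaultModel: "qwen2.5-coder:7b" },
  ui: { showTokenCount: true },
};

getByPath(config, "ollama.defaultModel"); // → "qwen2.5-coder:7b"
getByPath(config, "ui.missing.key");      // → undefined (no crash on absent keys)
```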
Commands Reference
Core Commands
baryodev chat [options]
Start an interactive chat session with the AI agent.
# Basic usage
baryodev chat
# Use specific model
baryodev chat --model deepseek-coder:6.7b
# Configure context and iterations
baryodev chat --max-tokens 16384 --max-iterations 20
# Disable streaming
baryodev chat --no-stream
Options:
- -m, --model <name> - LLM model to use (default: qwen2.5-coder:7b)
- --session <name> - Load previous session
- --auto-approve - Auto-approve read-only tools
- --max-iterations <n> - Max agent iterations (default: 10)
- --context-size <n> - Max context tokens (default: 32768)
- --stream - Enable streaming (default)
- --no-stream - Disable streaming responses
- --temperature <n> - Model temperature 0-2 (default: 0.7)
Slash Commands (in chat):
- /help - Show available commands
- /model - List available models
- /model <name> - Switch model (requires restart)
- /clear - Clear conversation history
- /compact - Compress old messages
- /save <name> - Save current session
- /load <name> - Load saved session
- /tools - List available tools
- /exit - Exit chat
baryodev connect [options]
Test connection to Ollama server and list available models.
# Test default server
baryodev connect
# Test custom server
baryodev connect --url http://192.168.1.100:11434
# With timeout
baryodev connect --timeout 10000
Options:
- --url <url> - Ollama server URL (default: http://localhost:11434)
- --timeout <ms> - Connection timeout (default: 5000)
Output:
✔ Connected to Ollama
Server Information:
URL: http://localhost:11434
Models: 3
Available Models:
• qwen2.5-coder:7b
• deepseek-coder:6.7b
• llama3.2:3b
✓ Ollama is ready!
Model Management
baryodev model list [options]
List all available models with optional details.
# Simple list
baryodev model list
# Detailed info
baryodev model list --detailed
# Custom Ollama server
baryodev model list --url http://localhost:11434
Options:
- --detailed - Show context length and descriptions
- --url <url> - Ollama server URL
baryodev model info <name>
Show detailed information about a specific model.
baryodev model info qwen2.5-coder:7b
Output:
Model: qwen2.5-coder:7b
Description: Ollama model: qwen2.5-coder:7b
Context Length: 32,768 tokens
ℹ For more details, use: ollama show qwen2.5-coder:7b
baryodev model default <name> [options]
Set default model in configuration.
# Set local default
baryodev model default qwen2.5-coder:7b
# Set global default
baryodev model default llama3.2:3b --global
Options:
- -g, --global - Set in global config (affects all projects)
baryodev model pull <name>
Guide for pulling a model from Ollama.
baryodev model pull codellama:13b
Output:
ℹ To pull a model, use the Ollama CLI:
ollama pull codellama:13b
Then run this command again to verify
baryodev model remove <name>
Guide for removing a model.
baryodev model remove old-model:7b
Session Management
baryodev session list [options]
List all saved chat sessions.
# Simple list
baryodev session list
# Detailed view
baryodev session list --verbose
Options:
- --verbose - Show message counts, models, dates
Output:
Saved Sessions:
• feature-implementation
Model: qwen2.5-coder:7b
Messages: 42
Created: 2026-01-03 10:30 AM
• bug-fix-session
Model: deepseek-coder:6.7b
Messages: 18
Created: 2026-01-02 3:15 PM
ℹ Total: 2 sessions
baryodev session show <name> [options]
Show details of a saved session.
# Basic info
baryodev session show feature-implementation
# Full message history
baryodev session show feature-implementation --verbose
baryodev session rename <old> <new>
Rename a saved session.
baryodev session rename old-name new-name
baryodev session delete <name> [options]
Delete a saved session.
baryodev session delete unwanted-session --force
Options:
- --force - Confirm deletion (required for safety)
baryodev session export <name> [options]
Export session to file.
# Export as JSON
baryodev session export my-session --format json
# Export as Markdown
baryodev session export my-session --format md --output docs/chat.md
# Export as plain text
baryodev session export my-session --format txt
Options:
- --format <fmt> - Export format: json, md, txt (default: json)
- --output <path> - Output file path (default: .)
Configuration Management
baryodev config show [key]
Display configuration.
# Show all config
baryodev config show
# Show specific value
baryodev config show ollama.defaultModel
# Show nested object
baryodev config show ui
Output:
{
"ollama": {
"baseUrl": "http://localhost:11434",
"defaultModel": "qwen2.5-coder:7b",
"timeout": 120000
},
"safety": {
"autoApproveReadOnly": true,
"blockedCommands": ["^rm\\s+-rf\\s+/", "^sudo\\s+rm"],
"maxExecTimeout": 30000
},
"ui": {
"theme": "auto",
"showTokenCount": true,
"streamResponses": true,
"useColors": true
}
}
baryodev config set <key> <value> [options]
Set configuration value.
# Set local config
baryodev config set agent.maxIterations 20
# Set global config
baryodev config set ollama.defaultModel llama3.2:3b --global
Options:
- -g, --global - Set in global config
- -l, --local - Set in local config (default)
baryodev config unset <key>
Remove configuration value.
baryodev config unset test.value
baryodev config edit [options]
Open config in default editor.
# Edit local config
baryodev config edit
# Edit global config
baryodev config edit --global
baryodev config reset [options]
Reset configuration to defaults.
# Reset local config
baryodev config reset --local
# Reset global config
baryodev config reset --global
Tools Management
baryodev tools list
List all available tools.
baryodev tools list
Output:
No tools registered yet
ℹ Tools will be available when running the agent
baryodev tools info <name>
Show detailed tool information.
baryodev tools info read_file
Output:
Tool: read_file
Description: Read contents of a file
Parameters:
• path (required)
Type: string
File path to read
Risk Level: READ-ONLY
Requires Approval: No
baryodev tools test <name> [options]
Test tool execution with parameters.
baryodev tools test read_file --params '{"path":"test.txt"}'
baryodev tools enable <name>
Enable a tool (makes it available to the agent).
baryodev tools enable read_file
baryodev tools disable <name>
Disable a tool.
baryodev tools disable execute_command
Project Initialization
baryodev init [options]
Initialize BARYO.md for current project.
# Interactive initialization
baryodev init
# Minimal setup
baryodev init --minimal
# Force overwrite existing
baryodev init --force
Options:
- --minimal - Create minimal BARYO.md
- --force - Overwrite existing BARYO.md
Output:
✔ Analyzing project structure...
✓ Project initialized successfully!
Framework: react
BARYO.md: /Users/me/project/BARYO.md
Config dir: /Users/me/project/.baryodev
Next steps:
1. Review and customize BARYO.md
2. Add important files to the "Important Files" section
3. Define project-specific conventions
4. Run baryodev chat to start coding with AI
Help System
baryodev help [topic]
Show help for specific topic.
# List all topics
baryodev help
# Specific topic
baryodev help getting-started
baryodev help models
baryodev help context
Available Topics:
- getting-started
- models
- context
- sessions
- baryo
- tools
- config
- troubleshooting
- slash-commands
Use Cases
1. Feature Development
$ baryodev chat
> Create a REST API with Express.js for managing user profiles.
> It should have endpoints for CRUD operations.
💭 Agent analyzes requirements...
✅ I'll create a REST API for you. Here's my plan:
1. Set up Express.js server
2. Create user model
3. Implement CRUD endpoints
4. Add input validation
🔧 Creating package.json...
✔ File created
🔧 Creating server.js...
✔ File created
🔧 Creating routes/users.js...
✔ File created
✅ Done! Your REST API is ready.
- Run: npm install && node server.js
- API will be at http://localhost:3000
2. Bug Fixing
> There's a bug in calculateTotal function - it returns NaN for empty arrays
💭 Let me read the file and investigate...
🔧 Reading src/utils/calculator.js...
✔ File read
💡 Found the issue: The reduce function doesn't handle empty arrays.
🔧 Updating src/utils/calculator.js...
✔ File updated
✅ Fixed! Added default value of 0 to reduce function.
The function now correctly returns 0 for empty arrays.
3. Code Review
> Review the authentication middleware and suggest improvements
💭 Analyzing code...
🔧 Reading middleware/auth.js...
✔ File read
📋 Security Review:
✅ Good practices:
- Using bcrypt for password hashing
- JWT token validation
⚠️ Issues found:
1. No rate limiting on login attempts
2. Token expiry not checked
3. Missing CORS configuration
💡 Recommendations:
1. Add express-rate-limit for login endpoint
2. Verify token expiration in middleware
3. Configure CORS for production
Would you like me to implement these fixes?
4. Test Generation
> Write comprehensive tests for the user service
💭 Analyzing user service...
🔧 Reading src/services/userService.js...
✔ File read
🔧 Creating __tests__/userService.test.js...
✔ File created with 15 test cases
✅ Created tests for:
- User creation (5 tests)
- User retrieval (4 tests)
- User update (3 tests)
- User deletion (3 tests)
Run: npm test
Configuration
Configuration File Locations
Global: ~/.config/baryodev/config.json
Local: ./.baryodev/config.json
Precedence: Local overrides Global
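The local-over-global precedence amounts to a recursive merge in which local keys win and untouched global keys survive. A minimal sketch under that assumption (not the CLI's actual merge code):

```typescript
type Config = { [key: string]: unknown };

// Recursively merge `local` over `global`: nested objects merge key-by-key,
// scalar values from the local config take precedence.
function mergeConfig(global: Config, local: Config): Config {
  const out: Config = { ...global };
  for (const [key, value] of Object.entries(local)) {
    const base = out[key];
    out[key] =
      value && typeof value === "object" && !Array.isArray(value) &&
      base && typeof base === "object" && !Array.isArray(base)
        ? mergeConfig(base as Config, value as Config)
        : value;
  }
  return out;
}

const globalCfg = { ollama: { defaultModel: "llama3.2:3b", timeout: 120000 } };
const localCfg = { ollama: { defaultModel: "qwen2.5-coder:7b" } };

mergeConfig(globalCfg, localCfg);
// → { ollama: { defaultModel: "qwen2.5-coder:7b", timeout: 120000 } }
```

Note that the local file only needs to state the keys it overrides; everything else falls through to the global defaults.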
Configuration Options
interface BaryoDevConfig {
// Ollama configuration
ollama: {
baseUrl: string; // Default: "http://localhost:11434"
defaultModel: string; // Default: "qwen2.5-coder:7b"
timeout: number; // Default: 120000 (ms)
};
// Safety configuration
safety: {
autoApproveReadOnly: boolean; // Default: true
blockedCommands: string[]; // Regex patterns
maxExecTimeout: number; // Default: 30000 (ms)
};
// UI configuration
ui: {
theme: "auto" | "light" | "dark"; // Default: "auto"
showTokenCount: boolean; // Default: true
streamResponses: boolean; // Default: true
useColors: boolean; // Default: true
};
// Context management
context: {
maxTokens: number; // Default: 32768
evictionThreshold: number; // Default: 0.8 (80%)
autoCompact: boolean; // Default: true
preserveRecentExchanges: number; // Default: 3
};
// Agent configuration
agent: {
maxIterations: number; // Default: 10
thinkingEnabled: boolean; // Default: true
verboseLogging: boolean; // Default: false
};
// Session configuration
session: {
autoSaveInterval: number; // Default: 300000 (5 min)
maxSessions: number; // Default: 100
compressionEnabled: boolean; // Default: true
};
}
Known Issues & Limitations
Current Limitations
Interactive Chat Testing Pending
- Status: Beta
- Impact: Some advanced chat features need more testing
- Workaround: Basic chat is stable and fully usable
- Fix: E2E testing framework planned for v0.2.0
Single Model Per Session
- Status: By Design
- Impact: Can't switch models mid-conversation
- Workaround: Use /model <name> and restart chat
- Fix: Multi-model support planned for Phase 6
Tool Registration at Runtime
- Status: Working, needs docs
- Impact: Custom tools require code changes
- Workaround: Edit tool registry manually
- Fix: Plugin system planned for Phase 8
Known Bugs
None reported! 🎉
Found a bug? Open an issue
Performance
Benchmarks (macOS, M1 Pro, Node 22)
| Command | Time | Target | Status |
|---------|------|--------|--------|
| --version | 33ms | 500ms | ✅ 15x faster |
| --help | 33ms | 1000ms | ✅ 30x faster |
| config show | 120ms | 1000ms | ✅ 8x faster |
| model list | 34ms | 3000ms | ✅ 88x faster |
| connect | 350ms | 5000ms | ✅ 14x faster |
All commands execute well under performance targets!
Resource Usage
- Memory: ~25-50MB baseline
- CPU: Minimal when idle
- Disk: <1MB for CLI, sessions vary
- Network: Only to Ollama (localhost)
Troubleshooting
"Failed to connect to Ollama"
Cause: Ollama server not running
Solution:
# Check if Ollama is running
curl http://localhost:11434/api/tags
# Start Ollama
ollama serve
# Verify in another terminal
baryodev connect
"Model not found"
Cause: Model not pulled yet
Solution:
# List available models
ollama list
# Pull missing model
ollama pull qwen2.5-coder:7b
# Verify
baryodev model list
"Context filling up too quickly"
Cause: Long conversation or large files
Solution:
# In chat, use compaction
/compact
# Or reduce context size
baryodev chat --max-tokens 16384
# Or adjust config
baryodev config set context.maxTokens 16384
"Command not found: baryodev"
Cause: Not globally installed
Solution:
# Run directly
node packages/cli/dist/index.js <command>
# Or link globally
cd packages/cli
npm link
# Then use
baryodev <command>
Development
Build & Test
# Install dependencies
pnpm install
# Build CLI
pnpm --filter @baryodev/cli build
# Watch mode (auto-rebuild)
pnpm --filter @baryodev/cli build:watch
# Run tests
pnpm --filter @baryodev/cli test
# Run directly without building
pnpm --filter @baryodev/cli dev chat
Project Structure
packages/cli/
├── src/
│ ├── commands/ # Command implementations
│ │ ├── chat.ts # Interactive chat
│ │ ├── connect.ts # Connection testing
│ │ ├── config.ts # Config management
│ │ ├── init.ts # BARYO.md initialization
│ │ ├── model-subcommands.ts # Model management
│ │ ├── session-subcommands.ts # Session management
│ │ └── tools-subcommands.ts # Tools management
│ ├── ui/ # UI components
│ │ ├── colors.ts # Color utilities
│ │ ├── prompt.ts # Prompt formatting
│ │ ├── spinner.ts # Loading indicators
│ │ ├── help.ts # Help system
│ │ └── error-display.ts # Error formatting
│ ├── config/ # Configuration
│ │ ├── types.ts # Config types
│ │ └── manager.ts # Config CRUD
│ └── index.ts # Main entry point
├── dist/ # Built JavaScript
├── __tests__/ # Test files
└── README.md # This file
Future Roadmap
v0.2.0 (Q1 2025)
- ✅ Complete E2E testing for chat
- 📋 Slash command improvements
- 📋 Enhanced session management
- 📋 Tool approval UI improvements
v0.3.0 (Q2 2025) - Multi-Model Support
- 📋 Switch models mid-conversation
- 📋 Model comparison mode
- 📋 Cloud provider integration (Claude, GPT-4, Gemini)
- 📋 Smart routing (cost/speed/quality)
v0.4.0 (Q2 2025) - Advanced Features
- 📋 Skills system (custom agent behaviors)
- 📋 Codebase indexing (AST parsing)
- 📋 Multi-file editing (atomic operations)
- 📋 Agentic workflows
v0.5.0 (Q3 2025) - Integration
- 📋 VS Code extension
- 📋 JetBrains plugin
- 📋 GitHub Actions integration
- 📋 CI/CD pipelines
v1.0.0 (Q4 2025) - Production Ready
- 📋 Plugin ecosystem
- 📋 Custom tool marketplace
- 📋 Team collaboration features
- 📋 Enterprise features
See ROADMAP.md for complete roadmap.
Security
Security Features
- ✅ Command injection prevention - Blocked dangerous patterns
- ✅ Path traversal protection - Sandboxed file operations
- ✅ BARYO.md protection - Enforced file restrictions
- ✅ Audit logging - All operations logged
- ✅ Local-first - No data sent to cloud
Security Best Practices
- Review BARYO.md - Define protected files
- Use safety config - Block dangerous commands
- Review tool approvals - Check before executing
- Keep Ollama updated - Latest security patches
- Audit logs - Review .baryodev/audit.log
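The blocked-command guardrail works by matching each shell command against the `safety.blockedCommands` regular expressions from the configuration reference. Conceptually (a sketch of the idea, not the package's code; the two patterns are the documented defaults):

```typescript
// Default safety.blockedCommands patterns from the configuration reference.
const blockedCommands: string[] = ["^rm\\s+-rf\\s+/", "^sudo\\s+rm"];

// A command is rejected if it matches any blocked pattern.
const isBlocked = (command: string): boolean =>
  blockedCommands.some((pattern) => new RegExp(pattern).test(command));

isBlocked("rm -rf /");         // → true  (blocked)
isBlocked("sudo rm file.txt"); // → true  (blocked)
isBlocked("ls -la");           // → false (allowed)
```

Adding your own patterns via `baryodev config set` extends this list, so any command your team considers dangerous can be blocked the same way.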
Security issue? Report to [email protected]
Contributing
We welcome contributions! See CONTRIBUTING.md for:
- Code style guide
- Testing requirements
- Pull request process
- Development workflow
License
GPL-3.0-or-later
Copyright © 2024-2025 BaryoDev Contributors
See Also
- Core Package - ReAct engine implementation
- Shared Types - Type definitions
- Parser - Tool call parsing
- Ollama - Local LLM runtime
- Documentation - Full documentation
Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Documentation: docs/
- Examples: examples/
Made with ❤️ by the BaryoDev community
Local AI, global impact 🌍
