@lov3kaizen/agentsea-cli
Command-line interface for AgentSea ADK - Build and orchestrate AI agents from your terminal.
Features
- 🚀 Quick Setup - Initialize with interactive prompts
- 💬 Interactive Chat - Chat with agents in your terminal
- 🤖 Agent Management - Create, list, and manage agents
- 🔌 Provider Support - Cloud and local providers
- 📦 Model Management - Pull and manage Ollama models
- ⚙️ Configuration - Persistent configuration management
- 🎨 Beautiful Output - Colored, formatted terminal output
Installation
# Global installation
npm install -g @lov3kaizen/agentsea-cli
# Or use with npx
npx @lov3kaizen/agentsea-cli init
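With a global install, the agentsea binary should be on your PATH; a quick smoke test (assuming npm's global bin directory is on your PATH):
# Print the top-level help to confirm the install worked
agentsea --help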
Quick Start
1. Initialize
agentsea init
This will guide you through:
- Choosing a provider (cloud or local)
- Configuring API keys or endpoints
- Creating a default agent
2. Start Chatting
agentsea chat
Starts an interactive chat session with your default agent.
3. Run One-Off Commands
agentsea agent run default "What is the capital of France?"
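Because agent run is non-interactive, it also composes with ordinary shell tools. The sketch below assumes the agent's reply is written to stdout (verify for your version); notes.txt is a hypothetical input file:
# Feed a file into a one-off prompt and save the reply
# (assumes the reply goes to stdout)
agentsea agent run default "Summarize this: $(cat notes.txt)" > summary.txt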
Commands
agentsea init
Initialize AgentSea CLI configuration with interactive prompts.
agentsea init
agentsea chat
Start an interactive chat session.
agentsea chat # Use default agent
agentsea chat --agent my-agent # Use specific agent
agentsea chat --model llama3 # Override model
agentsea agent
Manage agents.
# Create a new agent
agentsea agent create
# List all agents
agentsea agent list
# Get agent details
agentsea agent get <name>
# Run an agent with a message
agentsea agent run <name> "Your message"
# Set default agent
agentsea agent default <name>
# Delete an agent
agentsea agent delete <name>
agentsea provider
Manage providers.
# List all providers
agentsea provider list
# Get provider details
agentsea provider get <name>
# Add a new provider
agentsea provider add
# Set default provider
agentsea provider default <name>
# Delete a provider
agentsea provider delete <name>
agentsea model
Manage models (Ollama only).
# List available models
agentsea model list
# Pull a model from Ollama
agentsea model pull llama2
# Show popular models
agentsea model popular
agentsea config
Show current configuration.
agentsea config
Examples
Cloud Provider (Anthropic)
# Initialize with Anthropic
agentsea init
> Cloud Provider
> Anthropic
> [Enter API Key]
# Chat with Claude
agentsea chat
Local Provider (Ollama)
# Initialize with Ollama
agentsea init
> Local Provider
> Ollama
> http://localhost:11434
# Pull a model
agentsea model pull llama2
# Chat with local model
agentsea chat
Multiple Agents
# Create a coding assistant
agentsea agent create
> Name: coder
> Model: codellama
> System Prompt: You are a coding assistant...
# Create a writer assistant
agentsea agent create
> Name: writer
> Model: llama2
> System Prompt: You are a creative writer...
# Use specific agent
agentsea chat --agent coder
Configuration
Configuration is stored in:
- Linux: ~/.config/agentsea-cli/config.json
- macOS: ~/Library/Preferences/agentsea-cli/config.json
- Windows: %APPDATA%\agentsea-cli\config.json
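On Linux, for example, you can inspect the file directly:
cat ~/.config/agentsea-cli/config.json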
Configuration Structure
{
"defaultProvider": "anthropic",
"defaultAgent": "default",
"providers": {
"anthropic": {
"name": "anthropic",
"type": "anthropic",
"apiKey": "sk-ant-...",
"timeout": 60000
},
"ollama": {
"name": "ollama",
"type": "ollama",
"baseUrl": "http://localhost:11434",
"timeout": 60000
}
},
"agents": {
"default": {
"name": "default",
"description": "Default agent",
"model": "claude-sonnet-4-20250514",
"provider": "anthropic",
"systemPrompt": "You are a helpful assistant.",
"temperature": 0.7,
"maxTokens": 2048
}
}
}
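If you have jq installed, you can pull fields out of this file without opening it. This sketch resolves the entry named by defaultAgent (Linux path shown; adjust per the table above):
# Look up the defaultAgent entry inside the agents map
jq '.agents[.defaultAgent]' ~/.config/agentsea-cli/config.json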
Environment Variables
API keys can also be set via environment variables:
export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
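Since these are ordinary environment variables, you can also set one for a single invocation using the standard shell prefix form, without touching your config file:
# One-off key override for this command only
ANTHROPIC_API_KEY=sk-ant-... agentsea chat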
Supported Providers
Cloud Providers
- Anthropic - Claude models
- OpenAI - GPT models
- Google - Gemini models
Local Providers
- Ollama - Local LLM runtime
- LM Studio - GUI for local models
- LocalAI - OpenAI-compatible local API
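LM Studio and LocalAI both serve an OpenAI-compatible API on a local port (LM Studio's server defaults to http://localhost:1234/v1), so registering one should look much like the Ollama flow. The prompt labels below are assumptions; follow whatever agentsea provider add actually asks:
# Register LM Studio as a local provider (prompt labels are illustrative)
agentsea provider add
> Name: lmstudio
> Type: LM Studio
> http://localhost:1234/v1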
Ollama Integration
The CLI has first-class support for Ollama:
# Make sure Ollama is running
ollama serve
# Pull popular models
agentsea model pull llama2
agentsea model pull mistral
agentsea model pull codellama
# List available models
agentsea model list
# Show popular models
agentsea model popular
# Chat with local model
agentsea chat
Tips & Tricks
1. Quick Chat
Create an alias for quick access:
alias bc="agentsea chat"
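Along the same lines, a small shell function turns the documented agent run command into a one-liner for quick questions (ask is a hypothetical name; pick your own):
# Hypothetical helper around `agentsea agent run`
ask() { agentsea agent run default "$*"; }
ask "What is the capital of France?"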
2. Multiple Providers
Set up multiple providers for different use cases:
agentsea provider add
> Name: anthropic-prod
> Type: Anthropic
agentsea provider add
> Name: ollama-dev
> Type: Ollama
3. Specialized Agents
Create agents for specific tasks:
# Coding agent
agentsea agent create
> Name: code
> Model: codellama
> System Prompt: You are an expert programmer...
# Writing agent
agentsea agent create
> Name: write
> Model: llama2
> System Prompt: You are a creative writer...
# Use them
agentsea chat --agent code
agentsea chat --agent write
4. Verbose Mode
Get detailed output:
agentsea agent run default "Hello" --verbose
Troubleshooting
"No providers configured"
Run agentsea init to set up your first provider.
"Provider not found"
List providers: agentsea provider list
Add provider: agentsea provider add
"Agent not found"
List agents: agentsea agent list
Create agent: agentsea agent create
Ollama Connection Error
Make sure Ollama is running:
ollama serve
Then check the base URL in your provider configuration:
agentsea provider get ollama
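You can also confirm the endpoint itself is reachable; Ollama's HTTP API lists locally installed models at /api/tags:
# A 200 response with a JSON model list means Ollama is up
curl http://localhost:11434/api/tags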
Model Not Found (Ollama)
Pull the model first:
agentsea model pull llama2
Development
# Install dependencies
pnpm install
# Build the CLI
pnpm build
# Link for local testing
pnpm link --global
# Test commands
agentsea --help
License
MIT License - see LICENSE for details
See Also
- @lov3kaizen/agentsea-core - Core AgentSea ADK
- @lov3kaizen/agentsea-nestjs - NestJS integration
- AgentSea Documentation
Built with ❤️ by lovekaizen
