
@lov3kaizen/agentsea-cli

v0.6.0 (MIT License)

Command-line interface for AgentSea ADK - Build and orchestrate AI agents from your terminal.

Features

  • 🚀 Quick Setup - Initialize with interactive prompts
  • 💬 Interactive Chat - Chat with agents in your terminal
  • 🤖 Agent Management - Create, list, and manage agents
  • 🔌 Provider Support - Cloud and local providers
  • 📦 Model Management - Pull and manage Ollama models
  • ⚙️ Configuration - Persistent configuration management
  • 🎨 Beautiful Output - Colored, formatted terminal output

Installation

# Global installation
npm install -g @lov3kaizen/agentsea-cli

# Or use with npx
npx @lov3kaizen/agentsea-cli init

Quick Start

1. Initialize

agentsea init

This will guide you through:

  • Choosing a provider (cloud or local)
  • Configuring API keys or endpoints
  • Creating a default agent

2. Start Chatting

agentsea chat

Starts an interactive chat session with your default agent.

3. Run One-Off Commands

agentsea agent run default "What is the capital of France?"

Commands

agentsea init

Initialize AgentSea CLI configuration with interactive prompts.

agentsea init

agentsea chat

Start an interactive chat session.

agentsea chat                        # Use default agent
agentsea chat --agent my-agent       # Use specific agent
agentsea chat --model llama3         # Override model

agentsea agent

Manage agents.

# Create a new agent
agentsea agent create

# List all agents
agentsea agent list

# Get agent details
agentsea agent get <name>

# Run an agent with a message
agentsea agent run <name> "Your message"

# Set default agent
agentsea agent default <name>

# Delete an agent
agentsea agent delete <name>

agentsea provider

Manage providers.

# List all providers
agentsea provider list

# Get provider details
agentsea provider get <name>

# Add a new provider
agentsea provider add

# Set default provider
agentsea provider default <name>

# Delete a provider
agentsea provider delete <name>

agentsea model

Manage models (Ollama only).

# List available models
agentsea model list

# Pull a model from Ollama
agentsea model pull llama2

# Show popular models
agentsea model popular

agentsea config

Show current configuration.

agentsea config

Examples

Cloud Provider (Anthropic)

# Initialize with Anthropic
agentsea init
> Cloud Provider
> Anthropic
> [Enter API Key]

# Chat with Claude
agentsea chat

Local Provider (Ollama)

# Initialize with Ollama
agentsea init
> Local Provider
> Ollama
> http://localhost:11434

# Pull a model
agentsea model pull llama2

# Chat with local model
agentsea chat

Multiple Agents

# Create a coding assistant
agentsea agent create
> Name: coder
> Model: codellama
> System Prompt: You are a coding assistant...

# Create a writer assistant
agentsea agent create
> Name: writer
> Model: llama2
> System Prompt: You are a creative writer...

# Use specific agent
agentsea chat --agent coder

Configuration

Configuration is stored in:

  • Linux: ~/.config/agentsea-cli/config.json
  • macOS: ~/Library/Preferences/agentsea-cli/config.json
  • Windows: %APPDATA%\agentsea-cli\config.json
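On a POSIX shell, the platform-specific path can be resolved in one step. A minimal sketch, assuming the locations listed above (the Windows branch applies to POSIX-style shells such as Git Bash):

```shell
# Resolve the agentsea-cli config path for the current OS
# (sketch; assumes the locations listed above)
case "$(uname -s)" in
  Linux)  CONFIG="$HOME/.config/agentsea-cli/config.json" ;;
  Darwin) CONFIG="$HOME/Library/Preferences/agentsea-cli/config.json" ;;
  *)      CONFIG="$APPDATA/agentsea-cli/config.json" ;;  # Windows (Git Bash)
esac
echo "$CONFIG"
```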

Configuration Structure

{
  "defaultProvider": "anthropic",
  "defaultAgent": "default",
  "providers": {
    "anthropic": {
      "name": "anthropic",
      "type": "anthropic",
      "apiKey": "sk-ant-...",
      "timeout": 60000
    },
    "ollama": {
      "name": "ollama",
      "type": "ollama",
      "baseUrl": "http://localhost:11434",
      "timeout": 60000
    }
  },
  "agents": {
    "default": {
      "name": "default",
      "description": "Default agent",
      "model": "claude-sonnet-4-20250514",
      "provider": "anthropic",
      "systemPrompt": "You are a helpful assistant.",
      "temperature": 0.7,
      "maxTokens": 2048
    }
  }
}
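Because the configuration is plain JSON, it can be queried with standard tools. A hedged example using jq (assumes jq is installed and the structure shown above; run from the config directory or substitute the full path):

```shell
# Print the default agent's model from config.json
# (assumes jq is installed; structure as shown above)
jq -r '.agents[.defaultAgent].model' config.json
```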

Environment Variables

API keys can also be set via environment variables:

export ANTHROPIC_API_KEY=your_key_here
export OPENAI_API_KEY=your_key_here
export GEMINI_API_KEY=your_key_here
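To verify which of these keys are actually visible to the CLI before running it, a quick shell check (key names taken from the list above):

```shell
# Report which provider API keys are set in the current environment
for key in ANTHROPIC_API_KEY OPENAI_API_KEY GEMINI_API_KEY; do
  if printenv "$key" >/dev/null 2>&1; then
    echo "$key: set"
  else
    echo "$key: missing"
  fi
done
```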

Supported Providers

Cloud Providers

  • Anthropic - Claude models
  • OpenAI - GPT models
  • Google - Gemini models

Local Providers

  • Ollama - Local LLM runtime
  • LM Studio - GUI for local models
  • LocalAI - OpenAI-compatible local API

Ollama Integration

The CLI has first-class support for Ollama:

# Make sure Ollama is running
ollama serve

# Pull popular models
agentsea model pull llama2
agentsea model pull mistral
agentsea model pull codellama

# List available models
agentsea model list

# Show popular models
agentsea model popular

# Chat with local model
agentsea chat

Tips & Tricks

1. Quick Chat

Create an alias for quick access (note that bc shadows the standard bc calculator; pick a different name if you use that tool):

alias bc="agentsea chat"

2. Multiple Providers

Set up multiple providers for different use cases:

agentsea provider add
> Name: anthropic-prod
> Type: Anthropic

agentsea provider add
> Name: ollama-dev
> Type: Ollama

3. Specialized Agents

Create agents for specific tasks:

# Coding agent
agentsea agent create
> Name: code
> Model: codellama
> System Prompt: You are an expert programmer...

# Writing agent
agentsea agent create
> Name: write
> Model: llama2
> System Prompt: You are a creative writer...

# Use them
agentsea chat --agent code
agentsea chat --agent write

4. Verbose Mode

Get detailed output:

agentsea agent run default "Hello" --verbose

Troubleshooting

"No providers configured"

Run agentsea init to set up your first provider.

"Provider not found"

List providers: agentsea provider list

Add provider: agentsea provider add

"Agent not found"

List agents: agentsea agent list

Create agent: agentsea agent create

Ollama Connection Error

Make sure Ollama is running:

ollama serve

Check the base URL in your provider configuration:

agentsea provider get ollama
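A quick reachability probe against the default Ollama endpoint can rule out connection problems before digging into the provider config. This sketch assumes curl is available and Ollama is on its default port 11434 (/api/tags is Ollama's endpoint for listing local models):

```shell
# Probe the Ollama HTTP API at its default endpoint
if curl -sf http://localhost:11434/api/tags >/dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is not reachable"
fi
```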

Model Not Found (Ollama)

Pull the model first:

agentsea model pull llama2

Development

# Install dependencies
pnpm install

# Build the CLI
pnpm build

# Link for local testing
pnpm link --global

# Test commands
agentsea --help

License

MIT License - see LICENSE for details

Built with ❤️ by lovekaizen