# @theopenbee/cli

v0.0.32
OpenBee is an around-the-clock digital-worker platform that turns AI Agents into your 24/7, always-on assistants.
## ✨ Features
| 🤖 AI Workers | 💬 Multi-IM Support | 🧠 Persistent Memory | ⏰ Scheduled Tasks |
|:---:|:---:|:---:|:---:|
| Each Worker is an AI Agent capable of multi-step task planning and independent execution | Native support for Lark, DingTalk, WeCom, WeChat, and Telegram — receive and reply in the same conversation | Workers retain long-term memory across sessions, knowing context just like a real worker | Cron-based scheduling for automatic, hands-free triggering |
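As background for the scheduled-tasks feature, a cron schedule is conventionally written as five fields (minute, hour, day of month, month, day of week). Whether OpenBee accepts exactly this five-field form is an assumption — check the Web Console when creating a schedule:

```
# ┌───────── minute (0-59)
# │ ┌─────── hour (0-23)
# │ │ ┌───── day of month (1-31)
# │ │ │ ┌─── month (1-12)
# │ │ │ │ ┌─ day of week (0-6, Sunday = 0)
# │ │ │ │ │
  0 9 * * 1-5   # 09:00 every weekday
```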
## 🤖 Supported AI Engines
OpenBee supports multiple AI engines as the underlying execution backend:
| Engine | Description |
|:---:|:---|
| Claude Code | Anthropic's official agentic coding tool; default and recommended engine |
| Codex | OpenAI's Codex agent, supported via the plugin engine |
| Pi | Pi agent, supported via the plugin engine |
| Kimi | Moonshot AI's Kimi agent, supported via the plugin engine |
## 🚀 Quick Start

### Step 1: Install
npm:

```shell
npm install -g @theopenbee/cli
```

The platform-specific binary is downloaded automatically. Supports Linux / macOS / Windows (amd64 & arm64).

Install script:

```shell
curl -fsSL https://raw.githubusercontent.com/theopenbee/openbee/main/install.sh | bash
```

macOS (Homebrew):

```shell
brew install theopenbee/tap/openbee
```

Windows (Scoop):

```shell
scoop bucket add theopenbee https://github.com/theopenbee/scoop-bucket
scoop install theopenbee/openbee
```

Manual download: visit GitHub Releases, download the archive for your platform, extract it, and place the `openbee` executable in your `PATH`.
### Step 2: Generate a config file

```shell
openbee config
```

The wizard will guide you through:
- Claude executable path
- IM platform(s) to enable (Lark / DingTalk / WeCom / WeChat / Telegram) and their credentials
- Advanced options (can be skipped to use defaults)
The config file is written to `config.yaml` in the current directory by default. Use `-o` to specify a custom path:

```shell
openbee config -o /path/to/config.yaml
```

### Step 3: Start the service

```shell
openbee server -d
```

### Step 4: Start using
- Open the Web Console (default http://localhost:8080) to manage Workers and view task status
- Send messages directly in any configured IM platform (Lark / DingTalk / WeCom / WeChat / Telegram) to interact with OpenBee
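For orientation, a generated `config.yaml` might look roughly like the sketch below. The key names are assumptions inferred from the wizard prompts (engine path, IM credentials, console port); treat the file that `openbee config` actually writes as the authoritative schema:

```yaml
# Hypothetical sketch — real keys may differ from the generated file
claude:
  executable: /usr/local/bin/claude   # path collected by the wizard
im:
  telegram:
    token: "<bot-token>"              # credentials for each enabled platform
server:
  port: 8080                          # Web Console default
```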
## ⚙️ How It Works
```mermaid
graph TD
    A["💬 IM Layer (Communication)\nLark / DingTalk / WeCom / WeChat / Telegram"] --> B["🧠 Scheduling Layer\nAI Agent"]
    B --> C["🤖 Execution Layer\nAI Agents"]
    C -. "Reply Results" .-> A
    B -. "Reply Results" .-> A
```

OpenBee consists of three core layers:
1. **IM Layer (Communication Layer)**: Lark, DingTalk, WeCom, WeChat, and Telegram. Users send messages through these platforms to interact with OpenBee, and receive replies in the same conversation.
2. **Scheduling Layer (AI Agent)**: handles task scheduling. It receives messages from the IM layer, understands user intent, and dispatches tasks to the Execution layer. It can also reply with results directly to the IM layer.
3. **Execution Layer**: each Worker is an independent AI Agent equipped with persistent memory, tool invocation (CLI), and multi-step task planning. Workers execute assigned tasks autonomously and reply with results directly to the IM layer, just like real workers.
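The three-layer flow above can be modeled in a few lines of Python. This is a conceptual sketch only — every class and method name here is hypothetical and not OpenBee's real API — but it shows the shape of the pipeline: the IM layer hands a message to the scheduler, the scheduler dispatches it to a Worker, and the Worker's reply travels back to the same conversation while its memory persists across tasks.

```python
from dataclasses import dataclass, field


@dataclass
class Worker:
    """Execution layer: one AI Agent with persistent memory."""
    name: str
    memory: list = field(default_factory=list)

    def execute(self, task: str) -> str:
        self.memory.append(task)  # context survives across tasks
        return f"{self.name} finished: {task}"


class Scheduler:
    """Scheduling layer: routes user intent to a Worker."""
    def __init__(self, workers: list):
        self.workers = workers

    def dispatch(self, message: str) -> str:
        # The real scheduler understands intent; this sketch just
        # picks the first Worker.
        return self.workers[0].execute(message)


class IMLayer:
    """IM layer: same-conversation request and reply."""
    def __init__(self, scheduler: Scheduler):
        self.scheduler = scheduler

    def on_message(self, text: str) -> str:
        return self.scheduler.dispatch(text)  # reply lands back in the chat


im = IMLayer(Scheduler([Worker("worker-1")]))
print(im.on_message("summarize today's tickets"))
```

The key property the sketch illustrates is that replies flow back through (or around) the scheduler to the originating conversation, while each Worker keeps its own memory between tasks.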
## 🌟 Star History

## 🤝 Community
- 🐛 Bug reports / Feature requests → GitHub Issues
- 🤝 Contributing → Please read the Contributing Guide. You must agree to the Contributor License Agreement (CLA) before submitting.
