
@theopenbee/cli

v0.0.32 · openbee CLI · 2,611 downloads

OpenBee is an around-the-clock digital worker solution, dedicated to making AI Agents your always-on, 24/7 assistants.

✨ Features

| 🤖 AI Workers | 💬 Multi-IM Support | 🧠 Persistent Memory | ⏰ Scheduled Tasks |
|:---:|:---:|:---:|:---:|
| Each Worker is an AI Agent capable of multi-step task planning and independent execution | Native support for Lark, DingTalk, WeCom, WeChat, and Telegram — receive and reply in the same conversation | Workers retain long-term memory across sessions, knowing context just like a real worker | Cron-based scheduling for automatic, hands-free triggering |
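The Scheduled Tasks feature above is cron-based. As a rough illustration of how five-field cron matching works in general (a toy sketch, not OpenBee's actual scheduler; the function name and supported syntax are made up for this example):

```python
from datetime import datetime

def cron_matches(expr: str, when: datetime) -> bool:
    """Check a 5-field cron expression (minute hour day month weekday)
    against a timestamp. Supports only '*' and plain numbers; a toy
    illustration, not OpenBee's scheduler."""
    fields = expr.split()
    # cron weekday convention: 0 = Sunday; Python weekday(): 0 = Monday
    values = [when.minute, when.hour, when.day, when.month,
              (when.weekday() + 1) % 7]
    return all(f == "*" or int(f) == v for f, v in zip(fields, values))

# "0 9 * * *" means: fire at 09:00 every day
print(cron_matches("0 9 * * *", datetime(2025, 1, 6, 9, 0)))
```

A real scheduler would evaluate expressions like this once per minute and trigger the matching Workers hands-free.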

🤖 Supported AI Engines

OpenBee supports multiple AI engines as the underlying execution backend:

| Engine | Description |
|:---:|:---|
| Claude Code | Anthropic's official agentic coding tool; default and recommended engine |
| Codex | OpenAI's Codex agent, supported via the plugin engine |
| Pi | Pi agent, supported via the plugin engine |
| Kimi | Moonshot AI's Kimi agent, supported via the plugin engine |

🚀 Quick Start

Step 1: Install

Via npm:

```
npm install -g @theopenbee/cli
```

The platform-specific binary is downloaded automatically. Supports Linux / macOS / Windows (amd64 & arm64).

Via install script (Linux / macOS):

```
curl -fsSL https://raw.githubusercontent.com/theopenbee/openbee/main/install.sh | bash
```

macOS (Homebrew):

```
brew install theopenbee/tap/openbee
```

Windows (Scoop):

```
scoop bucket add theopenbee https://github.com/theopenbee/scoop-bucket
scoop install theopenbee/openbee
```

Manual download: visit GitHub Releases, download the archive for your platform, extract it, and place the openbee executable in your PATH.

Step 2: Generate a config file

```
openbee config
```

The wizard will guide you through:

  • Claude executable path
  • IM platform(s) to enable (Lark / DingTalk / WeCom / WeChat / Telegram) and their credentials
  • Advanced options (can be skipped to use defaults)

The config file is written to config.yaml in the current directory by default. Use -o to specify a custom path:

```
openbee config -o /path/to/config.yaml
```
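For orientation, the generated file might look something like the sketch below. Every key name here is an illustrative assumption, not OpenBee's documented schema; run `openbee config` to produce the real file.

```yaml
# Illustrative sketch only: key names are assumptions,
# not OpenBee's documented config schema.
engine:
  type: claude-code               # default engine per the README
  executable: /usr/local/bin/claude
platforms:
  telegram:
    enabled: true
    token: "<your-bot-token>"     # credential collected by the wizard
server:
  port: 8080                      # Web Console default (see Step 4)
```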

Step 3: Start the service

```
openbee server -d
```

Step 4: Start using

  • Open the Web Console (default http://localhost:8080) to manage Workers and view task status
  • Send messages directly in any configured IM platform (Lark / DingTalk / WeCom / WeChat / Telegram) to interact with OpenBee

⚙️ How It Works

```mermaid
graph TD
    A["💬 IM Layer (Communication)\nLark / DingTalk / WeCom / WeChat / Telegram"] --> B["🧠 Scheduling Layer\nAI Agent"]
    B --> C["🤖 Execution Layer\nAI Agents"]
    C -. "Reply Results" .-> A
    B -. "Reply Results" .-> A
```

OpenBee consists of three core layers:

1. **IM Layer (Communication Layer)**: includes Lark, DingTalk, WeCom, WeChat, and Telegram. Users send messages through these platforms to interact with OpenBee and receive replies in the same conversation.

2. **Scheduling Layer (AI Agent)**: handles task scheduling. It receives messages from the IM layer, understands user intent, and dispatches tasks to the Execution layer; it can also reply directly to the IM layer.

3. **Execution Layer**: each Worker is an independent AI Agent equipped with persistent memory, tool invocation (CLI), and multi-step task planning. Workers execute assigned tasks autonomously and reply directly to the IM layer, just like real workers.
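The three-layer flow can be sketched as a toy model in plain Python. All function names and the dispatch rule are illustrative, with no relation to OpenBee's actual codebase:

```python
# Toy model of OpenBee's three layers; names are illustrative only.

def im_layer(message: str) -> str:
    """IM layer: receives a user message, returns the reply in the
    same conversation."""
    return scheduling_layer(message)

def scheduling_layer(message: str) -> str:
    """Scheduling layer: understands intent and either dispatches to
    a worker or replies directly."""
    if message.startswith("task:"):
        return execution_layer(message.removeprefix("task:").strip())
    return f"ack: {message}"          # direct reply, no worker needed

def execution_layer(task: str) -> str:
    """Execution layer: a worker plans multiple steps and executes
    them autonomously."""
    steps = [f"plan({task})", f"run({task})"]
    return f"done: {task} via {len(steps)} steps"

print(im_layer("hello"))              # direct-reply path
print(im_layer("task: backup"))       # dispatched to a worker
```

In the real system the dashed "Reply Results" edges in the diagram correspond to both return paths here: the scheduler answering directly, and a worker's result flowing back to the originating chat.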

🌟 Star History

[Star history chart on GitHub]

🤝 Community