
profclaw

v2.0.0

Smart, lightweight AI agent engine -- run locally on anything from a $10 board to a VPS

Why profClaw

Most AI coding tools are either cloud-only (Cursor, Devin) or single-purpose CLIs (Aider, SWE-Agent). profClaw is different:

  • Local-first - runs on your machine, your data stays local
  • Multi-agent orchestration - routes tasks to the right agent (Claude, GPT, Ollama) based on capability scoring, not just round-robin
  • Real task queue - BullMQ with dead letter queue, retry with backoff, priority scheduling
  • Built-in project management - tickets, Kanban, sprints, burndown charts, bi-directional sync with GitHub/Jira/Linear
  • 22 chat channels - talk to your AI through Slack, Discord, Telegram, WhatsApp, Teams, or 17 others
  • Cost tracking - per-token budget management with alerts at 50/80/100%
  • 72 built-in tools - file ops, git, browser automation, cron, web search, canvas, voice
  • Mode-aware - scales from a Raspberry Pi (pico) to a full team server (pro)

No other open-source tool combines task orchestration, project management, and cost tracking in one self-hosted package.

Features

| Feature | Details |
|---------|---------|
| 35 AI providers | Anthropic, OpenAI, Google, Groq, Ollama, DeepSeek, Bedrock, and more |
| 22 chat channels | Slack, Discord, Telegram, WhatsApp, iMessage, Matrix, Teams, and 15 more |
| 72 built-in tools | File ops, git, browser automation, cron, web search, canvas, voice |
| 50 skills | Coding agent, GitHub issues, Notion, Obsidian, image gen, and more |
| MCP server | Native Model Context Protocol - connect to any MCP-compatible client |
| Voice I/O | STT (Whisper) + TTS (ElevenLabs/OpenAI/system) + Talk Mode |
| Plugin SDK | Build and share third-party plugins via npm or ClawHub |
| 3 deployment modes | Pico (~140MB), Mini (~145MB), Pro (full features) |

Installation

npm (recommended)

npx profclaw onboard

This runs the zero-to-running wizard: picks your AI provider, sets up config, and starts the server.

Or install globally:

npm install -g profclaw@latest
profclaw setup
profclaw serve

Docker

docker run -d \
  -p 3000:3000 \
  -e ANTHROPIC_API_KEY=sk-ant-xxx \
  -e PROFCLAW_MODE=mini \
  ghcr.io/profclaw/profclaw:latest

Docker Compose

git clone https://github.com/profclaw/profclaw.git
cd profclaw
cp .env.example .env
docker compose up -d

Want free local AI? Add Ollama:

docker compose --profile ai up -d
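
As a hedged sketch of how such a Compose profile is typically wired (the service name, image tag, and volume below are assumptions, not taken from profClaw's actual compose file), the ai profile might look like:

```yaml
# Hypothetical sketch of an "ai" profile service -- check the repo's
# docker-compose.yml for the real definitions.
services:
  ollama:
    image: ollama/ollama:latest
    profiles: ["ai"]            # only starts with: docker compose --profile ai up -d
    ports:
      - "11434:11434"           # default Ollama API port
    volumes:
      - ollama_data:/root/.ollama   # persist downloaded models across restarts

volumes:
  ollama_data:
```

Profiles let the Ollama container stay defined but dormant unless explicitly requested, so the default docker compose up -d stays lightweight.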

One-liner (macOS/Linux)

curl -fsSL https://raw.githubusercontent.com/profclaw/profclaw/main/install.sh | bash

Deployment Modes

profClaw scales from a Raspberry Pi to a full production server:

| Mode | What you get | RAM | Best for |
|------|--------------|-----|----------|
| pico | Agent + tools + 1 chat channel + cron. No UI. | ~140MB | Raspberry Pi, $5 VPS, home server |
| mini | + Dashboard, integrations, 3 channels | ~145MB | Personal dev server, small VPS |
| pro | + All channels, Redis queues, plugins, browser tools | ~120-200MB | Teams, production |

Set the mode with the PROFCLAW_MODE environment variable (pico, mini, or pro).
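
For an always-on board, a systemd unit keeps the mode setting persistent across reboots. This is a minimal sketch, not from profClaw's docs: the binary path (assuming a global npm install) and the pi user are assumptions.

```ini
# /etc/systemd/system/profclaw.service -- hypothetical unit; adjust
# ExecStart and User to match your install.
[Unit]
Description=profClaw agent (pico mode)
After=network-online.target
Wants=network-online.target

[Service]
Environment=PROFCLAW_MODE=pico
Environment=PORT=3000
# Path assumed from `npm install -g profclaw`
ExecStart=/usr/local/bin/profclaw serve
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now profclaw.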

Where It Runs

| Hardware | RAM | Recommended mode |
|----------|-----|------------------|
| Raspberry Pi Zero 2W | 512MB | pico |
| Raspberry Pi 3/4/5 | 1-8GB | mini or pro |
| Orange Pi / Rock Pi | 1-4GB | mini or pro |
| $5/mo VPS (Hetzner, OVH) | 512MB-1GB | pico or mini |
| Old laptop / home PC | 4-16GB | pro |
| Docker (alongside other services) | 512MB+ | any mode |
| Old Android phone (Termux) | 1-2GB | pico |

profClaw requires Node.js 22+. For bare-metal embedded devices (ESP32, Arduino), see MimiClaw (C) or PicoClaw (Go).

Configuration

The setup wizard (profclaw setup) handles everything interactively. Or set environment variables:

# AI Provider (pick one)
ANTHROPIC_API_KEY=sk-ant-xxx
OPENAI_API_KEY=sk-xxx
OLLAMA_BASE_URL=http://localhost:11434

# Deployment
PROFCLAW_MODE=mini
PORT=3000

# Optional
REDIS_URL=redis://localhost:6379   # Required for pro mode

See .env.example for all options.

AI Providers

35 providers with 90+ model aliases:

| Provider | Models | Local? |
|----------|--------|--------|
| Anthropic | Claude 4.x, 3.5 | No |
| OpenAI | GPT-4o, o1, o3 | No |
| Google | Gemini 2.x | No |
| Ollama | Llama, Mistral, Qwen, ... | Yes |
| AWS Bedrock | Claude, Titan, Llama | No |
| Groq | Fast inference | No |
| DeepSeek | V3, R1 | No |
| Azure OpenAI | GPT-4o | No |
| xAI | Grok | No |
| OpenRouter | Any model | No |
| Together | Open models | No |
| Fireworks | Open models | No |
| Mistral | Mistral Large, Codestral | No |
| ... and 22 more | HuggingFace, NVIDIA NIM, Cerebras, Replicate, Zhipu, Moonshot, Qwen, etc. | - |

Chat Channels

| Channel | Protocol |
|---------|----------|
| Slack | Bolt SDK |
| Discord | HTTP Interactions |
| Telegram | Bot API |
| WhatsApp | Cloud API |
| WebChat | SSE (browser-based) |
| Matrix | Client-Server API |
| Google Chat | Webhook + API |
| Microsoft Teams | Bot Framework |
| iMessage | BlueBubbles |
| Signal | signald bridge |
| IRC | TLS, RFC 1459 |
| LINE | Messaging API |
| Mattermost | REST API v4 |
| DingTalk | OpenAPI + webhook |
| WeCom | WeChat Work API |
| Feishu/Lark | Open Platform |
| QQ | Bot API |
| Nostr | Relay protocol |
| Twitch | Helix API + IRC |
| Zalo | OA API v3 |
| Nextcloud Talk | OCS API |
| Synology Chat | Webhook |

Integrations

| Platform | Features |
|----------|----------|
| GitHub | Webhooks, OAuth, issue sync, PR automation |
| Jira | Webhooks, OAuth, issue sync, transitions |
| Linear | Webhooks, OAuth, issue sync |

Architecture

src/
  adapters/       AI agent adapters (tool calling, streaming)
  chat/           Chat engine + execution pipeline
    providers/    Slack, Discord, Telegram, WhatsApp, WebChat, ...
    execution/    Tool executor, sandbox, agentic loop
  core/           Deployment modes, feature flags
  integrations/   GitHub, Jira, Linear webhooks
  queue/          BullMQ (pro) + in-memory (pico/mini) task queue
  providers/      35 AI SDK providers
  skills/         Skill loader and registry
  mcp/            MCP server (stdio + SSE)
  types/          Shared TypeScript types

ui/src/           React 19 + Vite dashboard (mini/pro only)
skills/           Built-in skill definitions
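
The src/mcp/ directory above exposes the MCP server over stdio and SSE. As a hedged sketch of how a stdio-based MCP client could register it (the mcp subcommand below is an assumption, not confirmed by profClaw's docs), a typical client config entry looks like:

```json
{
  "mcpServers": {
    "profclaw": {
      "command": "npx",
      "args": ["profclaw", "mcp"]
    }
  }
}
```

This follows the common mcpServers config shape used by MCP-compatible clients; consult profClaw's own docs for the exact command.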

Development

git clone https://github.com/profclaw/profclaw.git
cd profclaw
pnpm install
cp .env.example .env
pnpm dev

See CONTRIBUTING.md for full guidelines.

Security

See SECURITY.md for our security policy and how to report vulnerabilities.

License

AGPL-3.0