
gigaclaw v1.9.2

GigaClaw — Gignaati's autonomous AI agent platform. Build, deploy, and run AI agents 24/7 with a two-layer architecture: Next.js Event Handler + Docker Agent. India-first, edge-native AI.

GigaClaw

Autonomous AI Agent Platform — Powered by Gignaati

License: MIT

Build, deploy, and run autonomous AI agents 24/7.
India-first. Edge-native. Zero vendor lock-in.

Website · Documentation · Issues · Discussions


What is GigaClaw?

GigaClaw is a self-hosted, autonomous AI agent platform. You deploy it to your own server or VPS, and it runs 24/7 — responding to messages, executing scheduled jobs, handling webhooks, writing code, managing files, and completing complex multi-step tasks.

It is built on a two-layer architecture:

  • Event Handler — A Next.js server that handles real-time chat (web UI + Telegram), manages your agent's configuration, and creates jobs for the agent to execute.
  • Agent Engine — A Docker container that runs your agent jobs using GitHub Actions or a local Docker daemon. The agent can write code, run shell commands, browse the web, and interact with GitHub.
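The two layers above might be wired together with a Compose file roughly like the following. This is an illustrative sketch only: the service names, image name, and port are assumptions, and the project's actual deployment also includes a Traefik reverse proxy (see the Infrastructure section), which is omitted here.

```yaml
# Hypothetical sketch — service and image names are assumptions, not the shipped config
services:
  event-handler:
    build: .                # Layer 1: Next.js server — chat, config, job creation
    ports:
      - "3000:3000"
  agent-engine:
    image: gigaclaw/agent   # assumed image name; Layer 2 executes agent jobs
    depends_on:
      - event-handler
```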

GigaClaw is the only autonomous agent platform with native PragatiGPT support — India's indigenous Small Language Model for edge deployment, delivering 100% data privacy and zero foreign cloud dependency.


One-Line Install

Linux / macOS

curl -fsSL https://raw.githubusercontent.com/gignaati/gigaclaw/main/install.sh | bash

Windows (PowerShell)

irm https://raw.githubusercontent.com/gignaati/gigaclaw/main/install.ps1 | iex

All Platforms (npm / npx)

# Create a new GigaClaw project
mkdir my-gigaclaw && cd my-gigaclaw
npx gigaclaw@latest init

# Then run the interactive setup wizard
npm run setup

Prerequisites: Node.js 18+, Docker, Git
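Before running the installer, a quick pre-flight check can confirm the three prerequisites above are on your `PATH` (the tool names come directly from the Prerequisites line; the script itself is just a convenience sketch):

```shell
# Pre-flight check for the prerequisites listed above: Node.js, Docker, Git
missing=0
for tool in node docker git; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found ($("$tool" --version 2>/dev/null | head -n 1))"
  else
    echo "$tool: MISSING"
    missing=$((missing + 1))
  fi
done
echo "pre-flight done: $missing tool(s) missing"
```

If anything reports MISSING, install it before running `install.sh` or `npx gigaclaw init`.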


Quick Start (5 Steps)

Step 1 — Create a new GitHub repository for your agent (e.g., my-gigaclaw).

Step 2 — Install GigaClaw into a local folder with the same name:

mkdir my-gigaclaw && cd my-gigaclaw
npx gigaclaw@latest init
npm install

Step 3 — Run the setup wizard:

npm run setup

The wizard will ask for your setup mode:

  • Hybrid (recommended) — Cloud + Local AI with smart routing
  • Cloud — GitHub + ngrok + Telegram, full features
  • Local — Ollama only, 100% offline

Step 4 — Start your agent:

docker compose up -d

Step 5 — Chat with your agent at your APP_URL.


Supported LLM Providers

GigaClaw supports 6 LLM providers — more than any other self-hosted agent platform:

| Provider | Description | Data Privacy |
|---|---|---|
| PragatiGPT | Gignaati's India-first SLM — edge-native, on-premise | 100% — no foreign cloud |
| Ollama | Run any open-source model locally (Llama, Mistral, Qwen, Phi) | 100% — fully local |
| Claude (Anthropic) | claude-opus-4, claude-sonnet-4, claude-haiku-4 | Anthropic's servers |
| GPT (OpenAI) | gpt-5.2, gpt-4o, o4-mini | OpenAI's servers |
| Gemini (Google) | gemini-3.1-pro, gemini-2.5-flash | Google's servers |
| Custom API | Any OpenAI-compatible endpoint (vLLM, LM Studio, Together AI) | Depends on endpoint |

Set your provider in .env (choose exactly one):

LLM_PROVIDER=pragatigpt   # India-first, edge-native
LLM_PROVIDER=ollama       # Fully local, zero cloud
LLM_PROVIDER=anthropic    # Claude (default)
LLM_PROVIDER=openai       # GPT
LLM_PROVIDER=google       # Gemini
LLM_PROVIDER=custom       # Any OpenAI-compatible API

Features

Agent Capabilities

  • Web Chat — Chat with your agent at your APP_URL
  • Telegram — Connect a Telegram bot with npm run setup-telegram
  • Scheduled Jobs — Cron-based recurring tasks via config/CRONS.json
  • Webhook Triggers — POST to /api/create-job to trigger jobs programmatically
  • Code Workspace — Full terminal and code editor in the browser
  • File Uploads — Upload images, PDFs, and text files to the chat
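The webhook trigger above accepts a POST to `/api/create-job`. The endpoint path comes from the feature list; the payload field name below is an illustrative assumption, not a documented schema:

```shell
# Field names in this payload are assumptions — check your TRIGGERS.json for the real shape
payload='{"prompt": "Summarize open issues and draft a status update"}'
echo "$payload"
# With the stack running, the POST might look like (not executed here):
#   curl -X POST "$APP_URL/api/create-job" \
#     -H "Content-Type: application/json" -d "$payload"
```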

Agent Tools

  • Code execution — Write and run code in any language
  • Shell commands — Execute terminal commands
  • Web search — Search the internet for up-to-date information
  • GitHub integration — Create PRs, manage issues, push commits
  • File system — Read, write, and manage files in the repository

Infrastructure

  • Docker Compose — One-command deployment with Traefik reverse proxy
  • Auto SSL — Let's Encrypt certificates via Traefik
  • GitHub Actions — Agent jobs run in isolated Docker containers
  • Auto-merge — Agent can merge its own PRs after review
  • Hot reload — Push to main triggers automatic rebuild and restart

GigaClaw Exclusive Features

  • Hybrid Mode — Cloud + Local AI with smart per-task routing (v1.6.0)
  • PragatiGPT — India's indigenous SLM for edge deployment
  • Ollama — Run any open-source model with zero cloud dependency
  • Multi-LLM routing — Different LLMs for chat vs. agent jobs
  • Per-job LLM override — Specify llm_provider and llm_model per cron job
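The per-job override above implies that entries in `config/CRONS.json` can carry their own provider and model. A sketch of what such an entry might look like — only `llm_provider` and `llm_model` come from the feature list; every other field name here is an assumption:

```json
{
  "jobs": [
    {
      "name": "nightly-digest",
      "schedule": "0 2 * * *",
      "prompt": "Summarize the day's repository activity",
      "llm_provider": "ollama",
      "llm_model": "llama3.2"
    }
  ]
}
```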

Hybrid Mode (New in v1.6.0)

Run both cloud and local LLMs simultaneously. GigaClaw automatically routes each task to the best provider.

npm run setup   # Choose "Hybrid Mode" (recommended)

Routing Strategies

| Strategy | Best for |
|---|---|
| Auto | Smart routing — complex tasks go to cloud, simple ones stay local |
| Cost-Optimized | Minimize API costs — local by default, cloud only when needed |
| Quality-First | Best output quality — cloud by default, local for drafts |
| Privacy-First | Maximum data privacy — local by default, cloud only for complex tasks |

How it works

  1. Setup configures a cloud provider (Claude, GPT, Gemini, PragatiGPT) and a local provider (Ollama)
  2. Each message is scored for complexity and privacy sensitivity
  3. The task router picks the optimal provider based on your chosen strategy
  4. Ollama availability is auto-detected at runtime — no reconfiguration needed

# Example .env for hybrid mode
GIGACLAW_MODE=hybrid
LLM_PROVIDER=anthropic           # Cloud (primary)
LLM_MODEL=claude-sonnet-4-6
LOCAL_LLM_PROVIDER=ollama         # Local (secondary)
LOCAL_LLM_MODEL=llama3.2
HYBRID_ROUTING=auto               # auto | cost-optimized | quality-first | privacy-first

CLI Commands

npx gigaclaw init                               # Scaffold or update project files
npx gigaclaw setup                              # Run the interactive setup wizard
npx gigaclaw setup-telegram                     # Configure Telegram bot
npx gigaclaw upgrade [@beta|version]            # Upgrade (latest, @beta, or a specific version)
npx gigaclaw reset-auth                         # Regenerate AUTH_SECRET
npx gigaclaw reset [file]                       # Restore a template file
npx gigaclaw diff [file]                        # Show differences vs. templates
npx gigaclaw set-agent-secret <KEY> [VALUE]     # Set GitHub secret (AGENT_ prefix)
npx gigaclaw set-agent-llm-secret <KEY> [VALUE] # Set LLM secret (AGENT_LLM_ prefix)
npx gigaclaw set-var <KEY> [VALUE]              # Set GitHub repository variable

Configuration Files

These files in config/ define your agent's personality and behavior. They are yours to customize — GigaClaw will never overwrite them:

| File | Purpose |
|---|---|
| SOUL.md | Your agent's identity, personality, and values |
| JOB_PLANNING.md | How your agent plans and breaks down jobs |
| JOB_AGENT.md | Instructions for executing jobs |
| CRONS.json | Scheduled recurring jobs |
| TRIGGERS.json | Webhook trigger definitions |
| HEARTBEAT.md | Tasks for the periodic heartbeat cron |
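Since SOUL.md is free-form Markdown defining the agent's identity, a starter file might look like the following — the contents are entirely illustrative, not a shipped template:

```markdown
# SOUL.md — agent identity (illustrative example)

## Who you are
A pragmatic engineering assistant for the my-gigaclaw repository.

## Values
- Prefer small, reviewable pull requests.
- Ask before running destructive operations.
```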


Updating

npx gigaclaw upgrade          # Latest stable
npx gigaclaw upgrade @beta    # Latest beta
npx gigaclaw upgrade 1.2.72   # Specific version

Deployment

GigaClaw runs on any Linux server with Docker. Recommended:

| Provider | Spec | Monthly Cost |
|---|---|---|
| Hetzner CX22 | 2 vCPU, 4 GB RAM | ~€4 |
| DigitalOcean Droplet | 2 vCPU, 4 GB RAM | ~$24 |
| AWS EC2 t3.small | 2 vCPU, 2 GB RAM | ~$15 |
| Your own hardware | Any Linux machine | ₹0 |

For local development, use ngrok to expose your machine:

ngrok http 80
# Then update APP_URL: npx gigaclaw set-var APP_URL https://your-url.ngrok.io


Contributing

Contributions are welcome! Please read CONTRIBUTING.md before submitting a pull request.



Built with care by Gignaati — India's Edge AI Ecosystem

Privacy Policy · Terms of Service · Security