
@lincoln504/pi-research

v0.1.13

A set of web research tools (search, scrape, and others) and research agent system that uses them safely and efficiently.


Multi-agent web research extension for pi. Runs parallel AI researchers against a self-hosted SearXNG search container, scrapes pages, queries security databases, and tracks everything in a real-time terminal UI.

There are other ways to connect a pi session to the internet. Here is why this one is designed the way it is:

  • SearXNG-based — free, no API keys required by default. A single Docker container is shared across all concurrent research calls on the same machine; no duplicate instances, no per-session overhead. Point SEARXNG_URL at an existing instance to skip container management entirely.
  • Unified subagent system — web access, scraping, and synthesis are handled by isolated researcher agents. Research context stays out of the main chat session while still enabling thorough multi-source exploration.
  • Safety — researcher agents have no access to file writes, file edits, or shell commands. All web tools are scoped to the subagent system and executed in a structured, tool-limited sequence to keep agents focused and prevent runaway use.
  • Minimal integration — a system prompt hook provides light guidance to the host LLM on when and how to invoke the tool. No manual prompt engineering required. The extension does not conflict with other subagent or tool systems, and the unified toolset means no need to wire up or tune multiple separate tools.
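
If you already run SearXNG elsewhere, the container-management path can be bypassed entirely. A minimal sketch, assuming an instance listening on localhost:8080 (the URL is illustrative):

```shell
# Point the extension at an existing SearXNG instance; when this is set,
# no Docker container is started or managed by the extension.
export SEARXNG_URL="http://localhost:8080"
```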


Capabilities

  • Multi-agent deep research — AI coordinator decomposes a query into parallel research tracks; an AI lead evaluator synthesizes or delegates further rounds
  • Quick mode — single researcher for fast, focused queries with no coordinator overhead
  • Web search — SearXNG running in Docker, aggregating Bing, Brave, Yahoo, Qwant, DuckDuckGo, Wikipedia, GitHub, arXiv, and Semantic Scholar (engines can be enabled or disabled via config)
  • URL scraping — fetch-first with Playwright/Chromium fallback for JavaScript-heavy pages
  • Security databases — NVD, CISA KEV, GitHub Advisories, OSV
  • Stack Exchange — full network search and filtering
  • Real-time TUI — per-researcher token and cost tracking
── Research: 70% ─────────────────────────────────────
┌───────┬──── 1 ───────┬─── 2 ───────┬─── 3 ───────┐
│SearXNG│     18k      │     36k     │     50k     │
│:55732 │   $0.0056    │   $0.0055   │   $0.0071   │
└───────┴──────────────┴─────────────┴─────────────┘

Requirements

  • Node.js 22.13.0 or any 24.x+ release
  • The pi CLI installed and configured
  • Docker daemon running (required for SearXNG)
  • Internet access
  • An LLM configured in pi with a context window of at least 100k tokens
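
A quick preflight sketch for the list above; each check is informational only, and the `pi` binary name is assumed to be on your PATH:

```shell
# Check the requirements before installing (never fails, only reports).
command -v node >/dev/null 2>&1 && node --version || echo "Node.js: not found"
command -v pi   >/dev/null 2>&1 && echo "pi CLI: found" || echo "pi CLI: not found"
docker info     >/dev/null 2>&1 && echo "Docker daemon: running" || echo "Docker daemon: not running"
```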

Install

pi install npm:@lincoln504/pi-research

The installation will automatically download and install:

  • Node.js dependencies
  • Playwright browser binaries (Chromium)

This process may take a few minutes on first install. If browser installation fails during postinstall, Chromium will be installed automatically the first time you use the research tool.

Manual browser installation (if needed)

npm run install:browsers

From a checkout

pi install .

Platform Setup

macOS — Install Docker Desktop and start it from Applications.

Linux — Install Docker for your distribution (docs.docker.com/engine/install), then:

# Start Docker
sudo systemctl start docker
sudo systemctl enable docker

# Add your user to docker group (then log out and back in)
sudo usermod -aG docker $USER

Windows — Install Docker Desktop (Linux containers). Named-pipe detection is automatic.

Note: The first time you use pi install on a fresh system, Docker may not be running yet. The extension will check Docker when you actually use the research tool, not during install.


Usage

Ask pi for web research. The extension registers a research tool.

Research "What is a binary search tree?"
Do a deep dive on CVE-2024-3094 at depth 2

| Depth | Mode | Researchers | Target rounds |
|-------|------|-------------|---------------|
| 0 (default) | Quick | 1 | 1 |
| 1 | Normal | 2 | 2 |
| 2 | Deep | 3 | 3 |
| 3 | Ultra | 5 | 5 |


Configuration

| Variable | Default | Description |
|----------|---------|-------------|
| SEARXNG_URL | — | External SearXNG base URL (e.g. http://localhost:8080). When set, Docker container management is skipped entirely and this URL is used directly |
| BRAVE_SEARCH_API_KEY | — | Enables braveapi engine (official REST API) |
| PI_RESEARCH_VERBOSE | — | Set to 1 to write diagnostic JSONL logs to a temp file (see below) |
| PI_RESEARCH_SKIP_HEALTHCHECK | — | Skip startup health check (set to 1 if engines are blocked) |
| PROXY_URL | — | Proxy for SearXNG outgoing requests (e.g. socks5://127.0.0.1:9050) |
| STACKEXCHANGE_API_KEY | — | Stack Exchange API key (increases limit from 300 to 10,000/day) |
| PI_RESEARCH_SEARCH_LANGUAGE | en-US | BCP 47 language tag for search result filtering |
| PI_RESEARCH_RESEARCHER_TIMEOUT_MS | 240000 | Per-researcher timeout (30s to 10m) |
| PI_RESEARCH_MAX_CONCURRENT_RESEARCHERS | 3 | Maximum researchers running simultaneously (1 to 10) |
| PI_RESEARCH_CONSOLE_RESTORE_DELAY_MS | 15000 | Delay before TUI clears after research ends |
| SEARXNG_IMAGE_TAG | latest | SearXNG Docker image tag — pin to a specific release for reproducibility (e.g. 2024.12.23) |
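
As a sketch, a session might combine several of these variables; the values below are illustrative examples, not recommendations:

```shell
# Illustrative configuration: external SearXNG, German-filtered results,
# a 5-minute per-researcher cap, and up to 5 concurrent researchers.
export SEARXNG_URL="http://localhost:8080"
export PI_RESEARCH_SEARCH_LANGUAGE="de-DE"
export PI_RESEARCH_RESEARCHER_TIMEOUT_MS=300000
export PI_RESEARCH_MAX_CONCURRENT_RESEARCHERS=5
```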

Verbose logging

Set PI_RESEARCH_VERBOSE=1 (or pass --verbose) to enable diagnostic output. Logs are written as JSONL to {os.tmpdir()}/pi-research-debug-{hash}.log — one file per session, never written to stdout. Each log line includes a session correlation ID so multiple concurrent sessions are distinguishable. The log path is printed to stderr on startup when verbose mode is active.
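
To find the most recent debug log after a verbose run, glob rather than hard-code the filename, since the hash component is per-session. A minimal sketch:

```shell
# Enable verbose logging for subsequent research runs.
export PI_RESEARCH_VERBOSE=1
# Print the newest pi-research debug log, if any exist yet.
ls -t "${TMPDIR:-/tmp}"/pi-research-debug-*.log 2>/dev/null | head -n 1
```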


Development & Release

Commands

  • npm run lint / npm run lint:fix: Code quality
  • npm run type-check: TypeScript verification
  • npm run test:unit: Unit tests — no dependencies
  • npm run test:integration: Integration tests — requires Docker; uses Testcontainers to spin up a real SearXNG instance (runs locally and in CI)

Release

  1. npm version patch (or minor/major)
  2. git push origin main --tags

The GitHub Action will automatically publish to npm.

Architecture

See docs/ARCHITECTURE.md for detailed design information.