
meta-surfer v0.3.0

AI-powered web search engine: CLI, library, and optional web UI. A self-hosted Perplexity alternative with multi-provider LLM support.

Meta Surfer

A self-hosted, AI-powered web search engine. An open-source alternative to Perplexity that runs entirely on your own infrastructure.


Features

  • Multi-provider LLM support -- OpenAI, Google Gemini, Anthropic Claude, xAI Grok, and Z.AI out of the box, with automatic provider detection
  • Self-hosted search -- SearXNG meta search engine for private, untracked web queries
  • Three interfaces -- CLI tool, importable Node.js library, and optional Next.js web UI
  • Deep research mode -- Autonomous multi-step research that plans queries, searches, scrapes pages, and synthesizes answers
  • Code execution -- Sandboxed code running via Piston for calculations and data analysis
  • Streaming -- Real-time streamed responses in both CLI and library modes

Quick Start

# 1. Clone and install
git clone https://github.com/hun-meta/meta-surfer.git
cd meta-surfer
npm install

# 2. Configure your LLM provider
cp .env.example .env.local
# Edit .env.local — set at least one API key (e.g. OPENAI_API_KEY)

# 3. Start external services
docker compose up -d

# 4. Ask your first question
npx tsx src/cli.ts ask "What is the current population of Tokyo?"

Installation

# As a project dependency
npm install meta-surfer

# Global install for CLI usage
npm install -g meta-surfer

After global install, the meta-surfer command is available system-wide.

Configuration

Copy the example environment file and set your API keys:

cp .env.example .env.local
# Pick a provider — only one API key is required.
# The provider is auto-detected from whichever key is set.
OPENAI_API_KEY=sk-...

# External services (defaults match docker-compose.yml)
SEARXNG_URL=http://localhost:8080
CRAWL4AI_URL=http://localhost:11235
PISTON_URL=http://localhost:2000

You can also set LLM_PROVIDER and LLM_MODEL explicitly to override auto-detection. See docs/providers.md for full configuration details.

Usage

CLI

# AI-powered search
meta-surfer ask "How does photosynthesis work?"

# Deep research mode
meta-surfer ask "Compare React and Vue in 2025" --mode extreme

# Stream the response
meta-surfer ask "Latest news on SpaceX" --stream

# Raw web search (no AI)
meta-surfer search "TypeScript 5.7 release notes" -n 5

# Scrape a web page
meta-surfer scrape https://example.com --json

# Execute code in a sandbox
meta-surfer execute python -c "print(sum(range(100)))"

# Deep autonomous research
meta-surfer research "Impact of AI on healthcare" --ai

# Start the web UI
meta-surfer serve --port 3000

Global options (--provider, --base-url, --api-key, --model, --searxng, --crawl4ai, --piston) can be passed before any command to override environment configuration.

Library

import { configure, ask, chat } from "meta-surfer";

// Configure the provider (or rely on env vars)
configure({
  provider: "openai",
  apiKey: process.env.OPENAI_API_KEY,
});

// Simple question — returns the full answer
const answer = await ask({ query: "What is quantum computing?" });
console.log(answer);

// Streaming chat
const result = chat({
  messages: [{ role: "user", content: "Explain REST vs GraphQL" }],
  mode: "web",
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

The library also exports individual tools (searchMultiQuery, scrapeUrls, executeCode, extremeSearch) for lower-level usage.

Web UI

An optional Next.js 14 web interface is included.

# Development
npm run dev

# Production
npm run build:web && npm start

# Or via Docker
docker compose --profile web up

Supported Providers

| Provider  | Default Model       | Env Var             |
|-----------|---------------------|---------------------|
| OpenAI    | gpt-5.1-chat-latest | OPENAI_API_KEY      |
| Google    | gemini-3.1-flash    | GOOGLE_API_KEY      |
| Anthropic | claude-opus-4.6     | ANTHROPIC_API_KEY   |
| xAI       | grok-4.1            | XAI_API_KEY         |
| Z.AI      | glm-5               | ZAI_API_KEY         |

Set one or more API keys in your .env.local. The provider is auto-detected from whichever key is present (checked in the order listed above). Override with LLM_PROVIDER and LLM_MODEL if needed.
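The detection logic can be pictured as a first-match scan over the environment in the order listed above. The sketch below is an illustration of that rule, not the package's actual source; the provider names and environment variables follow the table, and the `detectProvider` helper is hypothetical:

```typescript
// Minimal sketch of first-match provider auto-detection.
// Assumption: this mirrors the documented order, not Meta Surfer's real code.
const DETECTION_ORDER: Array<[provider: string, envVar: string]> = [
  ["openai", "OPENAI_API_KEY"],
  ["google", "GOOGLE_API_KEY"],
  ["anthropic", "ANTHROPIC_API_KEY"],
  ["xai", "XAI_API_KEY"],
  ["zai", "ZAI_API_KEY"],
];

function detectProvider(env: Record<string, string | undefined>): string {
  // An explicit LLM_PROVIDER always wins over auto-detection.
  if (env.LLM_PROVIDER) return env.LLM_PROVIDER;
  for (const [provider, envVar] of DETECTION_ORDER) {
    if (env[envVar]) return provider;
  }
  throw new Error("No LLM API key set; add one of the keys listed above.");
}
```

So with both `GOOGLE_API_KEY` and `XAI_API_KEY` set, Google is chosen because it appears earlier in the order, unless `LLM_PROVIDER` overrides it.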

Architecture

Meta Surfer chains three self-hosted services with an LLM to answer questions:

User Query
    |
    v
 SearXNG  ------>  Crawl4AI  ------>  LLM  ------>  Answer
 (search)          (scrape)          (synthesize)
    |
    v
 Piston (optional code execution)

External services:

| Service  | Required?   | Description                           | Fallback                                       |
|----------|-------------|---------------------------------------|------------------------------------------------|
| SearXNG  | Required    | Meta search engine for web queries    | None -- search fails without it                |
| Crawl4AI | Recommended | Web page scraper for detailed content | Built-in HTML fetcher (enhancedFetch)          |
| Piston   | Optional    | Sandboxed code execution              | Code execution unavailable; search still works |

These services communicate over HTTP. The included docker-compose.yml is the easiest way to run them locally, but you can also host them on a remote server or use any other deployment method -- Meta Surfer only needs their URLs.

The core engine (src/engine.ts) orchestrates these services as LLM tool calls using the Vercel AI SDK, allowing the model to decide when to search, read pages, or run code.
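The search, scrape, synthesize chain from the diagram above can be sketched as plain function composition. Everything below is stubbed for illustration: the function bodies and names (`searxngSearch`, `crawl4aiScrape`, `llmSynthesize`) are placeholders, not Meta Surfer's real implementation, and the real engine lets the model decide when to invoke each step rather than running them in a fixed order:

```typescript
// Stubbed sketch of the search -> scrape -> synthesize pipeline.
type Doc = { url: string; text: string };

async function searxngSearch(query: string): Promise<string[]> {
  // Real version: query the SearXNG HTTP API. Here: canned URLs.
  return ["https://example.com/a", "https://example.com/b"];
}

async function crawl4aiScrape(url: string): Promise<Doc> {
  // Real version: ask Crawl4AI to fetch and clean the page. Here: canned text.
  return { url, text: `content of ${url}` };
}

async function llmSynthesize(query: string, docs: Doc[]): Promise<string> {
  // Real version: an LLM call with the scraped pages as context.
  return `Answer to "${query}" based on ${docs.length} sources.`;
}

async function answer(query: string): Promise<string> {
  const urls = await searxngSearch(query);                  // 1. search
  const docs = await Promise.all(urls.map(crawl4aiScrape)); // 2. scrape
  return llmSynthesize(query, docs);                        // 3. synthesize
}
```

The fixed pipeline here is the degenerate case; wiring these three functions up as LLM tools (as the engine does) lets the model skip, repeat, or reorder steps per query.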

Documentation

| Guide             | Description                                         |
|-------------------|-----------------------------------------------------|
| Getting Started   | Prerequisites, installation, and first query        |
| CLI Guide         | Full command reference with examples                |
| Library Guide     | Programmatic usage, API reference, integrations     |
| Providers         | LLM provider configuration and auto-detection       |
| Web UI Guide      | Next.js web interface setup                         |
| Docker Setup      | External services configuration and troubleshooting |
| Commit Convention | Git commit message format                           |

Contributing

Contributions are welcome. See CONTRIBUTING.md for guidelines.

License

MIT -- Copyright (c) hun-meta