
@darkvisitors/sdk

v1.6.0


The official Node.js SDK for Dark Visitors: AI agent analytics (bot traffic), LLM referral tracking, automatic robots.txt


Dark Visitors SDK


This library provides convenient access to Dark Visitors from server-side TypeScript or JavaScript.

Install the Package

Install the package from npm:

npm install @darkvisitors/sdk

Initialize the Client

Sign up for Dark Visitors, create a project, and copy your access token from the project's settings page. Then, create a new instance of DarkVisitors.

import { DarkVisitors } from "@darkvisitors/sdk"

const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")

How To Set Up Agent & LLM Analytics (Full Docs)

Get realtime insight into the hidden ecosystem of crawlers, scrapers, AI agents, and other bots browsing your website. Measure human traffic coming from AI chat and search platforms like ChatGPT, Perplexity, and Gemini.

To collect this data, call trackVisit for each incoming request in the endpoints where you serve your pages.

darkVisitors.trackVisit(incomingRequest)

Use Middleware if Possible

If you can, add this in middleware to track incoming requests to all pages from a single place.

Here's an example with Express, but you can apply this same technique with other frameworks:

import express from "express"
import { DarkVisitors } from "@darkvisitors/sdk"

const app = express()
const darkVisitors = new DarkVisitors("YOUR_ACCESS_TOKEN")

app.use((req, res, next) => {
    darkVisitors.trackVisit(req)
    next()
})

app.get("/", (req, res) => {
    res.send("Hello, world!")
})

app.listen(3000, () => console.log("Server running on port 3000"))

Test Your Integration

  • Open your project's settings page
  • Click Send a Test Visit
  • Click Realtime

If your website is correctly connected, you should see visits from the Dark Visitor agent in the realtime timeline within a few seconds.

How To Set Up Automatic Robots.txt (Full Docs)

Protect sensitive content from unwanted access and scraping. Generate a robots.txt that automatically stays up to date with all current and future bots in the categories you specify.

Use the generateRobotsTxt function, passing the AgentTypes you want to block and a string specifying which URLs are disallowed (e.g. "/" to disallow all paths).

import { AgentType } from "@darkvisitors/sdk"

const robotsTxt = await darkVisitors.generateRobotsTxt([
    AgentType.AIDataScraper,
    AgentType.Scraper,
    AgentType.IntelligenceGatherer,
    AgentType.SEOCrawler
    // ...
], "/")

The return value is a plain text robots.txt string. Regenerate it periodically (e.g. once per day), then cache and serve it from your website's /robots.txt endpoint.

Requirements

TypeScript >= 4.7 is supported.

The following runtimes are supported:

  • Node.js 18 LTS and all later non-EOL versions.

Support

Please open an issue with questions, bugs, or suggestions.