
safeguard-ai

v1.0.1

AI content moderation, PII detection, and safety toolkit for developers. Filter toxic content, detect personal information, and ensure GDPR compliance in your AI applications.

🛡️ SafeguardAI - AI Content Moderation & Safety Toolkit

The complete AI content moderation solution for developers. Protect your AI applications from toxic content, detect PII automatically, and ensure compliance with GDPR, COPPA, and HIPAA.

Perfect for: ChatGPT apps, Claude integrations, AI chatbots, user-generated content platforms, and LLM applications.

⚡ Quick Start

```shell
npm install safeguard-ai
```

```javascript
const SafeguardAI = require('safeguard-ai');

const moderator = new SafeguardAI();

// Basic usage (requires API key unless using mock)
const result = await moderator.checkText("Check this text for safety");

if (!result.safe) {
  console.log('Content flagged:', result.categories);
}
```

🚀 Features

  • Text Moderation - Detect toxicity, hate speech, violence, and sexual content
  • PII Detection - Find and redact emails, phone numbers, SSNs, and credit card numbers
  • Multi-Provider Support - Currently supports OpenAI, with more providers coming soon
  • Custom Rules - Add your own blocked words and patterns
  • GDPR Compliance - Automatic PII detection for EU compliance
  • TypeScript - Full TypeScript support included

🌟 The SafeguardAI Advantage

What sets SafeguardAI apart from standard moderation tools:

  • All-in-One Safety Stack: Why use three different libraries for PII, Moderation, and Custom Rules? SafeguardAI unifies them into a single, high-performance toolkit.
  • Hybrid Processing: Combines cutting-edge AI (for context-aware moderation) with optimized local patterns (for lightning-fast PII detection).
  • Privacy-First Design: PII detection and redaction happen locally when possible, ensuring sensitive data never reaches external APIs unless you want it to.
  • Zero-Config Start: Get up and running in seconds with sensible defaults, then scale to complex enterprise requirements with custom rules.
  • Future-Proof: Built with a provider-agnostic architecture. Switch between OpenAI, Perspective, or Azure with minimal code changes.
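The provider-agnostic idea can be sketched in plain JavaScript. This is an illustrative stand-in, not SafeguardAI's internals: the provider names come from the README, but the stub implementations and the `moderate` helper are hypothetical.

```javascript
// Illustrative sketch of a provider-agnostic moderation interface.
// Each backend exposes the same call signature, so swapping providers
// is a one-word change. The stubs below are placeholders, not real API calls.
const providers = {
  openai: (text) => ({ provider: 'openai', safe: !/badword/.test(text) }),
  perspective: (text) => ({ provider: 'perspective', safe: !/badword/.test(text) }),
};

function moderate(text, providerName) {
  if (!providers[providerName]) {
    throw new Error(`Unknown provider: ${providerName}`);
  }
  return providers[providerName](text);
}

console.log(moderate('hello world', 'openai').safe);       // true
console.log(moderate('badword here', 'perspective').safe); // false
```

Because every backend returns the same result shape, calling code never needs to know which provider handled the request.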

💎 How It Helps You

SafeguardAI isn't just a library; it protects both your users and your business:

  • 🛡️ Protect Your Community: Automatically filter out toxic, hateful, or violent content in real-time, creating a safer space for your users.
  • 🔒 Ensure Legal Compliance: Effortlessly meet GDPR, HIPAA, and PCI-DSS requirements by catching sensitive personal data before it's stored or leaked.
  • 📉 Reduce Operational Costs: Minimize the need for expensive human moderation teams by automating routine content checks.
  • 🤝 Build User Trust: Show your users that you take their safety and privacy seriously by implementing transparent content safeguards.
  • 🚀 Accelerate AI Development: Focus on building your core AI features while we handle the complex logic of content safety and redaction.

Check Text Safety

```javascript
const result = await moderator.checkText("Your text here");
/* Result:
{
  safe: false,
  flagged: true,
  categories: {
    toxicity: { detected: true, score: 0.89, severity: 'high' }
  },
  piiDetected: [
    { type: 'email', value: 'user@example.com', position: [45, 60] }
  ],
  cleanText: "Your text with [REDACTED] instead of PII",
  suggestions: ["Content flagged by OpenAI moderation.", "PII detected in text."]
}
*/
```
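The `cleanText` field replaces detected PII with a placeholder. Here is a minimal standalone sketch of that kind of redaction; the regex and `redact` helper below are illustrative, not the library's own detector.

```javascript
// Illustrative email redaction, similar in spirit to SafeguardAI's cleanText.
// The pattern is a simplified example, not the library's real detector.
const EMAIL_PATTERN = /[\w.+-]+@[\w-]+\.[\w.]+/g;

function redact(text, pattern, placeholder = '[REDACTED]') {
  return text.replace(pattern, placeholder);
}

console.log(redact('Contact me at user@example.com for details', EMAIL_PATTERN));
// 'Contact me at [REDACTED] for details'
```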

Configure Provider

```javascript
const moderator = new SafeguardAI({
  apiKey: 'your-openai-api-key',
  providers: ['openai'],
  strictness: 'medium'
});
```

Add Custom Rules

```javascript
moderator.rules.addBlockedWords(['badword1', 'badword2']);
moderator.rules.addPattern(/\d{3}-\d{2}-\d{4}/g, 'SSN');
```
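To see what that SSN pattern matches, here it is applied standalone with plain `String.prototype.replace` (how the library applies registered patterns internally is an assumption; this only demonstrates the regex itself):

```javascript
// The same SSN pattern as above, applied directly to sample text.
// The global flag makes replace() substitute every match.
const ssnPattern = /\d{3}-\d{2}-\d{4}/g;

const sample = 'SSN on file: 123-45-6789, backup: 987-65-4321';
console.log(sample.replace(ssnPattern, '[SSN]'));
// 'SSN on file: [SSN], backup: [SSN]'
```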

📊 Use Cases

  • AI Chatbots - Filter user messages before processing
  • Content Platforms - Moderate user-generated content
  • Customer Support - Detect PII in support tickets
  • Healthcare Apps - HIPAA compliance for medical data
  • Financial Apps - PCI-DSS compliance for payment info
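The chatbot use case typically gates each message before it reaches the model. Below is a minimal sketch with a local stand-in for `checkText`; in a real app the call would go to the moderator instance shown earlier, and the blocklist check here is purely illustrative.

```javascript
// Stand-in for moderator.checkText(), for illustration only:
// flags messages containing a blocked word.
async function checkText(text) {
  const flagged = /\bspam\b/i.test(text);
  return { safe: !flagged, categories: flagged ? { spam: { detected: true } } : {} };
}

// Gate each user message before passing it to the model.
async function handleUserMessage(text) {
  const result = await checkText(text);
  if (!result.safe) {
    return 'Message blocked by moderation.';
  }
  return `Processing: ${text}`;
}

handleUserMessage('buy spam now').then(console.log); // 'Message blocked by moderation.'
handleUserMessage('hello there').then(console.log);  // 'Processing: hello there'
```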

🔥 Why SafeguardAI?

| Feature | SafeguardAI | Competitors |
|---------|-------------|-------------|
| Multi-provider support | ✅ | ❌ |
| PII Detection | ✅ | Limited |
| Custom rules | ✅ | ❌ |
| TypeScript | ✅ | ✅ |

🤝 Contributing

Contributions are welcome! Please read our Contributing Guide for details.

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.