# hallucination-validator

AI Output Validator for Security & Fact-Checking
hallucination-validator is a library for validating Large Language Model (LLM) outputs. It guards against common AI risks such as link hallucination (linkrot and domain hijacking), dangerous code generation, and context fabrication.

## Why use this?
LLMs are confident but often incorrect. Security risks arise when:
- Hallucinated URLs point to non-existent domains that can be hijacked by attackers.
- Generated code contains dangerous patterns like `eval()` or `exec()`.
- Fabricated quotes mislead users by misrepresenting source material.
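As an illustration of the second risk, a pattern-based scan for dangerous calls can be sketched with plain regular expressions. The `findUnsafePatterns` helper below is hypothetical and not part of this package's API:

```javascript
// Hypothetical sketch of regex-based unsafe-pattern detection;
// not the actual implementation used by hallucination-validator.
const UNSAFE_PATTERNS = [
  { name: 'eval()', regex: /\beval\s*\(/ },
  { name: 'exec()', regex: /\bexec\s*\(/ },
  { name: 'child_process', regex: /\bchild_process\b/ },
  { name: 'document.write', regex: /\bdocument\.write\b/ },
];

function findUnsafePatterns(code) {
  // Return the name of every pattern that matches the code string
  return UNSAFE_PATTERNS
    .filter(({ regex }) => regex.test(code))
    .map(({ name }) => name);
}

console.log(findUnsafePatterns('function run() { eval(input); }'));
// ['eval()']
```

Regex matching is a coarse filter: it cannot follow aliased calls like `const e = eval`, so it trades completeness for simplicity.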
## Features
- Link Integrity: Extracts URLs and performs async HEAD requests to verify `200 OK` status.
- Code Safety: Scans generated code for unsafe Node.js patterns (`eval`, `child_process`, `document.write`).
- Fuzzy Quote Verification: Checks whether a quoted string exists in the source text, tolerant of minor AI alterations or typos.
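The fuzzy quote check can be understood as a normalized edit-distance comparison against a sliding window of the source text. A minimal sketch, using a hand-rolled Levenshtein function for self-containment; the helper names and the 0.8 similarity threshold are illustrative assumptions, not this package's internals:

```javascript
// Illustrative sketch of fuzzy quote matching via edit distance;
// threshold and helper names are assumptions, not this package's API.
function levenshtein(a, b) {
  // Classic dynamic-programming edit distance
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                   // deletion
        dp[i][j - 1] + 1,                                   // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)  // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// A quote "matches" if it is within ~20% edits of some
// same-length window of the source text.
function quoteMatches(quote, source, threshold = 0.8) {
  for (let i = 0; i + quote.length <= source.length; i++) {
    const window = source.slice(i, i + quote.length);
    const similarity = 1 - levenshtein(quote, window) / quote.length;
    if (similarity >= threshold) return true;
  }
  return false;
}

console.log(quoteMatches('quick brown fox jumps',
  'The quick brown fox jumps over the lazy dog.')); // true
```

The sliding window keeps the comparison tolerant of typos while still rejecting quotes that substantially diverge from the source.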
## Installation

```bash
npm install hallucination-validator
```

## Usage
```javascript
import HallucinationValidator from 'hallucination-validator';

const validator = new HallucinationValidator();

// 1. Validate Links
const text = "Check out valid.com and broken.link/404";
validator.validateLinks(text).then(broken => {
  console.log('Broken Links:', broken);
  // [{ url: 'http://broken.link/404', status: 404, ... }]
});

// 2. Scan Code
const script = "function run() { eval(input); }";
const risks = validator.scanCodeSafety(script);
console.log('Risks:', risks);
// ['eval()']

// 3. Verify Quotes
const source = "The quick brown fox jumps over the lazy dog.";
const aiQuote = "The quick brown fox jumped over a lazy cat.";
const isValid = validator.verifyQuote(aiQuote, source);
console.log('Quote Valid:', isValid);
// false
```

## Dependencies
- fast-levenshtein: For robust string comparison algorithms.
- Node.js 18+: Requires the native `fetch` API.
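A HEAD-request link check of the kind described above can be done with Node 18's built-in `fetch`. A minimal sketch; the `checkLink` helper is illustrative, not part of this package's API:

```javascript
// Illustrative sketch of a HEAD-request link check with Node 18+ native fetch;
// checkLink is a hypothetical helper, not part of this package's API.
async function checkLink(url, timeoutMs = 5000) {
  try {
    const res = await fetch(url, {
      method: 'HEAD',
      signal: AbortSignal.timeout(timeoutMs), // abort slow or hanging hosts
      redirect: 'follow',
    });
    return { url, ok: res.ok, status: res.status };
  } catch (err) {
    // DNS failure, timeout, or network error: treat the link as broken
    return { url, ok: false, status: null, error: err.message };
  }
}

checkLink('https://example.com').then(result => console.log(result));
```

HEAD requests avoid downloading response bodies, so large batches of URLs can be verified cheaply; note that some servers reject HEAD, so a production checker may need a GET fallback.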
## License
MIT © Godfrey Lebo
