
@ctrlpnl/node

v0.2.3

Official Node.js SDK for Ctrlpnl - AI Pipeline Protection.

Protect your AI applications with PII redaction, secret removal, prompt injection blocking, and content filtering.

Installation

npm install @ctrlpnl/node
# or
pnpm add @ctrlpnl/node
# or
yarn add @ctrlpnl/node

Quick Start

import Ctrlpnl from "@ctrlpnl/node";
// Assumes an initialized OpenAI client, e.g. `const openai = new OpenAI()`,
// and that this code runs inside your request handler.

const ctrlpnl = new Ctrlpnl({
  apiToken: process.env.CTRLPNL_API_TOKEN,
});

// Before calling your AI
const input = await ctrlpnl.transform(userPrompt);

if (input.blocked) {
  return "Sorry, I can't help with that.";
}

// Call your AI with the safe prompt
const aiResponse = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: input.prompt }],
});

// After getting the response
const output = await ctrlpnl.complete(
  input.traceId,
  aiResponse.choices[0].message.content
);

return output.response;

Features

  • PII Redaction: Automatically detect and redact emails, phone numbers, SSNs, credit cards
  • Secret Detection: Remove API keys, tokens, and passwords before they reach your AI
  • Prompt Injection Blocking: Detect and block prompt injection attempts
  • Content Filtering: Custom rules to block or modify unwanted content
  • Policy Inheritance: Global policies that apply across all your pipelines
  • Full Audit Trail: Every transformation is traced for compliance
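To make the redaction features concrete, here is a minimal, self-contained sketch of the kind of substitution an input pipeline performs. It is illustrative only: the patterns and the `[REDACTED]` placeholder are assumptions for this example, not the SDK's actual detection rules (which run server-side).

```javascript
// Illustrative only: a toy version of PII/secret redaction.
// The real pipeline's detectors and placeholder format may differ.
const PATTERNS = [
  { name: "email", re: /[\w.+-]+@[\w-]+\.[\w.]+/g },
  { name: "ssn", re: /\b\d{3}-\d{2}-\d{4}\b/g },
  { name: "api_key", re: /\bcp_live_[A-Za-z0-9]+\b/g },
];

function redact(text) {
  let out = text;
  for (const { re } of PATTERNS) {
    out = out.replace(re, "[REDACTED]");
  }
  return out;
}

console.log(redact("My email is test@example.com and my SSN is 123-45-6789"));
// → "My email is [REDACTED] and my SSN is [REDACTED]"
```

In practice you would not replicate this client-side; `ctrlpnl.transform()` applies the policy's detectors and returns the sanitized prompt.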

API Reference

new Ctrlpnl(config)

Create a new Ctrlpnl client.

const ctrlpnl = new Ctrlpnl({
  // Required: Your API token from https://ctrlpnl.ai/settings/api-keys
  apiToken: "cp_live_...",

  // Optional: Request timeout in ms (default: 30000)
  timeout: 30000,

  // Optional: Default policy ID for all requests
  defaultPolicyId: "my-policy",

  // Optional: Enable debug logging
  debug: false,
});

ctrlpnl.transform(prompt, options?)

Transform a prompt through your input pipeline before sending to AI.

const result = await ctrlpnl.transform("My email is jane@example.com", {
  // Optional: Override the default policy
  policyId: "strict-policy",

  // Optional: Context for policy evaluation
  context: {
    userId: "user_123",
    sessionId: "session_456",
    environment: "production",
    metadata: { role: "admin" },
  },
});

console.log(result.prompt); // "My email is [REDACTED]"
console.log(result.blocked); // false
console.log(result.traceId); // "abc-123-..."
console.log(result.appliedSteps); // [{ stepName: "Redact PII", applied: true, ... }]

ctrlpnl.complete(traceId, response, options?)

Complete a trace by processing the AI response through your output pipeline.

const output = await ctrlpnl.complete(input.traceId, aiResponse, {
  // Optional: AI metadata for observability
  aiMetadata: {
    model: "gpt-4",
    provider: "openai",
    tokenCount: 150,
    latencyMs: 1200,
  },
});

console.log(output.response); // Safe response to return to user
console.log(output.blocked); // false
console.log(output.appliedSteps); // Output pipeline steps that ran

Utilities

wrap()

Wrap an AI call with full pipeline protection in one function.

import Ctrlpnl, { wrap } from "@ctrlpnl/node";

const ctrlpnl = new Ctrlpnl();

const result = await wrap(
  ctrlpnl,
  userPrompt,
  async (safePrompt) => {
    const response = await openai.chat.completions.create({
      model: "gpt-4",
      messages: [{ role: "user", content: safePrompt }],
    });
    return response.choices[0].message.content;
  },
  {
    context: { userId: "user_123" },
    onBlocked: (info) => {
      console.log(`Blocked at ${info.phase}: ${info.reason}`);
      return "Sorry, I can't help with that.";
    },
  }
);

console.log(result.value); // The safe AI response (or onBlocked return value)
console.log(result.blocked); // Whether it was blocked
console.log(result.blockedAt); // "input" | "output" | undefined

prepareStream()

Prepare for streaming AI responses.

import Ctrlpnl, { prepareStream } from "@ctrlpnl/node";

const ctrlpnl = new Ctrlpnl();

const { safePrompt, traceId, complete } = await prepareStream(
  ctrlpnl,
  userPrompt
);

// Start streaming
const stream = await openai.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: safePrompt }],
  stream: true,
});

// Collect and stream to user
let fullResponse = "";
for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || "";
  fullResponse += content;
  process.stdout.write(content);
}

// Complete the trace
const output = await complete(fullResponse, {
  model: "gpt-4",
  provider: "openai",
});

// Check if output was blocked
if (output.blocked) {
  console.log("Response was filtered:", output.blockReason);
}

batchTransform()

Transform multiple prompts in parallel.

import Ctrlpnl, { batchTransform } from "@ctrlpnl/node";

const ctrlpnl = new Ctrlpnl();

const { results, errors } = await batchTransform(
  ctrlpnl,
  ["prompt 1", "prompt 2", "prompt 3"],
  { context: { userId: "batch_user" } },
  5 // concurrency
);

console.log(`Transformed ${results.length} prompts`);
console.log(`Failed: ${errors.length}`);

Error Handling

import Ctrlpnl, {
  BlockedError,
  AuthenticationError,
  RateLimitError,
} from "@ctrlpnl/node";

try {
  const result = await ctrlpnl.transform(prompt);
} catch (error) {
  if (error instanceof BlockedError) {
    console.log("Request blocked:", error.message);
    console.log("Trace ID:", error.traceId);
    console.log("Blocking step:", error.blockingStep);
  } else if (error instanceof AuthenticationError) {
    console.log("Invalid API token");
  } else if (error instanceof RateLimitError) {
    console.log("Rate limited, retry after:", error.retryAfter);
  }
}
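A `RateLimitError`'s `retryAfter` value can drive a simple backoff loop. The helper below is a generic sketch, not part of the SDK; it assumes `retryAfter` is a delay in seconds, which is worth verifying against the error you actually receive.

```javascript
// Generic retry helper: re-runs `fn` when it throws an error carrying a
// `retryAfter` hint (assumed to be seconds), up to `maxAttempts` tries.
async function withRetry(fn, maxAttempts = 3) {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      // Rethrow immediately if there is no retry hint or we are out of tries.
      if (error.retryAfter == null || attempt >= maxAttempts) throw error;
      await new Promise((r) => setTimeout(r, error.retryAfter * 1000));
    }
  }
}

// Usage sketch:
// const result = await withRetry(() => ctrlpnl.transform(prompt));
```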

Environment Variables

The SDK reads these environment variables:

  • CTRLPNL_API_TOKEN - Your API token (can also be passed to constructor)
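A plausible resolution order — an explicitly passed `apiToken` first, then the environment variable — can be sketched as follows. The precedence shown here is an assumption for illustration, not a statement of the SDK's exact behavior.

```javascript
// Sketch of token lookup: an explicit config value wins, otherwise fall
// back to CTRLPNL_API_TOKEN. Precedence is assumed, not SDK-documented.
function resolveApiToken(config = {}) {
  const token = config.apiToken ?? process.env.CTRLPNL_API_TOKEN;
  if (!token) {
    throw new Error(
      "Missing API token: pass `apiToken` or set CTRLPNL_API_TOKEN"
    );
  }
  return token;
}
```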

TypeScript

Full TypeScript support with exported types:

import type {
  CtrlpnlConfig,
  TransformResult,
  CompleteResult,
  AppliedStepInfo,
  PipelineContext,
} from "@ctrlpnl/node";

License

MIT