@aion0/bastion-plugin-api v0.1.4
Pro plugin pack for Bastion AI Gateway
@aion0/bastion-plugin-api
Optional plugin pack for Bastion AI Gateway — adds ML-based prompt injection detection and advanced security features.
Free to use. No license required. This is an optional install that adds heavier dependencies (ONNX Runtime, ML models) to the core gateway.
Features
ML Prompt Injection Classifier
Local ONNX inference using ProtectAI/deberta-v3-base-prompt-injection-v2. Zero cloud dependency — models run entirely on your machine.
- Scans user messages (including tool_result content blocks) before they reach the LLM
- Configurable threshold (default: 0.8) and action (warn/block)
- Emits pi:detected events for Tool Guard escalation
- Supports Anthropic and OpenAI message formats
- Model files auto-downloaded from Hugging Face Hub on first start (~436 MB)
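The threshold/action behavior described above can be sketched as follows. This is an illustrative model only, assuming hypothetical names (evaluate, PiVerdict), not the plugin's actual API:

```typescript
type PiAction = "warn" | "block";

interface PiVerdict {
  score: number;      // classifier probability that the text is an injection
  detected: boolean;  // true when score >= threshold
  action: PiAction | "pass";
}

// Apply the configured threshold and action to a classifier score.
// Defaults mirror the documented config: threshold 0.8, action "warn".
function evaluate(
  score: number,
  threshold = 0.8,
  action: PiAction = "warn"
): PiVerdict {
  const detected = score >= threshold;
  return { score, detected, action: detected ? action : "pass" };
}

console.log(evaluate(0.93)); // detected, so the configured action ("warn") applies
console.log(evaluate(0.42)); // below threshold, so the request passes through
```

With action set to block, the same score that would only log a warning instead rejects the request before it reaches the LLM.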
Planned
- Content Extractor (PDF text extraction, image OCR)
- PI + Tool Guard linkage
Installation
Via CLI (recommended)
# Install from bastion source directory (auto-detects sibling bastion-plugin-api)
bastion plugins install
# Or specify path explicitly
bastion plugins install /path/to/bastion-plugin-api
Via install.sh
# From the bastion source directory
./install.sh -local -plugins
Manual
mkdir -p ~/.bastion/app/plugins/bastion-plugin-api
cp -r /path/to/bastion-plugin-api/* ~/.bastion/app/plugins/bastion-plugin-api/
cd ~/.bastion/app/plugins/bastion-plugin-api
npm install && npm run build
Then add to ~/.bastion/config.yaml:
plugins:
  external:
    - package: /Users/YOU/.bastion/app/plugins/bastion-plugin-api
      enabled: true
Uninstall
# CLI (prompts to also remove downloaded models)
bastion plugins uninstall
# Or remove everything including models
bastion plugins uninstall --all
You can also uninstall from the Dashboard → Settings → Optional Features.
Configuration
Plugin config in ~/.bastion/config.yaml:
plugins:
  external:
    - package: /path/to/bastion-plugin-api
      enabled: true
      config:
        piClassifier:
          modelId: protectai/deberta-v3-base-prompt-injection-v2
          threshold: 0.8
          action: warn       # warn | block
          scanSystem: false  # also scan system messages
Architecture
src/
├── index.ts # Plugin registration (no license gating)
├── classifier/
│ ├── model-manager.ts # HF Hub model download + caching
│ └── onnx-provider.ts # ONNX Runtime inference
└── plugins/
    └── pi-classifier.ts  # Prompt injection detection plugin
Model Storage
Models are cached at ~/.bastion/models/{model-id}/ with files:
- model.onnx — ONNX model (~400 MB for DeBERTa v3 base)
- tokenizer.json — Hugging Face tokenizer
- config.json — model configuration (label mapping)
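A minimal sketch of resolving that cache layout. The helper name (modelCacheDir) is illustrative, and it assumes the "/" in a Hugging Face model id maps to a nested directory on disk:

```typescript
import { homedir } from "node:os";
import { join } from "node:path";

// Resolve the cache directory for a model id under ~/.bastion/models/.
// Assumption: the org/name form of an HF id becomes nested directories.
function modelCacheDir(modelId: string): string {
  return join(homedir(), ".bastion", "models", modelId);
}

const dir = modelCacheDir("protectai/deberta-v3-base-prompt-injection-v2");
for (const file of ["model.onnx", "tokenizer.json", "config.json"]) {
  console.log(join(dir, file));
}
```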
Plugin Lifecycle
- register(config) → creates plugins unconditionally
- onInit() → downloads model if needed → initializes ONNX session
- onRequest() → extracts user texts → classifies → returns warn/block result
- onDestroy() → releases ONNX session
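The lifecycle above can be sketched as a minimal skeleton. The hook names come from this section; the BastionPlugin shape and the stub scorer are assumptions, not the gateway's real interface, and a real onInit would download the model and create an ONNX Runtime session instead:

```typescript
interface BastionPlugin {
  onInit(): Promise<void>;
  onRequest(userTexts: string[]): Promise<{ action: "pass" | "warn" | "block" }>;
  onDestroy(): Promise<void>;
}

// register() builds the plugin unconditionally; heavy work waits for onInit().
function register(config: { threshold: number; action: "warn" | "block" }): BastionPlugin {
  let session: { run(text: string): number } | null = null;
  return {
    async onInit() {
      // Stub scorer standing in for real ONNX inference.
      session = { run: (text) => (text.includes("ignore previous") ? 0.95 : 0.05) };
    },
    async onRequest(userTexts) {
      if (!session) throw new Error("plugin not initialized");
      const maxScore = Math.max(...userTexts.map((t) => session!.run(t)));
      return { action: maxScore >= config.threshold ? config.action : "pass" };
    },
    async onDestroy() {
      session = null; // real plugin releases the ONNX session here
    },
  };
}
```

Deferring model download and session creation to onInit() keeps register() cheap, which matches the "creates plugins unconditionally" step: the gateway can enumerate plugins without paying the ~436 MB download cost up front.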
Development
npm run build
npm test
Requirements
- Node.js >= 20.0.0
- Bastion AI Gateway >= 0.1.10
- @aion0/bastion-plugin-api >= 0.1.2
License
MIT
