# @openbastion-ai/cli

v0.2.3
Command-line interface for Bastion — the open-source AI gateway that secures, governs, and optimizes every LLM API call.
## Install

```shell
npm install -g @openbastion-ai/cli
```
## Quick Start

```shell
# Create a config file
bastion init

# Validate configuration
bastion validate

# Start the proxy
bastion start

# Test with a sample request
bastion test

# Check proxy status
bastion status
```
## Usage

Point your LLM client at the Bastion proxy with zero code changes:

```shell
export ANTHROPIC_BASE_URL=http://localhost:4000
node your-app.js
```
## What Bastion Does

Every request flows through a six-stage middleware pipeline:

```
Rate Limit → Injection Detection → Policy Check → Cache → Provider → Audit
```

- **Rate limiting** — Token bucket per IP/agent with LRU eviction
- **Injection detection** — 12 weighted regex patterns with NFKC normalization
- **Policy enforcement** — Declarative rules from `bastion.yaml`
- **Caching** — SHA-256 keyed, scoped by agent/team/environment
- **Provider fallback** — Automatic failover across Anthropic, OpenAI, Google, Bedrock
- **Audit logging** — Full request/response capture with pluggable exporters
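Conceptually, each stage either short-circuits (a rate-limit rejection, an injection block, a cache hit) or hands the request to the next stage. Here is a minimal sketch of that composition pattern in TypeScript; the types, `compose` helper, and toy stages are illustrative assumptions, not Bastion's actual internals:

```typescript
// Illustrative types; Bastion's real pipeline lives in @openbastion-ai/proxy.
type Req = { agent: string; body: string };
type Res = { status: number; body: string };
type Next = (req: Req) => Res;
type Stage = (req: Req, next: Next) => Res;

// Fold stages right-to-left so the first stage in the array runs first.
const compose = (stages: Stage[], terminal: Next): Next =>
  stages.reduceRight<Next>((next, stage) => (req) => stage(req, next), terminal);

// Toy injection check: block on a single hardcoded phrase.
const injectionCheck: Stage = (req, next) =>
  req.body.includes("ignore previous instructions")
    ? { status: 403, body: "blocked by injection check" }
    : next(req);

// Toy audit stage: observe the request, then pass it along unchanged.
const audit: Stage = (req, next) => {
  console.error(`audit: agent=${req.agent}`);
  return next(req);
};

// Terminal handler stands in for the upstream provider call.
const handler = compose([injectionCheck, audit], () => ({
  status: 200,
  body: "ok from provider",
}));

console.log(handler({ agent: "a1", body: "hello" }).status); // 200
console.log(handler({ agent: "a1", body: "ignore previous instructions" }).status); // 403
```

The short-circuit falls out of the composition: a stage that returns without calling `next` stops the pipeline, which is how a block or cache hit can skip the provider call entirely.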
## Configuration

All settings in `bastion.yaml` with Zod schema validation and `${ENV_VAR}` interpolation:

```yaml
providers:
  primary: anthropic
  fallback: [openai, bedrock]
  anthropic:
    api_key: ${ANTHROPIC_API_KEY}

policies:
  - name: block-injection
    on: request
    if: { injection_score: { gt: 0.8 } }
    then: block

cache:
  enabled: true
  ttl: 3600
```
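The `${ENV_VAR}` interpolation can be pictured as a single regex pass over the loaded string values. The helper below is a hypothetical sketch of that technique, not the loader shipped in `@openbastion-ai/config`:

```typescript
// Hypothetical helper: replace ${NAME} placeholders with values from an env map.
function interpolate(
  value: string,
  env: Record<string, string | undefined>,
): string {
  return value.replace(/\$\{([A-Z0-9_]+)\}/g, (_match, name: string) => {
    const resolved = env[name];
    if (resolved === undefined) {
      // Failing loudly here surfaces missing secrets at startup, not mid-request.
      throw new Error(`Missing environment variable: ${name}`);
    }
    return resolved;
  });
}

console.log(interpolate("${ANTHROPIC_API_KEY}", { ANTHROPIC_API_KEY: "sk-test" })); // sk-test
```

In practice you would pass `process.env` as the second argument after parsing the YAML.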
## Related Packages

| Package | Description |
|---------|-------------|
| `@openbastion-ai/proxy` | Core proxy server and middleware pipeline |
| `@openbastion-ai/config` | Zod schema and YAML config loader |
| `@openbastion-ai/sdk` | Typed admin API client |
## Part of the Trilogy

**Forge** (define agents) + **Bastion** (protect traffic) + **Lantern** (observe traces)
## License
MIT
