@threatvectorsecurity/contextcypher
v1.9.6
ContextCypher
AI-Powered Threat Modeling and Security Analysis Platform
Overview
ContextCypher is a privacy-focused threat modeling platform that helps security professionals and developers create architecture diagrams and generate AI-powered security analysis. It is offline-first with local Ollama support, and can also connect to cloud AI providers.
Key Features
Free version
- Full diagram creation and editing
- Node types and security zones
- Chat-based diagram analysis
- Save/load diagrams (JSON)
- Local Ollama offline analysis
- OpenAI/Anthropic/Gemini provider support
- 3D diagram visualization
- Comprehensive GRC (governance, risk, and compliance) module with integrated diagram attack paths
Pro version
- AI diagram generation from text
- Advanced threat analysis with MITRE ATT&CK
- Premium themes and effects
- Automatic data sanitization
- Compliance logging and audit trails
- Full conversation history
Installation
Prerequisites
- Node.js 18.x or later
- npm 10.8.2 or later
Install via npm
# Install globally
npm install -g @threatvectorsecurity/contextcypher
# Or with yarn
yarn global add @threatvectorsecurity/contextcypher
Usage
Start the application
contextcypher
The app starts the backend server, then opens your browser automatically (unless started with the --no-browser option).
First-time setup
- Open Settings.
- Choose AI provider:
- Ollama (recommended for offline use)
- OpenAI (API key required)
- Anthropic Claude (API key required)
- Google Gemini (API key required)
Install Ollama (offline AI)
# macOS/Linux
curl -fsSL https://ollama.com/install.sh | sh
# Pull a model
ollama pull llama3.2
Windows installer: https://ollama.com/download
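Before pointing ContextCypher at Ollama, it can help to confirm the local install actually answers. A sketch using Ollama's standard /api/tags model-listing endpoint (the status strings here are illustrative, not output from ContextCypher itself):

```shell
# Quick check that Ollama is installed and its API is reachable.
if ! command -v ollama >/dev/null 2>&1; then
  OLLAMA_STATUS="not installed"
elif curl -fsS http://localhost:11434/api/tags >/dev/null 2>&1; then
  OLLAMA_STATUS="running"
else
  OLLAMA_STATUS="installed but not running (try: ollama serve)"
fi
echo "Ollama: $OLLAMA_STATUS"
```

If the API is up, the same /api/tags call returns JSON listing the models you have pulled.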
CLI options
# Start without opening browser
contextcypher --no-browser
# Custom port (macOS/Linux)
PORT=3003 contextcypher
# Custom port (Windows PowerShell)
$env:PORT='3003'; contextcypher
System Requirements
Minimum
- OS: Windows 10+, macOS 10.15+, Ubuntu 20.04+
- RAM: 4GB (8GB recommended)
- Storage: 1GB free
- Browser: Chrome 95+, Firefox 91+, Safari 15.4+, Edge 95+
Local AI (Ollama)
- Additional RAM: 8-16GB
- Additional storage: 10-20GB for models
- GPU: 8GB+ VRAM recommended
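To see whether a machine meets these numbers, total RAM can be queried from the shell. A sketch covering Linux (free) and macOS (sysctl); other platforms will need their own tools:

```shell
# Report total system RAM in GB to compare against the recommendation above.
if command -v free >/dev/null 2>&1; then
  # Linux: second field of the "Mem:" row from free -g
  TOTAL_GB=$(free -g | awk '/^Mem:/ {print $2}')
elif command -v sysctl >/dev/null 2>&1; then
  # macOS: hw.memsize reports total memory in bytes
  TOTAL_GB=$(sysctl -n hw.memsize | awk '{printf "%d", $1 / 1073741824}')
else
  TOTAL_GB="unknown"
fi
echo "Total RAM: ${TOTAL_GB} GB"
```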
Troubleshooting
Port already in use
# macOS/Linux
lsof -i :3001
# Windows
netstat -ano | findstr :3001
Then stop the process or run on a different port.
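Once the PID is known, stopping it can be scripted. A macOS/Linux sketch assuming lsof is available (on Windows, use taskkill with the PID reported by netstat):

```shell
# Stop whatever is listening on port 3001 (macOS/Linux; assumes lsof).
PID=$(lsof -ti :3001 2>/dev/null || true)
if [ -n "$PID" ]; then
  kill $PID            # unquoted on purpose: lsof may return several PIDs
  PORT_MSG="stopped PID(s): $PID"
else
  PORT_MSG="port 3001 is already free"
fi
echo "$PORT_MSG"
# Or leave the process alone and run on a different port instead:
# PORT=3003 contextcypher
```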
Ollama connection failed
# Ensure Ollama is running
ollama serve
# Check API
curl http://localhost:11434/api/tags
Permission denied (macOS/Linux)
mkdir ~/.npm-global
npm config set prefix '~/.npm-global'
echo 'export PATH=~/.npm-global/bin:$PATH' >> ~/.bashrc
source ~/.bashrc
Logs
When running in production mode, logs are written to:
- Windows: %LOCALAPPDATA%\ContextCypher\logs\
- macOS: ~/Library/Application Support/ContextCypher/logs/
- Linux: ~/.local/share/ContextCypher/logs/
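If you script log collection, the per-platform paths above can be resolved at runtime. A sketch (directory names taken from this list; the *.log filename pattern is an assumption):

```shell
# Resolve the ContextCypher log directory for the current platform.
case "$(uname -s)" in
  Darwin) LOG_DIR="$HOME/Library/Application Support/ContextCypher/logs" ;;
  Linux)  LOG_DIR="$HOME/.local/share/ContextCypher/logs" ;;
  *)      LOG_DIR="$LOCALAPPDATA/ContextCypher/logs" ;;  # Windows shells (e.g. Git Bash)
esac
echo "Log directory: $LOG_DIR"
# tail -n 50 "$LOG_DIR"/*.log   # inspect recent entries (filename pattern assumed)
```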
Updates
# Update to latest
npm update -g @threatvectorsecurity/contextcypher
# Reinstall
npm uninstall -g @threatvectorsecurity/contextcypher
npm install -g @threatvectorsecurity/contextcypher
Support
- Discord: https://discord.com/invite/Ve7gbf2ytc
- Website: https://threatvectorsecurity.com
- Documentation: https://threatvectorsecurity.com
- Support and issues: https://threatvectorsecurity.com/contact/
License
ContextCypher is proprietary software. By using this software, you agree to the applicable Terms of Service.
Security
For security issues, use the contact form and mark the report as a security issue: https://threatvectorsecurity.com/contact/
Legal Notice
No warranty. This software is provided "AS IS" without warranty of any kind. AI-generated content must be validated by qualified security professionals.
Keywords
threat-modeling, security, ai, diagram, architecture, ContextCypher
