@vantagesec/socc
v0.1.17
Security operations copilot for threat intelligence, incident response, and agentic investigation
SOCC
SOCC is an open-source security operations copilot.
It is designed for SOC workflows first: threat intelligence, suspicious artifact triage, investigation support, and incident response. The current runtime keeps the existing multi-provider agentic foundation so the same terminal-first workflow can still use prompts, tools, agents, MCP, slash commands, and streaming output when those capabilities help the analyst.
Quick Start | Setup Guides | Providers | Source Build | VS Code Extension | Community
Why SOCC
- Start from a security-first CLI for SOC, threat intel, and incident response work
- Use one runtime across cloud APIs and local model backends
- Keep agentic workflows available for investigation support: prompts, tools, agents, tasks, MCP, and streaming output
- Work with OpenAI-compatible services, Gemini, GitHub Models, Codex, Ollama, Atomic Chat, and other supported providers
- Preserve the productivity of the current terminal workflow while shifting the product identity toward security operations
Quick Start
Install
```
npm install -g @vantagesec/socc
```
If the install later reports ripgrep not found, install ripgrep system-wide and confirm `rg --version` works in the same terminal before starting SOCC.
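If you want to confirm the ripgrep prerequisite up front, a quick check like the following can help (a sketch; the install hints in the message are generic examples, not SOCC-specific instructions):

```shell
# Check that ripgrep is on PATH before launching SOCC.
if command -v rg >/dev/null 2>&1; then
  rg --version | head -n 1
else
  echo "ripgrep not found; install it first (e.g. 'brew install ripgrep' or 'apt install ripgrep')"
fi
```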
Start
```
socc
```
Inside SOCC:
- run `/provider` for guided provider setup and saved profiles
- run `/onboard-github` for GitHub Models onboarding
- start with a payload, alert, URL, log excerpt, or investigative question when using SOCC as a security analyst copilot
Fastest OpenAI setup
macOS / Linux:
```
export SOCC_USE_OPENAI=1
export OPENAI_API_KEY=sk-your-key-here
export OPENAI_MODEL=gpt-4o
socc
```
Windows PowerShell:
```
$env:SOCC_USE_OPENAI="1"
$env:OPENAI_API_KEY="sk-your-key-here"
$env:OPENAI_MODEL="gpt-4o"
socc
```
Fastest local Ollama setup
macOS / Linux:
```
export SOCC_USE_OPENAI=1
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=qwen2.5-coder:7b
socc
```
Windows PowerShell:
```
$env:SOCC_USE_OPENAI="1"
$env:OPENAI_BASE_URL="http://localhost:11434/v1"
$env:OPENAI_MODEL="qwen2.5-coder:7b"
socc
```
Setup Guides
Beginner-friendly guides:
Advanced and source-build guides:
Supported Providers
| Provider | Setup Path | Notes |
| --- | --- | --- |
| OpenAI-compatible | /provider or env vars | Works with OpenAI, OpenRouter, DeepSeek, Groq, Mistral, LM Studio, and other compatible /v1 servers |
| Gemini | /provider or env vars | Supports API key, access token, or local ADC workflow on current main |
| GitHub Models | /onboard-github | Interactive onboarding with saved credentials |
| Codex | /provider | Uses existing Codex credentials when available |
| Ollama | /provider or env vars | Local inference with no API key |
| Atomic Chat | advanced setup | Local Apple Silicon backend |
| Bedrock / Vertex / Foundry | env vars | Additional provider integrations for supported environments |
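As one concrete example of the OpenAI-compatible row, an OpenRouter profile can reuse the same environment variables shown in the Quick Start, then start `socc` as usual. The base URL and model id below are assumptions for illustration; substitute your own provider details:

```shell
# Hypothetical OpenRouter profile via the OpenAI-compatible path.
export SOCC_USE_OPENAI=1
export OPENAI_BASE_URL=https://openrouter.ai/api/v1   # any /v1-compatible server
export OPENAI_API_KEY=sk-or-your-key-here             # placeholder key
export OPENAI_MODEL=deepseek/deepseek-chat            # provider-specific model id
```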
What Works
- Tool-driven coding workflows: Bash, file read/write/edit, grep, glob, agents, tasks, MCP, and slash commands
- Streaming responses: Real-time token output and tool progress
- Tool calling: Multi-step tool loops with model calls, tool execution, and follow-up responses
- Images: URL and base64 image inputs for providers that support vision
- Provider profiles: Guided setup plus persisted profile support (legacy filename still appears in some runtime paths)
- Local and remote model backends: Cloud APIs, local servers, and Apple Silicon local inference
Provider Notes
SOCC supports multiple providers, but behavior is not identical across all of them.
- Anthropic-specific features may not exist on other providers
- Tool quality depends heavily on the selected model
- Smaller local models can struggle with long multi-step tool flows
- Some providers impose lower output caps than the CLI defaults, and SOCC adapts where possible
For best results, use models with strong tool/function calling support.
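Before pointing SOCC at a local backend, it can also help to confirm the endpoint answers at all. A minimal sketch, assuming an OpenAI-compatible `/models` route (standard for such servers) and `curl` on PATH:

```shell
# Probe an OpenAI-compatible endpoint; always exits 0 and reports the result.
BASE_URL="${OPENAI_BASE_URL:-http://localhost:11434/v1}"
if curl -fsS --max-time 3 "$BASE_URL/models" >/dev/null 2>&1; then
  echo "endpoint reachable: $BASE_URL"
else
  echo "endpoint NOT reachable: $BASE_URL (is the server running?)"
fi
```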
Agent Routing
SOCC can route different agents to different models through settings-based routing. This is useful for cost optimization or splitting work by model strength.
Add to ~/.socc/settings.json:
```json
{
  "agentModels": {
    "deepseek-chat": {
      "base_url": "https://api.deepseek.com/v1",
      "api_key": "sk-your-key"
    },
    "gpt-4o": {
      "base_url": "https://api.openai.com/v1",
      "api_key": "sk-your-key"
    }
  },
  "agentRouting": {
    "Explore": "deepseek-chat",
    "Plan": "gpt-4o",
    "general-purpose": "gpt-4o",
    "frontend-dev": "deepseek-chat",
    "default": "gpt-4o"
  }
}
```
When no routing match is found, the global provider remains the fallback.
Note: `api_key` values in `settings.json` are stored in plaintext. Keep this file private and do not commit it to version control.
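Since the file holds plaintext keys, tightening its permissions is a reasonable extra precaution. A sketch using a temporary stand-in path (the real file is `~/.socc/settings.json`):

```shell
# Restrict a settings file to the current user only.
f=$(mktemp)                 # stand-in for ~/.socc/settings.json
chmod 600 "$f"
ls -l "$f" | cut -c1-10     # shows -rw-------
rm -f "$f"
```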
Web Search and Fetch
By default, WebSearch works on non-Anthropic models using DuckDuckGo. This gives GPT-4o, DeepSeek, Gemini, Ollama, and other OpenAI-compatible providers a free web search path out of the box.
Note: DuckDuckGo fallback works by scraping search results and may be rate-limited, blocked, or subject to DuckDuckGo's Terms of Service. If you want a more reliable supported option, configure Firecrawl.
For Anthropic-native backends and Codex responses, SOCC keeps the native provider web search behavior.
WebFetch works, but its basic HTTP plus HTML-to-markdown path can still fail on JavaScript-rendered sites or sites that block plain HTTP requests.
Set a Firecrawl API key if you want Firecrawl-powered search/fetch behavior:
```
export FIRECRAWL_API_KEY=your-key-here
```
With Firecrawl enabled:
- `WebSearch` can use Firecrawl's search API, while DuckDuckGo remains the default free path for non-Claude models
- `WebFetch` uses Firecrawl's scrape endpoint instead of raw HTTP, handling JS-rendered pages correctly
Free tier at firecrawl.dev includes 500 credits. The key is optional.
Headless gRPC Server
SOCC can be run as a headless gRPC service, allowing you to integrate its capabilities into other applications, CI/CD pipelines, or custom user interfaces. The server uses bidirectional streaming to send real-time text chunks, tool calls, and request permissions for sensitive commands.
1. Start the gRPC Server
Start the core engine as a gRPC service on localhost:50051:
```
npm run dev:grpc
```
Configuration
| Variable | Default | Description |
|-----------|-------------|------------------------------------------------|
| GRPC_PORT | 50051 | Port the gRPC server listens on |
| GRPC_HOST | localhost | Bind address. Use 0.0.0.0 to expose on all interfaces (not recommended without authentication) |
2. Run the Test CLI Client
We provide a lightweight CLI client that communicates exclusively over gRPC. It acts just like the main interactive CLI, rendering colors, streaming tokens, and prompting you for tool permissions (y/n) via the gRPC action_required event.
In a separate terminal, run:
```
npm run dev:grpc:cli
```
Note: The active gRPC definitions now live in `src/proto/socc.proto`.
Source Build And Local Development
```
bun install
bun run build
node dist/cli.mjs
```
Helpful commands:
- `bun run dev`
- `bun test`
- `bun run test:coverage`
- `bun run security:pr-scan -- --base origin/main`
- `bun run smoke`
- `bun run doctor:runtime`
- `bun run verify:privacy`
- focused `bun test ...` runs for the areas you touch
Testing And Coverage
SOCC uses Bun's built-in test runner for unit tests.
Run the full unit suite:
```
bun test
```
Generate unit test coverage:
```
bun run test:coverage
```
Open the visual coverage report:
```
open coverage/index.html
```
If you already have `coverage/lcov.info` and only want to rebuild the UI:
```
bun run test:coverage:ui
```
Use focused test runs when you only touch one area:
- `bun run test:provider`
- `bun run test:provider-recommendation`
- `bun test path/to/file.test.ts`
Recommended contributor validation before opening a PR:
- `bun run build`
- `bun run smoke`
- `bun run test:coverage` for broader unit coverage when your change affects shared runtime or provider logic
- focused `bun test ...` runs for the files and flows you changed
Coverage output is written to coverage/lcov.info, and SOCC also generates a git-activity-style heatmap at coverage/index.html.
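If you want a quick coverage total without opening the HTML report, lcov trace files can be summed directly: `LF` and `LH` are the per-file "lines found" and "lines hit" records in the lcov format. A sketch against a tiny sample file (point the `awk` line at `coverage/lcov.info` in a real checkout):

```shell
# Sum LF (lines found) and LH (lines hit) records for overall line coverage.
cat > /tmp/sample-lcov.info <<'EOF'
SF:src/example.ts
LF:10
LH:7
end_of_record
SF:src/other.ts
LF:20
LH:18
end_of_record
EOF
awk -F: '/^LF/ {lf+=$2} /^LH/ {lh+=$2} END {printf "lines: %d/%d (%.1f%%)\n", lh, lf, 100*lh/lf}' /tmp/sample-lcov.info
# -> lines: 25/30 (83.3%)
```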
Repository Structure
- `src/` - core CLI/runtime
- `scripts/` - build, verification, and maintenance scripts
- `docs/` - setup, contributor, and project documentation
- `vscode-extension/socc-vscode/` - VS Code extension package for SOCC
- `.github/` - repo automation, templates, and CI configuration
- `bin/` - CLI launcher entrypoints
VS Code Extension
The repo includes a VS Code extension package in vscode-extension/socc-vscode for SOCC launch integration, provider-aware control-center UI, and theme support.
Security
If you believe you found a security issue, see SECURITY.md.
Community
- Use GitHub Discussions for Q&A, ideas, and community conversation
- Use GitHub Issues for confirmed bugs and actionable feature work
Contributing
Contributions are welcome.
For larger changes, open an issue first so the scope is clear before implementation. Helpful validation commands include:
- `bun run build`
- `bun run test:coverage`
- `bun run smoke`
- focused `bun test ...` runs for touched areas
Disclaimer
SOCC is an independent community project and is not affiliated with, endorsed by, or sponsored by Anthropic.
The SOCC codebase originated from the Claude Code codebase and was later extended for broader provider support and open use. "Claude" and "Claude Code" are trademarks of Anthropic PBC. See LICENSE for details.
License
See LICENSE.
