@epic-cloudcontrol/daemon
v0.6.0 · CloudControl local daemon: executes AI agent tasks on worker machines.
The local worker for CloudControl — an AI orchestration platform where cloud governance meets local execution. Your machine pulls tasks from the cloud, executes them with any AI model, and reports results back. The cloud never touches your data.
Install
```
npm install -g @epic-cloudcontrol/daemon
```

Requires Node.js 20+.
Quick Start
```
# 1. Log in (opens browser — no typing in terminal)
cloudcontrol login

# 2. Install as auto-start service
cloudcontrol install

# 3. Done. Check it's running.
cloudcontrol status
```

The daemon now runs in the background, polls all your companies for tasks, and executes them via AI.
How It Works
```
        CLOUD (cloudcontrol.onrender.com)
┌────────────────────────────────┐
│ Task Queue · Secrets Vault     │
│ Process Library · Audit Logs   │
│ COO Agent · Approvals          │
└──────────────┬─────────────────┘
               │ HTTPS (outbound only)
               │
YOUR MACHINE   │
┌──────────────▼─────────────────┐
│ cloudcontrol daemon            │
│ ├── Polls for tasks            │
│ ├── Claims task (atomic)       │
│ ├── Selects AI model           │
│ │   ├── Claude Code            │
│ │   ├── Gemini CLI             │
│ │   ├── GPT-4o (API)           │
│ │   ├── Ollama (local)         │
│ │   └── Any CLI tool           │
│ ├── Executes locally           │
│ ├── Submits result to cloud    │
│ └── Flushes secrets            │
└────────────────────────────────┘
```

Key principles:
- The cloud never executes tasks. It's a task board — stores tasks, routes them, records results.
- Your machine pulls work. Outbound HTTPS only. No open ports. Works behind firewalls and VPNs.
- Secrets never touch the AI. The daemon requests secrets, injects them into tool calls (not the model prompt), and flushes after each task.
- Any AI model. Claude, GPT-4o, Gemini, Ollama, or any CLI tool. The cloud doesn't know or care which model ran.
Commands
Account
| Command | Description |
|---------|-------------|
| cloudcontrol login | Log in via browser (device code flow) |
| cloudcontrol whoami | Show current identity and connection status |
| cloudcontrol profiles | List all saved company profiles |
| cloudcontrol switch <profile> | Switch active company |
| cloudcontrol refresh | Discover new companies (no password needed) |
| cloudcontrol logout <profile> | Remove a company profile |
| cloudcontrol logout --all | Remove all profiles |
Tasks
| Command | Description |
|---------|-------------|
| cloudcontrol tasks | List tasks (default: all statuses) |
| cloudcontrol tasks --status pending | Filter by status |
| cloudcontrol tasks --id <uuid> | View task details + result |
| cloudcontrol tasks --create | Create a task interactively |
Daemon
| Command | Description |
|---------|-------------|
| cloudcontrol start | Start daemon (foreground, one company) |
| cloudcontrol start --all | Start daemon (foreground, all companies) |
| cloudcontrol install | Install as auto-start service (runs --all) |
| cloudcontrol uninstall | Remove auto-start service |
| cloudcontrol status | Show config, connection, models, service status |
| cloudcontrol logs | View daemon logs |
| cloudcontrol logs --follow | Tail logs in real-time |
| cloudcontrol logs --errors | View error log |
Models
| Command | Description |
|---------|-------------|
| cloudcontrol models | List available AI models |
| cloudcontrol mcp | Start MCP server for Claude Desktop |
Multi-Company
The daemon supports multiple companies from one machine:
```
# Login syncs all your companies automatically
cloudcontrol login

# See them
cloudcontrol profiles

# Switch the active default
cloudcontrol switch epic-design-labs

# Run all at once (one poll loop per company)
cloudcontrol start --all

# Or run one specific company
cloudcontrol start --profile argus-report
```

When installed as a service (cloudcontrol install), it runs all companies automatically.
New companies added in the dashboard are discovered on next restart, or manually:
```
cloudcontrol refresh
```

AI Models
The daemon auto-detects available models and routes tasks to the best match.
Supported Providers
| Provider | Models | Setup |
|----------|--------|-------|
| Claude API | claude-sonnet, claude-haiku | export ANTHROPIC_API_KEY=sk-ant-... |
| OpenAI API | gpt-4o, gpt-4o-mini | export OPENAI_API_KEY=sk-... |
| Google API | gemini-pro, gemini-flash | export GOOGLE_API_KEY=... |
| Claude Code | claude-code | Install: npm i -g @anthropic-ai/claude-code |
| Gemini CLI | gemini | Install: npm i -g @google/gemini-cli |
| Ollama | llama3, mistral, etc. | export CLOUDCONTROL_OLLAMA_MODELS=llama3,mistral |
| Custom CLI | any | export CLOUDCONTROL_CLI_MODELS="name:binary:args" |
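The CLOUDCONTROL_CLI_MODELS value packs three colon-separated fields. The sketch below only illustrates how an entry decomposes (the model name and binary are made up, and the splitting here is for illustration, not the daemon's actual parser):

```shell
# Hypothetical custom entry (format: name:binary:args)
export CLOUDCONTROL_CLI_MODELS="mylocal:llamafile:--temp 0.2"

# Split the entry to show what each field carries
entry="$CLOUDCONTROL_CLI_MODELS"
name="${entry%%:*}"     # first field: the model name
rest="${entry#*:}"
binary="${rest%%:*}"    # second field: the executable to invoke
args="${rest#*:}"       # remainder: extra arguments passed to it
echo "model=$name binary=$binary args=$args"
```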
Task Routing
Each task has a modelHint that controls which model executes it:
| Hint | What Runs | Use Case |
|------|-----------|----------|
| auto (default) | First local CLI, or cheapest API | General tasks |
| smartest | claude-sonnet, gpt-4o, gemini-pro | Complex reasoning |
| cheapest | claude-haiku, gpt-4o-mini, gemini-flash | High-volume tasks |
| fastest | claude-haiku | Time-sensitive |
| code | Code-optimized model | Development tasks |
| local | CLI tools or Ollama | Data stays on machine |
| gpt-4o | Exact model match | Specific model needed |
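As an illustration of where the hint lives (the exact task schema isn't documented here, so every field except modelHint is hypothetical), a task pinned to the cheapest tier might carry:

```json
{
  "title": "Summarize weekly metrics",
  "modelHint": "cheapest"
}
```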
Check available models:
```
cloudcontrol models
```

Claude Desktop Integration (MCP)
Add to your Claude Desktop config:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%\Claude\claude_desktop_config.json
```json
{
  "mcpServers": {
    "cloudcontrol": {
      "command": "cloudcontrol",
      "args": ["mcp"]
    }
  }
}
```

For a specific company:
```json
{
  "mcpServers": {
    "cloudcontrol": {
      "command": "cloudcontrol",
      "args": ["mcp", "--profile", "my-company"]
    }
  }
}
```

This gives Claude Desktop 5 tools: cloudcontrol (account/COO), cloudcontrol_tasks, cloudcontrol_work, cloudcontrol_team, cloudcontrol_search.
Auto-Start
cloudcontrol install creates a platform-specific background service:
| Platform | Mechanism | Config Location |
|----------|-----------|----------------|
| macOS | LaunchAgent | ~/Library/LaunchAgents/com.cloudcontrol.daemon.plist |
| Linux | systemd user service | ~/.config/systemd/user/cloudcontrol-daemon.service |
| Windows | Scheduled Task | CloudControlDaemon |
```
cloudcontrol install     # Install and start
cloudcontrol status      # Check "Daemon service: Running (PID ...)"
cloudcontrol logs        # View output
cloudcontrol uninstall   # Remove
```

Logs are stored at ~/.cloudcontrol/logs/.
Security
Secrets Never Touch the AI
The daemon requests secrets from the cloud vault, injects them into tool calls (Playwright, HTTP requests), and flushes them from memory after each task. The AI model never sees raw credentials.
Sandbox Restrictions
Tasks execute with:
- Blocked commands: rm -rf, sudo, dd, chmod 777, and 20+ dangerous patterns
- Blocked paths: /etc/shadow, ~/.ssh, ~/.aws, /root, /proc
- Environment filtering: Only safe vars (HOME, PATH, SHELL, NODE_ENV) passed to processes
- Output limits: 1MB max per task
- Isolated temp dirs: Per-task /tmp/cloudcontrol-<id>, cleaned up after execution
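The blocked-command check can be pictured as a pattern match over the command string. This is only a sketch of the idea, not the daemon's actual implementation or its full pattern list:

```shell
# Illustrative blocklist check: returns 0 (blocked) when the command
# string contains a dangerous pattern, 1 (allowed) otherwise.
is_blocked() {
  case "$1" in
    *"rm -rf"*|*"sudo "*|*"chmod 777"*|*"dd if="*) return 0 ;;
    *) return 1 ;;
  esac
}

is_blocked "sudo reboot" && echo "blocked"
is_blocked "ls -la" || echo "allowed"
```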
Pull-Based Architecture
Your machine only makes outbound HTTPS requests. No open ports, no inbound connections, no firewall rules needed. Same security model as GitHub Actions runners.
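One quick way to sanity-check this yourself on macOS or Linux (assuming lsof is installed and the daemon's process name contains "cloudcontrol"):

```shell
# The daemon makes outbound requests only, so it should never hold
# a listening TCP socket.
if lsof -nP -iTCP -sTCP:LISTEN 2>/dev/null | grep -qi cloudcontrol; then
  echo "unexpected listening socket"
else
  echo "no open ports"
fi
```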
Configuration
All config stored in ~/.cloudcontrol/:
```
~/.cloudcontrol/
├── config.json                 # Default (active) profile
├── profiles/
│   ├── epic-design-labs.json
│   ├── argus-report.json
│   └── ...
└── logs/
    ├── daemon.log              # stdout
    └── daemon.err              # stderr
```

Profile format:
```json
{
  "apiUrl": "https://cloudcontrol.onrender.com",
  "apiKey": "cc_...",
  "workerName": "worker-my-machine",
  "teamName": "Epic Design Labs"
}
```

Troubleshooting
Daemon not picking up tasks
- Check the profile: cloudcontrol whoami — is it the right company?
- Switch if needed: cloudcontrol switch <correct-profile>
- Check tasks exist: cloudcontrol tasks --status pending
- Check models: cloudcontrol models — is at least one model available?
- Check connection: cloudcontrol status — is the cloud reachable?
LaunchAgent not starting
- Check logs: cloudcontrol logs --errors
- Reinstall: cloudcontrol uninstall && cloudcontrol install
- Verify PATH: The daemon needs CLI tools (claude, gemini) in its PATH. Run cloudcontrol install from the same terminal where those tools work.
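A quick way to confirm the CLIs resolve from the shell you are installing from (command -v prints the resolved path when a command is found):

```shell
# Each line prints the tool's path if found, or a warning otherwise
command -v claude || echo "claude not on PATH"
command -v gemini || echo "gemini not on PATH"
```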
"No AI models available"
Install at least one:
```
# Option 1: Claude Code CLI
npm install -g @anthropic-ai/claude-code

# Option 2: API key
export ANTHROPIC_API_KEY=sk-ant-...

# Option 3: Gemini CLI
npm install -g @google/gemini-cli
```

Wrong company's tasks
```
cloudcontrol whoami                  # Check which company is active
cloudcontrol switch my-company       # Switch
cloudcontrol tasks --status pending  # Verify tasks
```

Task stuck in "claimed" status
The watchdog service in the cloud automatically requeues stuck tasks after a timeout. You can also manually restart the daemon to release claims.
Process YAML
Tasks can follow step-by-step YAML workflows. Create them in the dashboard under Processes, or via the API.
Example (directory-submission.yaml):
```yaml
name: "Directory Submission"
version: 1
task_type: "directory_submission"
description: "Submit a website to a directory"
credentials_needed:
  - "gmail"
steps:
  - id: "fetch_form"
    type: "browser"
    action: "navigate"
    target: "{{ task.directory_url }}"
  - id: "fill_form"
    type: "ai"
    prompt: "Fill the directory form with company info"
    gate: "none"
  - id: "wait_for_email"
    type: "human"
    prompt: "Check email for verification link"
    gate: "approval_required"
  - id: "verify_success"
    type: "ai"
    prompt: "Check if the listing is live"
    gate: "none"
```

Step types: ai, human, browser, api, shell
Gates: none (auto-continue), approval_required (pauses for human)
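For illustration only, a minimal process exercising the remaining step types might look like the fragment below. The api and shell field names are assumed by analogy with the browser step above, not taken from the schema:

```yaml
name: "Site Health Check"
version: 1
task_type: "health_check"
description: "Ping the site, then pause for a human-approved shell check"
steps:
  - id: "ping_site"
    type: "api"
    action: "get"
    target: "{{ task.site_url }}"
  - id: "disk_space"
    type: "shell"
    prompt: "df -h"
    gate: "approval_required"
```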
Environment Variables
| Variable | Purpose | Default |
|----------|---------|---------|
| ANTHROPIC_API_KEY | Claude API access | — |
| OPENAI_API_KEY | OpenAI API access | — |
| GOOGLE_API_KEY | Google Gemini API access | — |
| CLOUDCONTROL_OLLAMA_MODELS | Ollama models (comma-separated) | — |
| CLOUDCONTROL_CLI_MODELS | Custom CLI models | — |
| CLOUDCONTROL_LOG_LEVEL | Log level: debug, info, warn, error | info |
| CLOUDCONTROL_LOG_FORMAT | Log format: json for structured output | human-readable |
| CLOUDCONTROL_PROFILE | Default profile name | default |
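For example, a worker set up for verbose structured logging with a single local Ollama model could export (values are illustrative):

```shell
# Structured debug logs, one local model
export CLOUDCONTROL_LOG_LEVEL=debug
export CLOUDCONTROL_LOG_FORMAT=json
export CLOUDCONTROL_OLLAMA_MODELS=llama3
```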
Requirements
- Node.js 20+
- A CloudControl account (self-hosted or cloud at cloudcontrol.onrender.com)
- At least one AI model (CLI tool or API key)
