@flue/cli
v0.0.49
CLI for running [Flue](https://github.com/withastro/flue) AI-enabled workflows, locally or in GitHub Actions.
Install
bun install @flue/cli
npm install @flue/cli
pnpm install @flue/cli
Usage
flue run <workflowPath> [--args <json>] [--model <provider/model>] [--sandbox <image>]
flue run .flue/workflows/triage.ts
flue run .flue/workflows/triage.ts --model anthropic/claude-sonnet-4-5
flue run .flue/workflows/triage.ts --args '{"issueNumber": 123}'
flue run .flue/workflows/triage.ts --sandbox my-org/my-sandbox:latest
The CLI auto-starts an OpenCode server if one isn't already running. The opencode binary must be installed and on PATH.
Sandbox Mode
The --sandbox <image> flag runs the OpenCode server inside a Docker container for security isolation. The LLM and its tool calls execute inside the container, while the host retains control of secrets (like API keys). Shell commands via flue.shell() also execute inside the container.
Prerequisites: Docker (also available on GitHub Actions runners). Your container image must have OpenCode and git installed, and should start the OpenCode server on port 48765. Any other tools your workflows need (e.g. curl, pnpm) can be baked into the image as well.
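A minimal sandbox image might look like the following sketch. The base image, the OpenCode install step, and the serve command are assumptions, not an official recipe — adjust them for your environment:

```dockerfile
# Hypothetical sandbox image sketch; package names and install
# steps are assumptions, not an official recipe.
FROM node:22-slim

# git is required; curl and pnpm are examples of extra workflow tools
RUN apt-get update && apt-get install -y git curl \
    && rm -rf /var/lib/apt/lists/* \
    && npm install -g opencode-ai pnpm

# Start the OpenCode server on the port the CLI expects
EXPOSE 48765
CMD ["opencode", "serve", "--port", "48765"]
```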
Model Configuration
The CLI uses the local OpenCode server's model configuration. Either:
- Pass --model to the CLI: flue run workflow.ts --model anthropic/claude-sonnet-4-5
- Or set "model" in your project's opencode.json
Provider API keys (e.g. ANTHROPIC_API_KEY, OPENAI_API_KEY) are read from the environment at runtime.
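In GitHub Actions, for example, keys can be passed in from repository secrets. The step below is illustrative — it assumes @flue/cli is already installed in the job and that the workflow and sandbox image names match your repo:

```yaml
# Illustrative GitHub Actions step; adjust paths and image to your repo
- name: Run triage workflow
  run: npx flue run .flue/workflows/triage.ts --sandbox my-org/my-sandbox:latest
  env:
    ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
```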
