enrai v0.1.19
Enrai
Enrai is a self-hosted coding workspace that combines an editor, terminal, preview, file explorer, and an AI chat interface in a single web app.
It is built for local-first and internal deployments where the app is expected to run on a trusted machine or behind a firewall, with access to a real project workspace and shell environment.
Features
- Monaco-based code editor with tabs, quick open, save flow, and jump-to-file references.
- Embedded terminal with PTY-backed shell sessions.
- AI chat panel with streaming responses, rollout switching, model switching, and working directory control.
- Backend Codex task queue for enqueuing work against a new or existing rollout.
- File explorer with create, rename, delete, upload, drag-and-drop, and right-click actions.
- Local preview tabs with a branded fallback screen when the target app is unavailable.
- Sync between active rollout, AI working directory, and explorer root.
- Portable mode that serves the frontend and backend from the same Express process.
- npx-ready CLI for local startup and portable deployment.
Repository Layout
```
frontend/        Next.js application
backend/         Express API, WebSocket server, workspace logic
backend/codex/   Embedded Enrai session/rollout handler
bin/             CLI entrypoints
scripts/         Setup scripts
```

Important paths:

- `frontend/src/app/`
- `frontend/src/components/`
- `frontend/src/lib/`
- `backend/server.js`
- `bin/enrai.cjs`
Architecture
Frontend
- Next.js 14 App Router
- React 18
- Tailwind CSS
- Monaco Editor
- xterm
Backend
- Express
- WebSocket
- node-pty
- embedded Enrai session handler
Codex Task Queue
The embedded Codex backend exposes a queue API at `/api/codex/tasks`.
Behavior:
- If you send a `rollout` or `resume_path`, the task resumes that rollout and continues from its thread history.
- If you do not send a rollout, the backend starts a new Codex session for that task.
- Tasks are executed one at a time.
- Queue state is also broadcast over the existing Codex SSE stream as `task_queue`.
Example:
```shell
curl -X POST http://localhost:3001/api/codex/tasks \
  -H 'Content-Type: application/json' \
  -d '{
    "text": "Add a health endpoint and test it",
    "rollout": "rollout-2026-03-18T20-28-17-019d03b5-2ac6-77a3-91d9-97c1c18d5f2d.jsonl",
    "workdir": "./backend/projects/demo"
  }'
```

Accepted payload fields:

- `text` or `prompt`: required task text
- `rollout`, `resume_path`, or `path`: optional rollout to continue
- `workdir`: optional working directory override
- `instructions`: optional extra instructions for this queued task
- `manifest`: optional app-level manifest object/string that is injected into the task prompt
- `metadata`: optional opaque metadata stored with the task
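For illustration, a payload that exercises every optional field might look like this (the `manifest` and `metadata` values here are made up, not a required schema):

```json
{
  "text": "Add a health endpoint and test it",
  "rollout": "rollout-2026-03-18T20-28-17-019d03b5-2ac6-77a3-91d9-97c1c18d5f2d.jsonl",
  "workdir": "./backend/projects/demo",
  "instructions": "Run tests before finishing",
  "manifest": { "goal": "safe-change" },
  "metadata": { "requestedBy": "ci" }
}
```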
Routes:
- `GET /api/codex/tasks`
- `POST /api/codex/tasks`
- `DELETE /api/codex/tasks/:id`
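The `task_queue` broadcasts mentioned above arrive over standard Server-Sent Events. A minimal consumer only needs to split the stream into frames and match on the event name; the sketch below does that for a raw text chunk (the `{"tasks":[]}` payload shape is a hypothetical example, not Enrai's documented schema):

```javascript
// Minimal SSE frame parser: splits a text chunk into { event, data } frames.
// Used here to pick out hypothetical task_queue broadcasts from the stream.
function parseSseFrames(text) {
  return text
    .split("\n\n")
    .filter((frame) => frame.trim() !== "")
    .map((frame) => {
      const out = { event: "message", data: "" };
      for (const line of frame.split("\n")) {
        if (line.startsWith("event:")) out.event = line.slice(6).trim();
        else if (line.startsWith("data:")) out.data += line.slice(5).trim();
      }
      return out;
    });
}

// Hypothetical task_queue frame as it might appear on the wire:
const frames = parseSseFrames('event: task_queue\ndata: {"tasks":[]}\n\n');
console.log(frames[0].event); // "task_queue"
```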
Codex Behavior Model
There is no OpenAI-defined “queue manifest” for Codex jobs that this project can submit as a first-class object.
The closest official control points are:
- `AGENTS.md` inside the repository, which Codex uses as project guidance
- `model_instructions_file` in `~/.codex/config.toml` for global instructions
- additional runtime developer instructions passed by the harness itself
In Enrai, the queue endpoint therefore supports a lightweight app-level manifest field. This is not an OpenAI standard; it is simply serialized into the task prompt so you can define execution preferences per queued job without mutating global Codex config.
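The exact serialization is internal to Enrai; as a sketch, folding a manifest and per-task instructions into the prompt could look like this (the section labels and layout are illustrative, not Enrai's actual format):

```javascript
// Illustrative sketch: fold an optional manifest and extra instructions
// into the final prompt text for a queued task. The "Manifest:" and
// "Instructions:" labels are hypothetical, not Enrai's real serialization.
function buildTaskPrompt(text, { instructions, manifest } = {}) {
  const parts = [];
  if (manifest) {
    const body =
      typeof manifest === "string" ? manifest : JSON.stringify(manifest);
    parts.push(`Manifest:\n${body}`);
  }
  if (instructions) parts.push(`Instructions:\n${instructions}`);
  parts.push(text);
  return parts.join("\n\n");
}

const prompt = buildTaskPrompt("Add a new endpoint", {
  instructions: "Run tests before finishing",
  manifest: { goal: "safe-change" },
});
console.log(prompt);
```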
Requirements
- Node.js 22+
- npm
- Linux, macOS, or WSL for PTY support
Quick Start
Install dependencies:

```shell
npm install
cd backend && npm install
```

Run the frontend:

```shell
npm run dev
```

Run the backend in another terminal:

```shell
cd backend
npm run dev
```

Default URLs:

- Frontend: `http://localhost:3000/editor`
- Backend: `http://localhost:3001`
Portable Mode
Portable mode serves the Next.js frontend from the same Express process used by the backend.
Build:

```shell
npm run build:portable
```

Start:

```shell
npm run start:portable
```

URL: `http://localhost:3001/editor`

Notes:

- Portable builds use `.next-portable`.
- Do not reuse the regular `.next` development output for portable mode.
CLI / NPX
The repository is prepared to run as:

```shell
npx enrai@latest start
```

Supported commands:

```shell
npx enrai@latest start
npx enrai@latest start --port 4010 --workdir ./backend/projects/demo
npx enrai@latest build
npx enrai@latest rebuild
npx enrai@latest bootstrap-ubuntu
npx enrai@latest task enqueue "Add a health endpoint"
npx enrai@latest task list
npx enrai@latest task cancel task-3
npx enrai@latest --config ./enrai.config.json
```

What the CLI does:
- installs backend dependencies if they are missing
- builds portable mode on first run
- starts the app through the unified Express server
Configuration
The CLI supports a local JSON config file.
Default lookup order:
1. `--config ./path/to/file.json`
2. `ENRAI_CONFIG`
3. `./enrai.config.json`
4. `./.enrai/config.json`
5. `~/.config/enrai/config.json`
A starter file is included at `enrai.config.example.json`.
Example:
```json
{
  "port": 3001,
  "editorWorkdir": "./backend/projects",
  "nextDistDir": ".next-portable",
  "serveNextFromExpress": true,
  "verbose": false,
  "env": {
    "OPENAI_API_KEY": ""
  }
}
```

CLI flags override config file values. Effective precedence, from highest to lowest:
- CLI flags
- config file
- environment variables
- built-in defaults
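That precedence amounts to a left-to-right merge in which higher-priority sources win; a sketch of the idea (not the CLI's actual implementation):

```javascript
// Sketch of the documented precedence: CLI flags > config file >
// environment variables > built-in defaults.
// Drop undefined entries per source so an unset flag cannot
// shadow a value supplied by a lower-priority layer.
const defined = (obj) =>
  Object.fromEntries(Object.entries(obj).filter(([, v]) => v !== undefined));

function resolveConfig({ flags = {}, file = {}, env = {}, defaults = {} }) {
  // Later spreads win, so list sources from lowest to highest priority.
  return {
    ...defined(defaults),
    ...defined(env),
    ...defined(file),
    ...defined(flags),
  };
}

const cfg = resolveConfig({
  flags: { port: 4010 },
  file: { port: 3001, verbose: false },
  defaults: { port: 3001, workdir: "./backend/projects" },
});
console.log(cfg.port); // 4010 — the CLI flag beats the config file
```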
Useful flags:
```shell
--config ./enrai.config.json
--port 3001
--workdir ./backend/projects
--next-dist-dir .next-portable
--serve-next-from-express
--verbose
--codex-home ~/.codex
--codex-config-path ~/.codex/config.toml
--openai-api-key sk-...
```

Task commands:

```shell
enrai task enqueue "Refactor the auth middleware"
enrai task enqueue --rollout rollout-2026-03-18T20-28-17-019d03b5-2ac6-77a3-91d9-97c1c18d5f2d.jsonl "Continue the refactor"
enrai task enqueue --instructions "Run tests before finishing" --manifest '{"goal":"safe-change"}' "Add a new endpoint"
enrai task list
enrai task cancel task-1
```

Environment Variables
Copy `.env.example` and override only what you need.
Backend
| Variable | Default | Description |
| --- | --- | --- |
| PORT | 3001 | Express backend port |
| EDITOR_WORKDIR | ./backend/projects | Default workspace root |
| SERVE_NEXT_FROM_EXPRESS | unset | Serve the frontend from Express when set to 1 |
| NEXT_DIST_DIR | .next | Next.js build directory |
| ENRAI_VERBOSE | 0 | Enables detailed backend logs when truthy |
| PREFERRED_PROVIDER | cerebras | Default AI provider |
| CEREBRAS_API_KEY | unset | Cerebras API key |
| CEREBRAS_MODEL | unset | Optional Cerebras model override |
| OPENAI_API_KEY | unset | OpenAI API key |
| OPENAI_MODEL | unset | Optional OpenAI model override |
| CODEX_HOME | unset | Optional Codex home directory |
| CODEX_CONFIG_PATH | unset | Optional Codex config path |
Frontend
| Variable | Default | Description |
| --- | --- | --- |
| NEXT_PUBLIC_EDITOR_BACKEND_URL | derived from current origin | Explicit backend origin for frontend API calls |
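As an example, a minimal `.env` for a local OpenAI-backed setup might look like the fragment below (values are placeholders, and `openai` as a `PREFERRED_PROVIDER` value is an assumption alongside the documented default `cerebras`):

```
PORT=3001
EDITOR_WORKDIR=./backend/projects
PREFERRED_PROVIDER=openai
OPENAI_API_KEY=sk-...
ENRAI_VERBOSE=0
```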
PM2
Use PM2 if you want to keep Enrai running on a server or internal workstation.
```shell
npm run build:portable
npm run pm2:start
pm2 save
sudo env PATH=$PATH:/usr/bin pm2 startup systemd -u "$USER" --hp "$HOME"
```

Useful commands:

```shell
npm run pm2:logs
npm run pm2:restart
npm run pm2:stop
```

Ubuntu Bootstrap
The repository includes `scripts/bootstrap-ubuntu.sh` to prepare an Ubuntu machine with:

- Node.js 22
- zsh
- Oh My Zsh
- fish
- fzf
- ripgrep
- Codex CLI
- PM2

Run:

```shell
npm run bootstrap:ubuntu
```

Security Model
Enrai is designed for:
- local development
- internal teams
- trusted networks
- firewall-protected environments
Why:
- the backend can read and write real workspace files
- the terminal can execute shell commands
- AI sessions can operate directly on the project tree
If you want to expose it publicly, add your own hardening layer first:
- authentication
- reverse proxy restrictions
- process isolation
- access control
- audit logging
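As one example of such a layer, a reverse proxy in front of the backend could enforce basic auth and an IP allowlist. The nginx sketch below is illustrative only, is not shipped with Enrai, and assumes a `10.0.0.0/8` internal network and a pre-generated htpasswd file:

```nginx
# Illustrative hardening sketch (not part of Enrai).
server {
    listen 443 ssl;
    server_name enrai.internal.example;

    # Restrict to the internal network and require credentials.
    allow 10.0.0.0/8;
    deny all;
    auth_basic "Enrai";
    auth_basic_user_file /etc/nginx/enrai.htpasswd;

    location / {
        proxy_pass http://127.0.0.1:3001;
        # WebSocket upgrade for the terminal and live sessions.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```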
Development Notes
- The frontend lives in `frontend/`.
- The backend lives in `backend/`.
- Runtime workspace content lives in `backend/projects/`.
- `backend/codex/` is embedded as a subsystem, not a separate service.
License
MIT. See LICENSE.
