@denieler/e2b-codex
v0.1.9
Build and use E2B sandboxes with Codex preinstalled and codex app-server ready over websocket.
Use a shared E2B template with Codex preinstalled, then create fresh sandboxes from that template and talk to codex app-server over websocket.
This project covers two jobs:
- runtime use: create a sandbox, connect to Codex, send prompts, continue conversations
- template development: build and publish the shared E2B template
Runtime Use
Install the package:
```
npm install @denieler/e2b-codex
```

What you need

- `E2B_API_KEY`
- `OPENAI_PROXY_TOKEN`
- `E2B_TEMPLATE_ID`

`E2B_TEMPLATE_ID` must point to a template that already has Codex installed.
Run one prompt
If you want to run the checked-in smoke test, run:
```
E2B_TEMPLATE_ID=<template-id> doppler run -- npm run test:smoke
```

What this does:

- Creates a new sandbox from `E2B_TEMPLATE_ID`
- Requests a short-lived proxy token from https://openai-proxy-denieler.fly.dev/auth/token
- Signs that auth request with `OPENAI_PROXY_HMAC_SECRET`
- Verifies the returned JWT locally with `OPENAI_PROXY_TOKEN_SECRET`
- Starts `codex app-server` inside the sandbox using that token
- Opens an authenticated websocket connection
- Sends two turns on the same thread, both involving tool calls
- Fails if any reply is not the expected final text
For the smoke test, Doppler must provide:
- `E2B_API_KEY`
- `E2B_TEMPLATE_ID`
- `OPENAI_PROXY_HMAC_SECRET`
- `OPENAI_PROXY_TOKEN_SECRET`
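The signing and local-verification steps above can be sketched with Node's built-in crypto module. This is a minimal sketch only: the exact request payload and header the proxy expects are not shown here, and `signBody` / `verifyJwtHs256` are illustrative helpers, not part of this package's API.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Sign a request body with HMAC-SHA256, as the smoke test does for the
// /auth/token request. How the signature is attached to the request
// (header name, payload shape) is the proxy's contract, not shown here.
function signBody(body: string, hmacSecret: string): string {
  return createHmac("sha256", hmacSecret).update(body).digest("hex");
}

// Verify an HS256 JWT locally against the shared token secret,
// using a constant-time comparison for the signature.
function verifyJwtHs256(token: string, tokenSecret: string): boolean {
  const [header, payload, signature] = token.split(".");
  if (!header || !payload || !signature) return false;
  const expected = createHmac("sha256", tokenSecret)
    .update(`${header}.${payload}`)
    .digest("base64url");
  const a = Buffer.from(signature);
  const b = Buffer.from(expected);
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Verifying locally with `OPENAI_PROXY_TOKEN_SECRET` lets the smoke test catch a misconfigured proxy secret without making a second network call.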
The proxy app listens on internal port 8080, but Fly exposes it publicly on standard HTTPS, so the smoke test uses the public URL without :8080.
Create a ready sandbox in code
```typescript
import { createReadyCodexSandbox } from "@denieler/e2b-codex";

const ready = await createReadyCodexSandbox({
  e2bApiKey: process.env.E2B_API_KEY!,
  templateId: process.env.E2B_TEMPLATE_ID!,
  openAiProxyToken: process.env.OPENAI_PROXY_TOKEN!,
  userId: "user-123",
});
```

The returned object includes:

- `sandboxId`
- `websocketUrl`
- `authToken`
- `workspaceRoot`
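A minimal shape for that object, assuming only the four fields listed above (the package's actual exported type may include more):

```typescript
// Illustrative shape only — based on the fields listed above,
// not the package's actual exported type.
interface ReadyCodexSandbox {
  sandboxId: string;
  websocketUrl: string;
  authToken: string;
  workspaceRoot: string;
}

// Example: persist just enough to reconnect to the same sandbox later.
function toReconnectRecord(ready: ReadyCodexSandbox): { sandboxId: string } {
  return { sandboxId: ready.sandboxId };
}
```

Persisting `sandboxId` is what makes the reconnect path below possible.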
Connect to an existing sandbox in code
```typescript
import { connectCodexSandbox } from "@denieler/e2b-codex";

const ready = await connectCodexSandbox({
  e2bApiKey: process.env.E2B_API_KEY!,
  sandboxId: "sandbox-id",
  openAiProxyToken: process.env.OPENAI_PROXY_TOKEN!,
  userId: "user-123",
});
```

Use this when you have already persisted `sandboxId` and want to reconnect to the same sandbox instead of creating a new one.
This is also the path to use when a user rejoins later with a fresh OPENAI_PROXY_TOKEN.
On reconnect, the runtime rewrites the Codex config and restarts codex app-server so the newly supplied proxy token is used.
Connect to Codex over websocket
```typescript
import { connectCodexClient } from "@denieler/e2b-codex";

const client = await connectCodexClient(ready);
```

Start a conversation

```typescript
const started = await client.request("thread/start", {
  model: "gpt-5.3-codex",
  cwd: ready.workspaceRoot,
});
const threadId = String((started.thread as { id?: string }).id);
```

Send turns on the same thread
```typescript
await client.request("turn/start", {
  threadId,
  input: [{ type: "text", text: "First message" }],
  cwd: ready.workspaceRoot,
  model: "gpt-5.3-codex",
  effort: "medium",
  approvalPolicy: "never",
  sandboxPolicy: {
    type: "workspaceWrite",
    writableRoots: [ready.workspaceRoot],
    networkAccess: true,
  },
  summary: "concise",
});
```

To continue the same conversation:

- keep the sandbox alive
- reuse the same `threadId`
- send another `turn/start`
If your websocket connection drops, reconnect using the same `websocketUrl` and `authToken`, then continue with the same `threadId`.
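Reconnect logic like this usually wants a bounded backoff between attempts. A small sketch — the delays are illustrative; this package does not prescribe a retry policy:

```typescript
// Bounded exponential backoff delays for websocket reconnect attempts.
// Purely illustrative; not part of this package's API.
function backoffDelayMs(attempt: number, baseMs = 250, capMs = 8000): number {
  // Double the base delay per attempt, capped so retries never wait too long.
  return Math.min(baseMs * 2 ** attempt, capMs);
}
```

Because the thread state lives in the sandbox, a successful reconnect at any attempt picks up the conversation where it left off.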
One-call helper
If you want one fresh sandbox and one prompt, use:
```typescript
import { createReadyCodexSandbox, runPrompt } from "@denieler/e2b-codex";

const sandbox = await createReadyCodexSandbox({
  e2bApiKey: process.env.E2B_API_KEY!,
  templateId: process.env.E2B_TEMPLATE_ID!,
  openAiProxyToken: process.env.OPENAI_PROXY_TOKEN!,
  userId: "user-123",
});

const result = await runPrompt({
  sandbox,
  prompt: "Summarize this workspace in one sentence.",
});
console.log(result.reply);
```

Template Development
Use this section if you are maintaining the template itself.
Install
```
npm install
```

Build the template

```
doppler run -- npm run build:template
```

The build prints:
- template name
- template id
- build id
Use the printed template id as `E2B_TEMPLATE_ID` for runtime use.
If you keep using an older template, the runtime still writes the same Codex provider config at sandbox startup. Rebuilding the template just bakes that config in ahead of time.
What the build does
The template build:
- starts from the E2B base image
- installs a small set of system packages
- downloads the Codex Linux binary from OpenAI GitHub releases
- installs it to `/usr/local/bin/codex`
- writes `/root/.codex/config.toml` with the `custom_openai_proxy` provider
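For orientation, a provider config of this kind might look roughly like the following. The key names here are assumptions based on Codex's TOML config conventions, not the template's actual file contents:

```toml
# Illustrative only — the template's actual config.toml may differ.
model_provider = "custom_openai_proxy"

[model_providers.custom_openai_proxy]
name = "custom_openai_proxy"
base_url = "https://openai-proxy-denieler.fly.dev/v1"
```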
Codex is installed at template build time, not at sandbox startup.
Useful commands
```
npm install
npm run typecheck
doppler run -- npm run build:template
E2B_TEMPLATE_ID=<template-id> doppler run -- npm run test:smoke
```

Verification
After any substantial runtime, websocket, or sandbox change, run:
```
npm run typecheck
npm run build
E2B_TEMPLATE_ID=<template-id> doppler run -- npm run test:smoke
```

The smoke test covers both of these paths:

- proxy auth token issuance with signed `/auth/token` requests and local JWT verification
- direct websocket usage with `thread/start` plus two `turn/start` calls on the same thread, both involving tool calls
Runtime Design
At sandbox startup, the runtime:
- injects `OPENAI_PROXY_TOKEN` into the sandbox environment
- writes `~/.codex/config.toml` selecting the `custom_openai_proxy` provider at https://openai-proxy-denieler.fly.dev/v1
- writes a websocket capability token into the sandbox
- starts `codex app-server` as an E2B background process
- restarts `codex app-server` on sandbox reconnect so a newly supplied `OPENAI_PROXY_TOKEN` takes effect
- waits for the process to stay alive
- opens the websocket connection
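The "waits for the process to stay alive" step can be sketched as a stability poll: succeed only once the process has stayed up for a full window, not on the first alive check. The timings and the `isAlive` callback here are illustrative assumptions, not the package's actual implementation.

```typescript
// Poll a liveness check until the process has been continuously alive
// for `stableMs`, or give up after `timeoutMs`. Illustrative sketch only.
async function waitForStable(
  isAlive: () => boolean,
  { intervalMs = 50, stableMs = 200, timeoutMs = 2000 } = {},
): Promise<boolean> {
  const deadline = Date.now() + timeoutMs;
  let aliveSince: number | null = null;
  while (Date.now() < deadline) {
    if (isAlive()) {
      aliveSince ??= Date.now();
      if (Date.now() - aliveSince >= stableMs) return true;
    } else {
      // Process died; restart the stability window.
      aliveSince = null;
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return false;
}
```

Requiring a stability window guards against the common failure mode where `codex app-server` starts, then immediately exits on a bad config or token.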
codex app-server is started at runtime, not stored as a pre-running process in the template.
Notes
- The template is shared. Sandboxes are ephemeral.
- Secrets are injected when the sandbox is created.
OPENAI_API_KEYis not used by this package.- The websocket token file is written to
/tmp/e2b-codex-ws-token. - The example uses
approvalPolicy: "never"andworkspaceWrite.
Publish
Build the package:
```
npm run build
```

Pack it locally:

```
npm pack
```

Publish when ready:

```
npm publish
```