@longrun-ai/codex-auth v0.4.0
codex-auth
Node-friendly helpers for ChatGPT OAuth that are compatible with Codex
auth.json storage.
Install
pnpm add @longrun-ai/codex-auth

Library Usage
import {
AuthManager,
createChatGptClientFromManager,
createChatGptStartRequest,
runLoginServer,
} from '@longrun-ai/codex-auth';
const manager = new AuthManager();
const auth = await manager.auth();
if (!auth) {
const server = await runLoginServer({ openBrowser: true });
console.log(`Open this URL if your browser did not open: ${server.authUrl}`);
await server.waitForCompletion();
}
const client = await createChatGptClientFromManager(manager);
const payload = createChatGptStartRequest({
model: 'gpt-5.3-codex',
instructions: 'You are Codex CLI.',
tools: [{ type: 'web_search', external_web_access: true }],
userText: 'hello',
});
const response = await client.responses(payload);
const raw = await response.text();
console.log(raw);

Continue a conversation from stored history:
import { createChatGptContinuationRequest } from '@longrun-ai/codex-auth';
const history = JSON.parse(historyJson);
const followup = createChatGptContinuationRequest({
model: 'gpt-5.3-codex',
instructions: 'You are Codex CLI.',
history,
userText: 'continue',
});
await client.trigger(followup, receiver);

responses() returns an SSE stream; parse each data: payload as a
ChatGptResponsesStreamEvent.
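A minimal sketch of that parsing, assuming the standard SSE wire format (one JSON payload per data: line) and using only the event names shown in this README; the helper names are illustrative, not part of the package API:

```typescript
// Sketch: extract JSON payloads from a raw SSE body and concatenate
// output text deltas. The "data:" framing is standard SSE; the exact
// event set is whatever ChatGptResponsesStreamEvent defines.
type DeltaEvent = { type: 'response.output_text.delta'; delta: string };

function parseSseData(raw: string): Array<{ type: string }> {
  const events: Array<{ type: string }> = [];
  for (const line of raw.split('\n')) {
    if (!line.startsWith('data:')) continue;
    const payload = line.slice('data:'.length).trim();
    if (!payload || payload === '[DONE]') continue; // common end-of-stream sentinel
    events.push(JSON.parse(payload));
  }
  return events;
}

function collectOutputText(raw: string): string {
  return parseSseData(raw)
    .filter((e): e is DeltaEvent => e.type === 'response.output_text.delta')
    .map((e) => e.delta)
    .join('');
}
```

With the earlier example, collectOutputText(raw) would recover the full assistant text from the raw body returned by response.text().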
import type { ChatGptEventReceiver, ChatGptResponsesStreamEvent } from '@longrun-ai/codex-auth';
const receiver: ChatGptEventReceiver = {
onEvent(event: ChatGptResponsesStreamEvent) {
switch (event.type) {
case 'response.created':
console.log('created', event.response.id);
break;
case 'response.completed':
console.log('completed', event.response.id);
break;
case 'response.output_text.delta':
process.stdout.write(event.delta);
break;
default: {
const _exhaustive: never = event;
return _exhaustive;
}
}
},
};

Use the receiver with the streaming helper:
await client.trigger(payload, receiver);

Device code flow (headless):
import { runDeviceCodeLogin } from '@longrun-ai/codex-auth';
await runDeviceCodeLogin({
onDeviceCode: (code) => {
console.log('Visit:', code.verificationUrl);
console.log('User code:', code.userCode);
},
});

Auth Doctor (CLI)
The package ships a CLI that inspects auth.json and reports status.
It runs a ChatGPT chat probe unless --no-verify is used.
npx @longrun-ai/codex-auth --json

Refresh tokens (if available):

npx @longrun-ai/codex-auth --refresh

Skip the verification request (no LLM call):

npx @longrun-ai/codex-auth --no-verify

Dump SSE events from the verification request:

npx @longrun-ai/codex-auth --verbose

Probe the remote /models endpoint (enabled by default) and print model metadata:

npx @longrun-ai/codex-auth --probe-models

Disable the /models probe:

npx @longrun-ai/codex-auth --no-probe-models

Override the model or base URL:
npx @longrun-ai/codex-auth --model gpt-5.3-codex \
  --chatgpt-base-url https://chatgpt.com/backend-api/

Override CODEX_HOME:

npx @longrun-ai/codex-auth --codex-home /path/to/.codex

Notes
- Default CODEX_HOME is ~/.codex unless overridden.
- The CLI uses the same file schema as Codex Rust.
- Reasoning/thinking SSE events (response.reasoning_*) only stream when the request enables reasoning (and typically includes reasoning.encrypted_content).
- Built-in tool payloads are supported, including native web_search/local_shell and function/custom tools via the tools field.
- Proxy env vars are detected via HTTP_PROXY, HTTPS_PROXY, and NO_PROXY (case-insensitive). If set, the verification request uses them.
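Given the default described above (CODEX_HOME falling back to ~/.codex), locating auth.json can be sketched as follows; the helper is illustrative and not exported by the package:

```typescript
import os from 'node:os';
import path from 'node:path';

// Illustrative helper (not part of the package API): resolve where
// auth.json lives, honoring CODEX_HOME with a ~/.codex fallback.
function resolveAuthJsonPath(
  env: Record<string, string | undefined> = process.env,
): string {
  const codexHome = env.CODEX_HOME ?? path.join(os.homedir(), '.codex');
  return path.join(codexHome, 'auth.json');
}
```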
