# langchainjs-codex-oauth
Use ChatGPT Codex models through OAuth inside LangChainJS and LangGraph.
In practice, this means you can connect the Codex access attached to your ChatGPT account - including ChatGPT Plus, Pro, Business, and Enterprise accounts when Codex is enabled - to LangChainJS and LangGraph without using an OpenAI API key.
> [!IMPORTANT]
> This project is still in active development. Expect bugs, rough edges, and occasional breaking changes while the package stabilizes. Issues and pull requests are very welcome.
## What it does

- Lets you use the Codex-capable models available in your ChatGPT account from LangChainJS and LangGraph.
- Reuses ChatGPT plan access instead of requiring OpenAI API billing or an API key.
- Exposes a `ChatCodexOAuth` chat model implemented in TypeScript.
- Authenticates locally with ChatGPT OAuth instead of an API key.
- Stores credentials under `~/.langchainjs-codex-oauth/` by default.
- Refreshes expired access tokens automatically.
- Streams text responses and tool-call chunks from the ChatGPT Codex backend.
- Supports `bindTools(...)`, `withStructuredOutput(...)`, LangChain agents, and LangGraph workflows.
## What this is, in plain English
- If you can use Codex from your ChatGPT account, this package lets your LangChainJS or LangGraph code use that same account access.
- It is useful for people who already pay for ChatGPT plans such as Plus, Pro, Business, or Enterprise and want to experiment with LangChainJS or LangGraph without switching to the API platform first.
- It does not turn a ChatGPT subscription into the official OpenAI API. It is an adapter around ChatGPT OAuth and the Codex access available to that account.
- Availability still depends on whether OpenAI has enabled Codex and the requested models for your plan or workspace.
## Requirements

- Node.js `>=20`
- A ChatGPT account with access to Codex-capable models, such as a Plus, Pro, Business, or Enterprise account with Codex enabled
## Compliance note

> [!CAUTION]
> This package is unofficial, and nothing in this note is legal advice.
I could not find OpenAI documentation that explicitly says ChatGPT OAuth account access is approved for general-purpose third-party automation through LangChainJS or LangGraph. OpenAI's Terms of Use currently restrict activities such as "automatically or programmatically extract data or Output" and "circumvent any rate limits or restrictions or bypass any protective measures or safety mitigations," and this package relies on undocumented ChatGPT/Codex endpoints rather than the official API platform.
Because of that, use of this project may be unsupported, a gray area, or inconsistent with OpenAI terms or workspace policies depending on how you use it and what kind of account you use. Review the current OpenAI Terms of Use, Service Terms, and any Business or Enterprise admin policies before using it in production or on an organization-managed workspace.
## Install

Before installing, make sure the account you plan to use can already access Codex in ChatGPT or the Codex app. This package does not grant Codex access on its own.

For the core library:

```sh
pnpm add langchainjs-codex-oauth @langchain/core
```

Add these only if you want the examples or higher-level helpers shown in this README:

```sh
pnpm add langchain @langchain/langgraph zod
```

Notes:

- You do not need an OpenAI API key for this package.
- You do not need the `openai` SDK unless your app also talks to the official OpenAI API separately.

`langchainjs-codex-oauth` is for ChatGPT-account-backed Codex access. If you want the official API platform instead, use the OpenAI API directly.
## Authenticate

Authenticate once, with your ChatGPT account, on the machine where you want to run LangChainJS or LangGraph:

```sh
npx langchainjs-codex-oauth auth login
```

What this does:

- Opens the ChatGPT/OpenAI OAuth flow in your browser
- Signs you in with the ChatGPT account whose Codex access you want to use
- Stores OAuth credentials locally so your code can reuse them without an API key
- Refreshes expired access tokens automatically later

If your browser cannot open automatically, localhost port 1455 is busy, your workspace requires a different browser or session, or you want to finish the OAuth flow by hand:

```sh
npx langchainjs-codex-oauth auth login --manual
```

Other useful commands:

```sh
npx langchainjs-codex-oauth auth status
npx langchainjs-codex-oauth auth logout
```

For local development in this repository:

```sh
pnpm auth:login
pnpm auth:status
pnpm auth:logout
```

Notes:

- The automatic flow starts a local callback server at `http://localhost:1455/auth/callback`.
- `auth login --manual` accepts either the full redirect URL or the raw authorization code.
- Credentials are stored in `~/.langchainjs-codex-oauth/auth/openai.json` by default.
- That auth file gives local access to your ChatGPT/Codex session for this package, so treat it like a secret.
- Expired access tokens are refreshed automatically and written back to the auth file.
- If you use ChatGPT Business or Enterprise, make sure your workspace permits Codex access and local OAuth sign-in before relying on this setup.
## Quickstart

```ts
import { HumanMessage, SystemMessage } from "@langchain/core/messages"
import { ChatCodexOAuth } from "langchainjs-codex-oauth"

const model = new ChatCodexOAuth({
  model: "gpt-5.2-codex",
})

const result = await model.invoke([
  new SystemMessage("You are a concise coding assistant."),
  new HumanMessage("Say hello and give one TypeScript tip."),
])

console.log(result.text)
```

`ChatCodexOAuth` also supports `.stream(...)`, `.batch(...)`, `.bindTools(...)`, and `.withStructuredOutput(...)`.
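With `.stream(...)`, text arrives as chunks whose deltas you concatenate in order. A minimal sketch of that consumption pattern (the `collectText` helper and the stub stream are illustrative, not part of the package; with real auth you would pass `await model.stream([...])` instead of the stub):

```ts
// Collect text deltas from any async iterable of chunks that expose a
// `text` field, which is the shape LangChain streaming chunks provide.
async function collectText(
  stream: AsyncIterable<{ text?: string }>,
): Promise<string> {
  let out = ""
  for await (const chunk of stream) {
    out += chunk.text ?? ""
  }
  return out
}

// Stub stream standing in for `await model.stream([...])`.
async function* fakeStream() {
  yield { text: "Hello" }
  yield { text: ", world" }
}

collectText(fakeStream()).then((text) => console.log(text)) // Hello, world
```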
## Advanced imports

If you need the lower-level auth store or raw Codex client, import them from subpaths:

```ts
import { AuthStore, defaultAuthPath } from "langchainjs-codex-oauth/auth"
import { CodexClient, DEFAULT_INCLUDE } from "langchainjs-codex-oauth/client"
```

## Tool Calling
```ts
import { HumanMessage, ToolMessage } from "@langchain/core/messages"
import { tool } from "@langchain/core/tools"
import { z } from "zod"
import { ChatCodexOAuth } from "langchainjs-codex-oauth"

const add = tool(async ({ a, b }) => `${a + b}`, {
  name: "add_numbers",
  description: "Add two integers and return the result.",
  schema: z.object({
    a: z.number().int(),
    b: z.number().int(),
  }),
})

const model = new ChatCodexOAuth({ model: "gpt-5.2-codex" }).bindTools([add])

const prompt = "What is 17 + 25? Use the add_numbers tool before answering."
const first = await model.invoke([new HumanMessage(prompt)])

const call = first.tool_calls?.[0]
if (!call?.id) {
  throw new Error("The model did not emit a tool call.")
}

const output = await add.invoke(call)
const toolMessage =
  typeof output === "string"
    ? new ToolMessage({
        content: output,
        tool_call_id: call.id,
      })
    : output

const final = await model.invoke([new HumanMessage(prompt), first, toolMessage])
console.log(final.text)
```

Streaming tool calls is also supported. While streaming, the model emits tool-call argument deltas before the final `tool_calls` array is assembled.
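During streaming, those argument deltas arrive as JSON fragments that only parse once the stream is complete. A toy illustration of the assembly (the fragment values are made up; LangChain performs the real accumulation when you concatenate message chunks):

```ts
// Hypothetical argument fragments as they might arrive over a stream.
const deltas = ['{"a": 17', ', "b": 25}']

// Concatenating the fragments in arrival order yields valid JSON.
const args = JSON.parse(deltas.join("")) as { a: number; b: number }
console.log(args.a + args.b) // 42
```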
## Structured Output

```ts
import { z } from "zod"
import { ChatCodexOAuth } from "langchainjs-codex-oauth"

const ContactInfo = z.object({
  name: z.string(),
  email: z.string(),
})

const model = new ChatCodexOAuth({
  model: "gpt-5.2-codex",
}).withStructuredOutput(ContactInfo)

const result = await model.invoke(
  "Extract the contact info from: Jane Roe, [email protected].",
)

console.log(result)
```

Structured output works through function calling. `includeRaw: true` is also supported when you want both the parsed payload and the raw `AIMessage`.
## LangGraph

`ChatCodexOAuth` works in LangGraph agent loops and raw `StateGraph` workflows. See `examples/langgraph/agent.ts` and `examples/README.md` for a runnable example.
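As a starting point, here is a minimal sketch of wiring the model into LangGraph's prebuilt ReAct agent, assuming `@langchain/langgraph` and `zod` are installed and live Codex auth is configured; the maintained, runnable version is `examples/langgraph/agent.ts`:

```ts
import { tool } from "@langchain/core/tools"
import { z } from "zod"
import { createReactAgent } from "@langchain/langgraph/prebuilt"
import { ChatCodexOAuth } from "langchainjs-codex-oauth"

// Same shape as the add_numbers tool from the Tool Calling section.
const add = tool(async ({ a, b }) => `${a + b}`, {
  name: "add_numbers",
  description: "Add two integers and return the result.",
  schema: z.object({ a: z.number().int(), b: z.number().int() }),
})

// Prebuilt ReAct loop: the agent calls the tool, then answers.
const agent = createReactAgent({
  llm: new ChatCodexOAuth({ model: "gpt-5.2-codex" }),
  tools: [add],
})

const state = await agent.invoke({
  messages: [{ role: "user", content: "What is 17 + 25?" }],
})
console.log(state.messages.at(-1)?.content)
```

The agent loop alternates between model calls and tool executions until the model replies without a tool call.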
## Configuration

Constructor options:

- `model`: model name to request, default `gpt-5.2-codex`
- `temperature`
- `maxTokens`
- `reasoningEffort`: `"none"`, `"low"`, `"medium"` (default), `"high"`, or `"xhigh"`
- `reasoningSummary`: `"brief"` or another provider-supported string
- `textVerbosity`: `"low"`, `"medium"` (default), or `"high"`
- `include`: for example `["reasoning.encrypted_content"]`
- `timeout`: request timeout in milliseconds
- `maxRetries`
- `baseURL`
- `authPath`
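Taken together, the options above can be passed to the constructor like this (illustrative values only; every option is optional):

```ts
import { ChatCodexOAuth } from "langchainjs-codex-oauth"

const model = new ChatCodexOAuth({
  model: "gpt-5.2-codex",
  temperature: 0.2,
  maxTokens: 2048,
  reasoningEffort: "high",
  reasoningSummary: "brief",
  textVerbosity: "low",
  include: ["reasoning.encrypted_content"],
  timeout: 60_000, // milliseconds
  maxRetries: 2,
})
```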
Environment variables:

- `LANGCHAINJS_CODEX_OAUTH_BASE_URL`
- `LANGCHAINJS_CODEX_OAUTH_TEMPERATURE`
- `LANGCHAINJS_CODEX_OAUTH_MAX_TOKENS`
- `LANGCHAINJS_CODEX_OAUTH_TIMEOUT_S`
- `LANGCHAINJS_CODEX_OAUTH_MAX_RETRIES`
- `LANGCHAINJS_CODEX_OAUTH_HOME`
- `LANGCHAINJS_CODEX_OAUTH_AUTH_PATH`
`SystemMessage` and LangChain developer chat messages are sent as the top-level backend `instructions` string, in order, joined with blank lines. Regular human, assistant, and tool messages are sent as normal conversation input items.

When no system or developer prompt is present, the client sends an empty `instructions` string because the backend currently rejects requests that omit the field entirely.
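The joining rule can be pictured with plain strings (a hypothetical sketch of the behavior described above, not the package's internal code):

```ts
// Two system/developer prompts, in the order they appear in the input.
const prompts = [
  "You are a concise coding assistant.",
  "Prefer TypeScript in examples.",
]

// They become one backend `instructions` string, joined with blank lines;
// with no prompts at all, the result is the empty string the backend expects.
const instructions = prompts.join("\n\n")
console.log(JSON.stringify(instructions))
// "You are a concise coding assistant.\n\nPrefer TypeScript in examples."
```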
## Examples

```sh
pnpm example:hello
pnpm example:tools
pnpm example:agent
```

See `examples/README.md` for details.
## Live integration tests

```sh
pnpm test:int
```

These tests are skipped automatically when the local auth file is missing.
## Release validation

```sh
pnpm build:release
```

This runs the full release gate: clean, lint, typecheck, unit tests, live integration tests, extended live integration tests, and the final distribution build.
## Notes
- This package is Node-only.
- The package keeps its own auth store and does not read Codex/OpenCode credential files.
- The backend uses undocumented ChatGPT/Codex endpoints, so compatibility may require updates over time.
- This project is best understood as an unofficial bridge from ChatGPT account access to LangChainJS/LangGraph, not as a replacement for the official OpenAI API.
## Contributing
Bug reports, questions, and fixes are welcome.
- Open an issue: https://github.com/jeph/langchainjs-codex-oauth/issues
- Open a pull request: https://github.com/jeph/langchainjs-codex-oauth/pulls
