# llm-schemas

v1.0.2

Shared Zod schemas for validating LLM API request bodies. Covers OpenAI Chat Completions, OpenAI Responses, and Anthropic Messages. Used by both llm-mock-server and copilot-sdk-proxy.
## Install

```sh
npm install llm-schemas
```

Requires Node.js 22+ and Zod 4.
## Usage
Import from a sub-path to get schemas for a specific API, or import from the top level to get everything namespaced.
OpenAI Chat Completions
import {
OpenAIRequestSchema,
MessageSchema,
ToolSchema,
} from "llm-schemas/openai/chat-completions";
const body = OpenAIRequestSchema.parse(req.body);
console.log(body.model, body.messages);OpenAI Responses
```ts
import {
  ResponsesRequestSchema,
  FunctionToolSchema,
  InputMessageSchema,
} from "llm-schemas/openai/responses";

const body = ResponsesRequestSchema.parse(req.body);
console.log(body.model, body.input);
```

### Anthropic Messages
```ts
import {
  AnthropicRequestSchema,
  MessageSchema,
  ToolDefinitionSchema,
} from "llm-schemas/anthropic";

const body = AnthropicRequestSchema.parse(req.body);
console.log(body.model, body.max_tokens, body.messages);
```

### Top-level import
If you need all three in one place, the root export namespaces them so there are no naming collisions.
```ts
import { openai, anthropic } from "llm-schemas";

openai.chatCompletions.OpenAIRequestSchema.parse(body);
openai.responses.ResponsesRequestSchema.parse(body);
anthropic.AnthropicRequestSchema.parse(body);
```

## What's in each schema
### Request schemas
All three request schemas use `z.looseObject()` at the top level, so unknown fields pass through without failing validation. Every field from the official API specs is included. Fields that don't need parsing are typed as `z.unknown().optional()`.
Field counts per schema, as built from the official API specs:
| Schema | Required fields | Total fields |
| ------ | --------------- | ------------ |
| `OpenAIRequestSchema` | `model`, `messages` | 35 |
| `AnthropicRequestSchema` | `model`, `max_tokens`, `messages` | 18 |
| `ResponsesRequestSchema` | (none) | 29 |
The Responses API spec makes both `model` and `input` optional. If your app needs them to be required, add a `.refine()` to the schema.
### Building blocks

#### OpenAI Chat Completions

- `ContentPartSchema` - a content part like `{ type: "text", text: "..." }` (looseObject)
- `MessageSchema` - a chat message with role, content, tool_calls, etc.
- `ToolSchema` - a tool definition with `function.name`, `function.parameters`, `function.strict`
#### Anthropic Messages

- `TextBlockSchema`, `ToolUseBlockSchema`, `ToolResultBlockSchema` - typed content blocks
- `LooseContentBlockSchema` - union of known blocks plus a fallback for unknown types
- `MessageSchema` - a message with a role and string-or-block-array content
- `ToolDefinitionSchema` - a tool with `name`, `description`, `input_schema`
#### OpenAI Responses

- `InputMessageSchema` - a conversation message in the `input` array
- `FunctionCallInputSchema`, `FunctionCallOutputSchema` - function call round-trip items
- `FunctionToolSchema` - a function tool (with `strict` field)
- `RawToolSchema` - accepts any tool shape as `Record<string, unknown>`
Every schema also exports its inferred TypeScript type (e.g. `OpenAIRequest`, `AnthropicRequest`, `Message`, `FunctionTool`).
## Import paths

| Path | What you get |
| ---- | ------------ |
| `llm-schemas` | Everything, namespaced as `openai.chatCompletions`, `openai.responses`, `anthropic` |
| `llm-schemas/openai` | Both OpenAI APIs, namespaced as `chatCompletions` and `responses` |
| `llm-schemas/openai/chat-completions` | Chat Completions schemas only |
| `llm-schemas/openai/responses` | Responses schemas only |
| `llm-schemas/anthropic` | Anthropic schemas only |
## Development

```sh
npm run build   # Compile TypeScript
npm test        # Run tests
npm run lint    # Lint with oxlint
npm run check   # All three: typecheck + lint + test
```