json-rest-schema
v1.0.14
A flexible and extensible schema validation library for JavaScript objects, designed for REST APIs and beyond. Features include type casting, data transformation, and a pluggable architecture for custom rules.
Tutorial
Welcome! This tutorial will walk you through everything you need to know to use the schema validation library effectively. We'll start with the basics and progressively move to more advanced topics like creating your own custom rules.
Published documentation:
https://mobily-enterprises.github.io/json-rest-schema/
Installation
Install the package in your app with:
npm install json-rest-schema
If you are working in this repo and want to run the documentation site locally:
npm install
npm run docs:dev
Build the full static site, including the standalone React and Vue demo apps, with:
npm run docs:build
Preview that built site locally with:
npm run docs:preview
The published docs site presents the same manual as shorter chapters covering:
- create / replace / patch semantics
- nested object and array contracts
- recursive runtime validation and transport export
- field introspection and path-scoped validation
- React Hook Form, Vue + Vuetify, and VeeValidate adapters
- demo app walkthroughs
Getting Started: Your First Schema
Let's start with a common use case: validating a user registration form.
First, import the library's factory function and define the structure of the data you expect.
import { createSchema } from 'json-rest-schema';
// Define the structure and rules for our user data
const userSchema = createSchema({
username: { type: 'string', required: true, minLength: 3 },
email: { type: 'string', required: true },
age: { type: 'number', min: 18, defaultTo: 18 }
});
Now, let's try to validate an object against this schema.
// An example input object from a form
const userInput = {
username: ' alex ', // Includes extra whitespace
email: '[email protected]',
age: '25' // Note: age is a string here
};
const { validatedObject, errors } = userSchema.create(userInput);
// Check if there were any errors by seeing if the errors object has keys
if (Object.keys(errors).length > 0) {
console.log("Validation failed!");
console.log(errors);
} else {
console.log("Validation successful!");
console.log(validatedObject);
}
What happens here?
- The age string '25' is cast to the number 25 by the number type handler.
- The username string ' alex ' is trimmed to 'alex' by the string type handler.
- Since there are no validation errors, the errors object will be empty.
- The validatedObject will contain the clean, cast, and transformed data.
Validation Results and Error Helpers
The schema operation methods return an object with two properties: validatedObject and errors.
The validatedObject
This object contains the data after all casting and transformations have been applied. It's the "clean" version of your input that you should use in the rest of your application (e.g., to save to a database).
The errors Object
This is your primary tool for handling validation failures.
- It's a Map, Not an Array: The errors object is a map where keys are the field names that failed. This allows you to instantly check if a specific field has an error: if (errors.age) { ... }.
- Rich Error Structure: Each error in the map is a detailed object: { code, message, params }.
- Nested paths stay flat: Nested fields are reported with dotted paths such as workspace.slug or roles.2.id. That keeps the external error contract simple even when schemas are recursive.
Let's look at an example with invalid data:
const invalidInput = {
username: 'Al', // Fails 'minLength: 3'
// email is missing, fails 'required: true'
age: 16 // Fails 'min: 18'
};
const { validatedObject, errors } = userSchema.create(invalidInput);
console.log(JSON.stringify(errors, null, 2));
The output would look like this:
{
"username": {
"field": "username",
"code": "MIN_LENGTH",
"message": "Length must be at least 3 characters.",
"params": { "min": 3, "actual": 2 }
},
"email": {
"field": "email",
"code": "REQUIRED",
"message": "Field is required",
"params": {}
},
"age": {
"field": "age",
"code": "MIN_VALUE",
"message": "Value must be at least 18.",
"params": { "min": 18, "actual": 16 }
}
}
- code: A stable, machine-readable string. Use this in your code for logic (if (err.code === 'MIN_LENGTH')).
- message: A human-readable message, great for developers or for displaying directly to users in simple cases.
- params: Extra context about the failure. This is incredibly useful for creating dynamic error messages (e.g., "You entered 2 characters, but a minimum of 3 is required.").
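To make that params usage concrete, here is a minimal sketch of a message formatter built on the documented { code, message, params } shape. The template table and formatError helper are illustrative names, not part of json-rest-schema:

```javascript
// Hypothetical per-code message templates keyed by the stable error codes.
// Not part of json-rest-schema itself; a sketch of the intended usage.
const templates = {
  MIN_LENGTH: p => `You entered ${p.actual} characters, but a minimum of ${p.min} is required.`,
  MIN_VALUE: p => `Value must be at least ${p.min} (got ${p.actual}).`,
  REQUIRED: () => 'This field is required.'
}

function formatError (err) {
  const template = templates[err.code]
  // Fall back to the library-provided message for codes we have no template for
  return template ? template(err.params) : err.message
}

console.log(formatError({
  code: 'MIN_LENGTH',
  message: 'Length must be at least 3 characters.',
  params: { min: 3, actual: 2 }
}))
// → You entered 2 characters, but a minimum of 3 is required.
```

Because code is stable and params carries the raw numbers, this kind of formatter is also a natural place to plug in localization.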
Error helper utilities
If you want a few adapter-friendly utilities around that flat error map, import them directly:
import { createSchema, getError, hasError, nestErrors, flattenErrors } from 'json-rest-schema'
getError(errors, path) reads one dotted-path error:
const slugError = getError(errors, 'workspace.slug')
hasError(errors, path) is the small boolean version:
const showSlugError = hasError(errors, 'workspace.slug')
nestErrors(errors) converts the flat map into a nested object/array shape for form libraries that prefer nested field errors:
nestErrors({
'workspace.slug': {
field: 'workspace.slug',
code: 'MIN_LENGTH',
message: 'Length must be at least 3 characters.',
params: { min: 3, actual: 1 }
},
'roles.2.label': {
field: 'roles.2.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
})
Result:
{
workspace: {
slug: {
field: 'workspace.slug',
code: 'MIN_LENGTH',
message: 'Length must be at least 3 characters.',
params: { min: 3, actual: 1 }
}
},
roles: [
,
,
{
label: {
field: 'roles.2.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
]
}
That keeps the runtime contract flat, while still making it easy to adapt into nested UI-state libraries.
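The nesting idea itself is simple: split each dotted path, and treat numeric segments as array indices so roles.2.label lands at index 2 of a sparse array. A simplified sketch of that idea (not the library's actual nestErrors implementation):

```javascript
// Simplified illustration of flat-to-nested error conversion.
// Numeric path segments become array indices; everything else becomes keys.
function nestFlatErrors (flat) {
  const root = {}
  for (const [path, error] of Object.entries(flat)) {
    const segments = path.split('.')
    let node = root
    segments.forEach((segment, i) => {
      if (i === segments.length - 1) {
        node[segment] = error
        return
      }
      // Peek at the next segment to decide between an array and an object
      const nextIsIndex = /^\d+$/.test(segments[i + 1])
      if (node[segment] === undefined) node[segment] = nextIsIndex ? [] : {}
      node = node[segment]
    })
  }
  return root
}

const nested = nestFlatErrors({
  'roles.2.label': { code: 'REQUIRED', message: 'Field is required', params: {} }
})
// nested.roles is a sparse array whose index 2 holds { label: { ... } }
```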
flattenErrors(nestedErrors) does the reverse when a UI layer gives you nested field errors and you want to normalize them back into the library's flat contract:
flattenErrors({
workspace: {
slug: {
field: 'workspace.slug',
code: 'MIN_LENGTH',
message: 'Length must be at least 3 characters.',
params: { min: 3, actual: 1 }
}
},
roles: [
,
,
{
label: {
field: 'roles.2.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
]
})
Result:
{
'workspace.slug': {
field: 'workspace.slug',
code: 'MIN_LENGTH',
message: 'Length must be at least 3 characters.',
params: { min: 3, actual: 1 }
},
'roles.2.label': {
field: 'roles.2.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
Operation Contracts
For explicit write semantics, the schema instance exposes three synchronous built-in operation methods:
userSchema.create(input);
userSchema.replace(input);
userSchema.patch(input);
They all return the same { validatedObject, errors } shape, but they differ in how omitted fields are treated:
- create(): validates a create payload, enforces required, applies defaultTo, and leaves omitted optional fields omitted.
- replace(): validates a full replacement payload, enforces required, applies defaultTo, and preserves omitted fields.
- patch(): validates only explicitly provided fields and returns only the normalized fields that were touched.
These are built-in named operation contracts, not special cases in the engine. If you need a different contract, define a custom operation and the schema will generate a matching method alias automatically.
Worked operation example
This is easier to understand with one schema and three calls:
const profileSchema = createSchema({
username: { type: 'string', required: true },
bio: { type: 'string' },
role: { type: 'string', defaultTo: 'member' }
})
Calling create():
profileSchema.create({
username: ' alex '
})
Result:
{
validatedObject: {
username: 'alex',
role: 'member'
},
errors: {}
}
Calling replace() with the same payload:
profileSchema.replace({
username: ' alex '
})
Result:
{
validatedObject: {
username: 'alex',
role: 'member'
},
errors: {}
}
Calling patch() with the same payload:
profileSchema.patch({
username: ' alex '
})
Result:
{
validatedObject: {
username: 'alex'
},
errors: {}
}
That difference is the whole point of operation contracts:
- create() and replace() walk the schema as a contract for the whole object.
- patch() walks only the fields the caller actually touched.
- Defaults apply on operations that opt into defaults.
- Missing required fields are only errors on operations that opt into required checks.
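Those four rules can be pictured as a single walk driven by a small contract descriptor. The sketch below mirrors the documented behavior of the three built-in operations; it is a conceptual illustration, not the library's internal engine, and the run helper and contracts table are made-up names:

```javascript
// Conceptual sketch: one field walk, parameterized by an operation contract.
const contracts = {
  create: { enforceRequired: true, applyDefaults: true, targetFields: 'schema' },
  replace: { enforceRequired: true, applyDefaults: true, targetFields: 'schema' },
  patch: { enforceRequired: false, applyDefaults: false, targetFields: 'input' }
}

function run (fields, input, operation) {
  const contract = contracts[operation]
  const validatedObject = {}
  const errors = {}
  for (const [name, def] of Object.entries(fields)) {
    const provided = Object.prototype.hasOwnProperty.call(input, name)
    // patch-style contracts skip fields the caller never touched
    if (!provided && contract.targetFields === 'input') continue
    if (!provided) {
      if (def.required && contract.enforceRequired) {
        errors[name] = { code: 'REQUIRED', message: 'Field is required', params: {} }
      } else if ('defaultTo' in def && contract.applyDefaults) {
        validatedObject[name] = def.defaultTo
      }
      continue
    }
    // stand-in for real casting/transformation: trim strings
    validatedObject[name] = typeof input[name] === 'string' ? input[name].trim() : input[name]
  }
  return { validatedObject, errors }
}

const fields = {
  username: { type: 'string', required: true },
  role: { type: 'string', defaultTo: 'member' }
}
run(fields, { username: ' alex ' }, 'create')
// → { validatedObject: { username: 'alex', role: 'member' }, errors: {} }
run(fields, { username: ' alex ' }, 'patch')
// → { validatedObject: { username: 'alex' }, errors: {} }
```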
Custom Operations
Custom operations are declared when you create the schema.
const userSchema = createSchema({
id: { type: 'id', required: true },
email: { type: 'string', required: true },
role: { type: 'string', defaultTo: 'guest' }
}, {
operations: {
upsert: {
targetFields: 'schema',
enforceRequired: false,
applyDefaults: true,
outputFields: 'validated'
}
}
})
const result = userSchema.upsert({ email: '[email protected]' })
const sameResult = userSchema.validateWith('upsert', { email: '[email protected]' })
Operation aliases are generated automatically from the operation registry, so create, replace, and patch keep working exactly as before. If you intentionally redefine one of those names in operations, the built-in behavior is replaced for that schema instance.
Supported operation descriptor keys:
| Key | Allowed Values | Meaning |
|---|---|---|
| targetFields | 'schema' or 'input' | Which fields are validated. 'schema' walks the schema definition, 'input' only validates explicitly provided fields. |
| enforceRequired | true or false | Whether missing required fields produce errors. |
| applyDefaults | true or false | Whether omitted fields with defaultTo are materialized into the result. |
| outputFields | 'validated' or 'input' | Which field set is considered when building validatedObject. 'validated' follows schema fields, 'input' follows only explicitly provided fields. |
| rejectExplicitUndefined | true or false | Whether an explicitly provided undefined value is treated as a type error. Defaults to true. |
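The rejectExplicitUndefined flag relies on a distinction that plain property reads cannot see: a key that was omitted versus a key explicitly set to undefined. A minimal sketch of that distinction in plain JavaScript:

```javascript
// An omitted key versus a key explicitly set to undefined.
const omitted = {}
const explicit = { age: undefined }

console.log('age' in omitted)   // false — the caller never provided age
console.log('age' in explicit)  // true — the caller sent age: undefined
console.log(omitted.age === explicit.age) // true — a naive read cannot tell them apart
```

That is why the flag exists: only a key-presence check can treat an explicit undefined as a deliberate (and rejectable) value.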
Method names are generated automatically from operation names. Names that already exist on Schema are reserved and rejected. In practice that means names such as validateWith, toJsonSchema, getFieldDefinitions, getFieldDefinition, getFieldMessages, and cleanup cannot be used as operation aliases.
Field Introspection
Schema instances also expose three read-only introspection helpers:
- schema.getFieldDefinitions() returns a frozen snapshot map of the top-level field definitions.
- schema.getFieldDefinition(path) resolves one field definition by dotted path, including nested object fields and numeric array segments such as roles.0.id, and returns it as a frozen snapshot.
- schema.getFieldMessages(path) returns the field's messages object as a frozen snapshot, or {} when none exist.
These helpers are intentionally inspection-only. They clone the schema metadata they expose so adapter code can read field settings without gaining a back door to mutate runtime validation behavior.
Example:
const roleSchema = createSchema({
id: { type: 'string', required: true }
})
const teamSchema = createSchema({
name: { type: 'string', required: true },
roles: {
type: 'array',
items: roleSchema
}
})
const topLevelDefinitions = teamSchema.getFieldDefinitions()
const roleIdDefinition = teamSchema.getFieldDefinition('roles.0.id')
const roleIdMessages = teamSchema.getFieldMessages('roles.0.id')
In that example:
- topLevelDefinitions contains snapshots for name and roles
- roleIdDefinition resolves through the array item schema to the nested id field
- roleIdMessages returns {} because that field did not define a messages object
Treat the returned objects as metadata for rendering and adapter logic, not as something to mutate.
Nested Object, Array, and Object-Bag Contracts
This library now supports the three nested contract shapes that come up constantly in shared REST payloads, without turning into a generic schema engine:
- Nested object fields with type: 'object' and schema
- Nested array items with type: 'array' and items
- Opaque object bags with type: 'object' and additionalProperties: true
The important design rule is that these are still application contracts, not arbitrary JSON Schema fragments.
Nested object fields
Use a child Schema instance when a field should itself be validated as an object.
const workspaceSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true },
ownerUserId: { type: 'id', required: true }
})
const workspaceSettingsSchema = createSchema({
invitesEnabled: { type: 'boolean', required: true }
})
const workspaceViewSchema = createSchema({
workspace: {
type: 'object',
required: true,
schema: workspaceSummarySchema
},
settings: {
type: 'object',
required: true,
schema: workspaceSettingsSchema
}
})
How nested object fields behave:
- The nested schema inherits the parent operation contract: create() on the parent runs create-style rules inside the child, and patch() on the parent runs patch-style rules inside the child.
- Errors are reported with dotted paths such as workspace.slug.
- Unknown nested keys are rejected because child schemas are strict by default, just like top-level schemas.
That operation inheritance is deliberate. A nested object inside a patch payload is usually itself a patch payload.
Worked nested object example
Using the schema above:
const result = workspaceViewSchema.create({
workspace: {
id: '42',
slug: ' main-workspace ',
extra: true
},
settings: {}
})
validatedObject becomes:
{
workspace: {
id: 42,
slug: 'main-workspace'
},
settings: {}
}
errors becomes:
{
'workspace.ownerUserId': {
field: 'workspace.ownerUserId',
code: 'REQUIRED',
message: 'Field is required',
params: {}
},
'workspace.extra': {
field: 'workspace.extra',
code: 'FIELD_NOT_ALLOWED',
message: 'Field not allowed',
params: {}
},
'settings.invitesEnabled': {
field: 'settings.invitesEnabled',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
Now compare that with a nested patch:
workspaceViewSchema.patch({
workspace: {
slug: ' sandbox '
}
})
Result:
{
validatedObject: {
workspace: {
slug: 'sandbox'
}
},
errors: {}
}
Notice what did not happen:
- workspace.id was not required
- workspace.ownerUserId was not required
- no defaults were invented
That is exactly because the child object inherited the parent patch contract.
Nested array items
Use items when every array entry should be validated recursively.
const roleSchema = createSchema({
id: { type: 'string', required: true },
label: { type: 'string', required: true }
})
const roleCatalogSchema = createSchema({
roles: {
type: 'array',
required: true,
items: roleSchema
},
assignableRoleIds: {
type: 'array',
required: true,
items: { type: 'string', minLength: 1 }
}
})
How array items behave:
- Primitive item definitions are validated item-by-item and normalized in place.
- If items is a nested object schema, each item is validated in replace mode.
- Array item errors use indexed dotted paths such as roles.0.label.
That replace rule for object items is intentional. If a client sends the roles array in a patch, they are replacing the array field, so each object item still needs to be complete.
Worked nested array example
const result = roleCatalogSchema.patch({
roles: [
{ id: 'admin' },
{ id: 'editor', label: ' Editor ' }
],
assignableRoleIds: [' owner ', ' ', 123]
})
validatedObject becomes:
{
roles: [
{ id: 'admin' },
{ id: 'editor', label: 'Editor' }
],
assignableRoleIds: ['owner', '', '123']
}
errors becomes:
{
'roles.0.label': {
field: 'roles.0.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
},
'assignableRoleIds.1': {
field: 'assignableRoleIds.1',
code: 'MIN_LENGTH',
message: 'Length must be at least 1 characters.',
params: { min: 1, actual: 0 }
}
}
This example shows both supported array styles:
- roles uses a child Schema instance for structured object items
- assignableRoleIds uses an inline field definition for primitive items
Opaque object bags
If a field needs to be “some object, but not one this library owns”, make that explicit:
const schema = createSchema({
metadata: {
type: 'object',
additionalProperties: true
}
})
That means:
- the value must be a plain object
- keys are not validated
- values pass through untouched
This is the intended escape hatch for metadata bags and adapter-owned payloads. It is deliberately narrow: additionalProperties only supports the literal value true. You can combine it with schema when you want to validate known child fields while still allowing arbitrary passthrough keys.
Typed Object Maps
If you need "an object whose keys are dynamic, but whose values all follow one contract", use values:
const schema = createSchema({
fieldErrors: {
type: 'object',
values: {
type: 'string',
minLength: 1
}
}
})
That means:
- the value must be a plain object
- keys remain dynamic
- every value is validated with the provided field definition
values can point to either:
- an inline field definition such as { type: 'string', minLength: 1 }
- a child Schema instance when every dynamic value should itself be a structured object contract
When values points to a child object schema, each dynamic value is validated in replace mode for the same reason array object items are: each value is treated as a complete object at that key.
Known Fields Plus Passthrough Extras
If you need an object with a few validated child fields but you still want to allow extra keys through unchanged, combine schema with additionalProperties: true:
const detailsSchema = createSchema({
message: { type: 'string', required: true },
fieldErrors: {
type: 'object',
values: { type: 'string', minLength: 1 },
required: false
}
})
const schema = createSchema({
details: {
type: 'object',
schema: detailsSchema,
additionalProperties: true
}
})
That means:
- known child fields are validated and normalized by detailsSchema
- unknown child fields are preserved unchanged
- transport export becomes properties plus additionalProperties: true
Worked example:
const metadataSchema = createSchema({
metadata: {
type: 'object',
additionalProperties: true
}
})Valid input:
metadataSchema.patch({
metadata: {
theme: 'dark',
flags: {
beta: true
}
}
})
Result:
{
validatedObject: {
metadata: {
theme: 'dark',
flags: {
beta: true
}
}
},
errors: {}
}
Invalid input:
metadataSchema.patch({
metadata: ['not-an-object']
})
Result:
{
validatedObject: {
metadata: ['not-an-object']
},
errors: {
metadata: {
field: 'metadata',
code: 'TYPE_CAST_FAILED',
message: 'Value could not be cast to the required type.',
params: {}
}
}
}
That is the intended contract: object-ness is enforced, but the inner bag is not owned by this library.
Dotted path options for nested fields
Because nested errors use dotted paths, the opt-out options do too.
Skip a whole nested field:
workspaceViewSchema.patch({
workspace: {
slug: 'x'
}
}, {
skipFields: ['workspace.slug']
})
Skip a specific nested validator:
workspaceViewSchema.patch({
workspace: {
slug: 'x'
}
}, {
skipParams: {
'workspace.slug': ['minLength']
}
})
This keeps the options model flat and consistent with the error map.
Recursive Schemas
Recursive schema graphs are supported at runtime.
The important distinction is that the library follows the graph of Schema instances, not just one level of nesting. That means a field or array item can point back to the same schema instance, and the runtime will keep validating deeper paths using the same operation rules it would use for any non-recursive nested contract.
The practical setup rule is simple: self-references are usually wired after the first createSchema(...) call, because the variable must exist before another field can point at it.
Self-recursive object and array example
const nodeSchema = createSchema({
id: { type: 'string', required: true },
label: { type: 'string', required: true },
parent: { type: 'object', required: false },
children: { type: 'array', required: false }
})
nodeSchema.structure.parent.schema = nodeSchema
nodeSchema.structure.children.items = nodeSchema
That creates two different recursive edges:
- parent is a nested object field that points back to nodeSchema
- children.items is an array of nodeSchema objects
Recursive runtime semantics
The same rules still apply inside the recursive graph:
- nested object fields such as parent inherit the active operation contract
- array items that are object schemas still use replace semantics
- recursive errors stay in the same flat dotted-path shape as any other nested error
Example:
const patchParent = nodeSchema.patch({
parent: {
label: ' Root '
}
})
const patchChildren = nodeSchema.patch({
children: [
{ label: 'Only child label' }
]
})
patchParent succeeds with:
{
validatedObject: {
parent: {
label: 'Root'
}
},
errors: {}
}
That happens because parent is a nested object field and inherits the outer patch operation.
patchChildren returns:
{
validatedObject: {
children: [
{
label: 'Only child label'
}
]
},
errors: {
'children.0.id': {
field: 'children.0.id',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
}
That happens because array items that point to object schemas are always treated as full replacements.
Recursive paths, introspection, and transport export
Recursive schemas keep the same dotted-path model everywhere else too:
- nodeSchema.getFieldDefinition('children.0.label') resolves correctly
- nodeSchema.validateAt('children.0.label', payload) validates only that selected path
- recursive transport export is graph-aware rather than stack-recursive
toJsonSchema() uses draft-07 definitions plus $ref for recursive nested contracts, and direct self-recursive object fields point back to #. The transport-specific details are covered again in the Transport JSON Schema Export chapter below.
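To picture what that means for the recursive nodeSchema above, a draft-07 export with a direct self-reference might look roughly like this. This is an illustrative sketch of the shape the text describes, not the library's exact toJsonSchema() output:

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "id": { "type": "string" },
    "label": { "type": "string" },
    "parent": { "$ref": "#" },
    "children": {
      "type": "array",
      "items": { "$ref": "#" }
    }
  },
  "required": ["id", "label"]
}
```

The key detail is the "$ref": "#" edges: both the self-recursive object field and the recursive array items point back at the document root instead of inlining the schema forever.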
Path-Scoped Validation for Forms and Interactive UIs
Full-object validation is still the right tool for submit boundaries:
const result = userSchema.create(payload)
But forms often need something narrower:
- validate one field on blur
- validate a small step in a wizard
- normalize only the field the user just touched
- avoid triggering unrelated sibling errors while the user is still editing
That is what validateAt() and validatePaths() are for.
validateAt(path, object, options)
Use validateAt() when you want one path.
const profileSchema = createSchema({
name: { type: 'string', required: true, minLength: 3 },
role: { type: 'string', defaultTo: 'guest' }
})
profileSchema.validateAt('name', {
name: ' Alex '
})
Result:
{
validatedValue: 'Alex',
errors: {}
}
By default, path validation uses patch semantics. That means:
- only the selected path is validated
- missing required siblings do not produce errors
- defaults do not apply unless you explicitly choose an operation that applies them
If you want create-style or replace-style behavior for the exact selected path, pass operation.
profileSchema.validateAt('role', {}, { operation: 'create' })
Result:
{
validatedValue: 'guest',
errors: {}
}If you want required checks for the exact selected field:
profileSchema.validateAt('name', {}, { operation: 'create' })
Result:
{
validatedValue: undefined,
errors: {
name: {
field: 'name',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
}
Nested path example
This is where path-scoped validation becomes most useful.
const workspaceSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true, minLength: 3 },
ownerUserId: { type: 'id', required: true }
})
const workspaceSchema = createSchema({
workspace: {
type: 'object',
required: true,
schema: workspaceSummarySchema
}
})
Validate only workspace.slug:
workspaceSchema.validateAt('workspace.slug', {
workspace: {
slug: ' primary '
}
}, {
operation: 'create'
})
Result:
{
validatedValue: 'primary',
errors: {}
}
Notice what did not happen:
- workspace.id was not required
- workspace.ownerUserId was not required
- unrelated nested keys were not validated
That is the point of the API. It validates the selected path, not the whole object.
If you select the whole object path instead:
workspaceSchema.validateAt('workspace', {
workspace: {
slug: ' primary '
}
}, {
operation: 'create'
})
Result:
{
validatedValue: {
slug: 'primary'
},
errors: {
'workspace.id': {
field: 'workspace.id',
code: 'REQUIRED',
message: 'Field is required',
params: {}
},
'workspace.ownerUserId': {
field: 'workspace.ownerUserId',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
}
That distinction is intentional:
- selecting workspace.slug validates one field
- selecting workspace validates the whole nested object contract
validatePaths(paths, object, options)
Use validatePaths() when you want a subset of fields or a whole form step.
const stepSchema = createSchema({
workspace: {
type: 'object',
schema: workspaceSummarySchema
},
status: { type: 'string', defaultTo: 'draft' }
})
stepSchema.validatePaths([
'workspace.slug',
'status'
], {
workspace: {
slug: ' next '
}
}, {
operation: 'create'
})
Result:
{
validatedObject: {
workspace: {
slug: 'next'
},
status: 'draft'
},
errors: {}
}
This is useful for:
- wizard-step validation
- validating only dirty fields
- validating a form section before moving on
Path options and compatibility
Path-scoped validation supports the same flat nested option model:
workspaceSchema.validatePaths([
'workspace.slug'
], {
workspace: {
slug: 'x'
}
}, {
operation: 'patch',
skipParams: {
'workspace.slug': ['minLength']
}
})
mode also works as compatibility sugar for the built-in operations:
workspaceSchema.validateAt('workspace.slug', values, { mode: 'patch' })
Form integration guidance
These APIs are meant to help form adapters, but the library still does not become a form framework.
Recommended approach:
- keep raw input state in the UI while the user is typing
- use validateAt() or validatePaths() to compute errors and normalized values
- apply full normalization on submit with create(), replace(), or patch()
That matters because aggressive normalization during typing can be annoying:
- trimming on every keypress can move the cursor
- number coercion can fight half-finished input such as "12."
- nested defaults can appear before the user has actually submitted anything
So the intended split is:
- interactive validation: validateAt() / validatePaths()
- submit boundary validation: create() / replace() / patch()
React Hook Form Resolver
If you use React Hook Form, this package now ships a dedicated resolver adapter as a separate subpath export:
import { useForm } from 'react-hook-form'
import { createSchema } from 'json-rest-schema'
import { jsonRestSchemaResolver } from 'json-rest-schema/react-hook-form'
That import path is intentional. The resolver lives outside the main schema engine so the core library does not become React-specific.
Basic usage
const profileSchema = createSchema({
name: { type: 'string', required: true, minLength: 3 },
role: { type: 'string', defaultTo: 'guest' }
})
const form = useForm({
resolver: jsonRestSchemaResolver(profileSchema)
})
By default, the resolver uses create semantics for full-form validation.
That means:
- required fields are enforced
- defaults are applied on successful full-form validation
- the resolver itself returns normalized success values for full-form validation
So if the user submits:
{
name: ' Alex '
}
the resolver will hand React Hook Form a successful value object equivalent to:
{
name: 'Alex',
role: 'guest'
}
One real-world nuance matters here: React Hook Form still owns its internal field state. In practice, that means a successful resolver pass does not always mean your submit handler receives a canonical normalized payload directly from RHF's state.
If you need a final REST-ready payload, run one last schema operation in the submit handler:
const form = useForm({
resolver: jsonRestSchemaResolver(profileSchema)
})
const onSubmit = rawValues => {
const { validatedObject, errors } = profileSchema.create(rawValues)
if (Object.keys(errors).length > 0) return
saveProfile(validatedObject)
}
That split is intentional:
- RHF keeps raw interactive field state
- the schema owns final normalization at the submit boundary
- the UI is free to avoid aggressive value rewriting while the user is typing
Edit forms and custom operations
If the form is editing an existing resource, use a different operation explicitly.
For a patch-style form:
const form = useForm({
resolver: jsonRestSchemaResolver(profileSchema, {
operation: 'patch'
})
})
You can also use any custom operation you have registered on the schema:
const form = useForm({
resolver: jsonRestSchemaResolver(profileSchema, {
operation: 'upsert'
})
})
Field-level re-validation behavior
React Hook Form re-validates one field at a time during user interaction. The resolver uses the core path APIs for that subset validation.
Important behavior:
- only the selected RHF field names are validated during field-level re-validation
- sibling required fields do not leak into a single-field re-validation pass
- by default, field-level re-validation keeps raw form values instead of forcing normalized values back into the UI while the user is typing
That default matters because aggressive normalization during typing can feel bad:
- trimmed strings can move the cursor
- number coercion can fight half-complete input
- defaults can appear before submit
Opting into normalized field-level values
If you explicitly want normalized field values during field-level re-validation, opt in:
const form = useForm({
resolver: jsonRestSchemaResolver(
profileSchema,
{},
{ normalizeOnFieldValidation: true }
)
})
This is opt-in on purpose.
Returning raw values on success
If you want successful resolver results to return raw input values instead of normalized values, use raw: true:
const form = useForm({
resolver: jsonRestSchemaResolver(
profileSchema,
{},
{ raw: true }
)
})
That applies to successful full-form validation too, so defaults and casts are not pushed into the returned values object.
Error shape
React Hook Form requires hierarchical nested errors for deep paths. The resolver converts the library's flat dotted-path errors into the structure RHF expects.
For example, a schema error like:
{
'roles.0.label': {
field: 'roles.0.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
becomes a React Hook Form error shape equivalent to:
{
roles: [
{
label: {
type: 'REQUIRED',
message: 'Field is required'
}
}
]
}
Direct array-field errors are placed under RHF's root key for that field array path.
Native browser validation
The resolver also respects React Hook Form's shouldUseNativeValidation option. If RHF asks for native validation, the adapter sets setCustomValidity() / reportValidity() using the schema error messages.
Vue Form Adapter
If you use Vue, this package now ships a small adapter layer as a separate subpath export:
import { useSchemaForm, useSchemaField } from 'json-rest-schema/vue'
That split is intentional.
- json-rest-schema/vue handles schema-aware form orchestration
- the core schema engine stays framework-agnostic
Just as important: the adapter does not import Vue internally. The composables work with:
- plain objects
- Vue reactive proxies
- Vue refs such as ref({ ... })
That keeps the published package small and avoids turning Vue into a hard dependency of the core runtime.
Basic Vue usage
Use useSchemaForm() when you already own the form values in Vue state.
import { reactive } from 'vue'
import { createSchema } from 'json-rest-schema'
import { useSchemaForm } from 'json-rest-schema/vue'
const profileSchema = createSchema({
name: { type: 'string', required: true, minLength: 3 },
role: { type: 'string', defaultTo: 'guest' }
})
const values = reactive({
name: ''
})
const form = useSchemaForm(profileSchema, {
values
})
If you want Vue to react to adapter-managed error or result updates, pass Vue-owned
containers such as ref({}), reactive({}), or ref(null):
import { reactive, ref } from 'vue'
const values = reactive({
name: ''
})
const errors = ref({})
const lastResult = ref(null)
const form = useSchemaForm(profileSchema, {
values,
errors,
lastResult
})
That keeps reactivity in the Vue app instead of hiding framework state inside the schema library.
Important behavior:
- full-form validation defaults to create semantics
- form.validate() returns the usual { validatedObject, errors }
- form.errors stays in the library's flat dotted-path format
- form.nestedErrors gives you the nested object/array form if your Vue layer prefers it
Running a full validation:
const result = form.validate()
If values is:
{
name: ' Alex '
}
then result will be:
{
validatedObject: {
name: 'Alex',
role: 'guest'
},
errors: {}
}That is the same contract as the core schema engine. The Vue adapter does not invent a second validation format.
Field-level validation in Vue
For blur validation, wizard steps, or one-field re-validation, use the path-aware helpers.
const fieldResult = form.validateField('name')
const stepResult = form.validateFields(['name', 'role'])
This matters because the adapter validates only the selected paths.
That means:
- validating name does not suddenly produce email or password errors
- nested paths such as workspace.slug work the same way as they do in the core APIs
- bracket paths such as roles[0].label are accepted too
If you want a path-focused helper object, use useSchemaField():
const nameField = useSchemaField(form, 'name')
It gives you:
- nameField.value
- nameField.error
- nameField.hasError
- nameField.message
- nameField.messages
- nameField.validate()
- nameField.clearError()
Example:
nameField.validate()
console.log(nameField.messages)
Submit normalization in Vue
The clean submit path is:
const submitProfile = form.submit((validatedObject) => {
return api.saveProfile(validatedObject)
})
submit() always validates first.
If validation fails:
- the handler is not called
- the returned value is the validation result
- form.errors is updated
If validation succeeds:
- the handler receives the normalized validatedObject
- defaults and casts are already applied
This keeps the same intended split as the rest of the library:
- raw values while the user is typing
- normalized values at the submit boundary
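That contract can be sketched in plain JavaScript. The form object below is a stub standing in for useSchemaForm(...); this illustrates submit()'s validate-first behavior, not the adapter's source.

```javascript
// Sketch of the submit() contract: validate first, only call the
// handler when validation passes. Stubbed form -- not the adapter's code.
function makeSubmit(form, handler) {
  return function submit() {
    const result = form.validate();
    if (Object.keys(result.errors).length > 0) {
      return result; // handler is never called on failure
    }
    return handler(result.validatedObject); // normalized values only
  };
}

// Stub standing in for a useSchemaForm(...) instance
const stubForm = {
  validate() {
    return { validatedObject: { name: 'Alex', role: 'guest' }, errors: {} };
  }
};
const save = makeSubmit(stubForm, (validated) => ({ saved: validated }));
```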
Edit forms and custom operations in Vue
If the form is editing an existing resource, choose a different operation explicitly:
const form = useSchemaForm(profileSchema, {
values,
operation: 'patch'
})
You can also use a custom schema operation:
const form = useSchemaForm(profileSchema, {
values,
operation: 'upsert'
})
The adapter routes everything back through the schema operation registry, so custom operations behave the same way here as they do in the core runtime.
If you render those forms with Vuetify, use the separate bridge below. It stays thin on purpose and translates the Vue adapter's existing validation results into Vuetify-friendly props and rule callbacks.
Vuetify Bridge
If you use Vuetify on top of the Vue adapter, import the bridge from its own subpath:
import { createVuetifyRule, fieldProps, getVuetifyErrorMessages } from 'json-rest-schema/vuetify'
This split is intentional:
- json-rest-schema/vue owns schema-aware form orchestration
- json-rest-schema/vuetify translates those results into Vuetify rules and error-messages
- the validation rules still live in the schema layer, not in component glue code
Vuetify rules integration
Vuetify's rules prop is a natural fit for path-scoped validation.
const slugRule = createVuetifyRule(form, 'workspace.slug')
Then bind it to a component:
<v-text-field
v-model="values.workspace.slug"
:rules="[slugRule]"
/>
That rule:
- clones the current form values
- injects the field's current candidate value at the selected path
- runs validateField(path, ...)
- returns either true or the schema error message
So the rule stays a thin bridge. It does not re-implement validation logic.
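The rule's mechanics can be sketched with a stubbed form object. This illustrates the contract (true for valid, a message string for invalid); the real bridge also clones form values and injects the candidate at the path, which the stub below simplifies away.

```javascript
// Sketch: a Vuetify rule callback that delegates to path-scoped validation.
// `form` is a stub standing in for useSchemaForm(...); not the bridge's code.
// (The real rule clones form values and injects the candidate value first.)
function makeRule(form, path) {
  return function rule(candidateValue) {
    const result = form.validateField(path, candidateValue);
    const error = result.errors[path];
    // Vuetify expects `true` for valid, or a message string for invalid.
    return error ? error.message : true;
  };
}

const stubForm = {
  validateField(path, value) {
    if (path === 'workspace.slug' && (!value || value.length < 3)) {
      return { errors: { [path]: { message: 'Too short' } } };
    }
    return { errors: {} };
  }
};
const slugRule = makeRule(stubForm, 'workspace.slug');
```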
Vuetify fieldProps() helper
If you want a compact helper for Vuetify inputs, use fieldProps():
const slugProps = fieldProps(form, 'workspace.slug')
Then:
<v-text-field
v-model="values.workspace.slug"
v-bind="slugProps"
/>
By default, fieldProps() returns only a rules array.
That default is deliberate. Vuetify merges error-messages with rule-generated messages, so returning both by default would duplicate the same message on screen.
If you explicitly want manual error-messages too, opt in:
const slugProps = fieldProps(form, 'workspace.slug', {
includeErrorMessages: true
})
That adds:
- errorMessages
- error
on top of the generated rules.
Manual Vuetify error messages
If you only want the message bridge without generated rules, use getVuetifyErrorMessages() directly:
const messages = getVuetifyErrorMessages(form, 'workspace.slug')
Then:
<v-text-field
v-model="values.workspace.slug"
:error-messages="getVuetifyErrorMessages(form, 'workspace.slug')"
/>
This is useful when:
- you validate on submit instead of on blur/input
- you already ran form.validate() or form.validateField()
- you want Vuetify to display stored schema errors without re-running rules immediately
Worked Vue + Vuetify example
import { reactive } from 'vue'
import { createSchema } from 'json-rest-schema'
import { useSchemaForm, useSchemaField } from 'json-rest-schema/vue'
import { fieldProps } from 'json-rest-schema/vuetify'
const workspaceSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true, minLength: 3 },
ownerUserId: { type: 'id', required: true }
})
const workspaceSchema = createSchema({
workspace: {
type: 'object',
required: true,
schema: workspaceSummarySchema
}
})
const values = reactive({
workspace: {
slug: ''
}
})
const form = useSchemaForm(workspaceSchema, {
values,
operation: 'patch'
})
const slugField = useSchemaField(form, 'workspace.slug')
const slugProps = fieldProps(form, 'workspace.slug')
const saveWorkspace = form.submit(async (validatedObject) => {
await api.saveWorkspace(validatedObject)
})
<template>
<v-form @submit.prevent="saveWorkspace">
<v-text-field
v-model="values.workspace.slug"
label="Workspace slug"
v-bind="slugProps"
@blur="slugField.validate()"
/>
<v-btn type="submit">Save</v-btn>
</v-form>
</template>
That example preserves the intended layering:
- the schema owns normalization and validation
- Vue owns local form state
- Vuetify owns rendering and input UX
- submit handlers own business logic and API calls
Demo Apps and Browser Smoke Tests
This repo now includes two minimal demo apps documented in demos/README.md:
- demos/react-rhf
- demos/vue-vuetify
They alias package imports back to the local source files in this checkout, so they always exercise the current repo state instead of a published npm copy.
What each demo proves:
- demos/react-rhf: the React Hook Form resolver works in a real browser app, and the submit flow can still perform one final canonical schema pass before handing the payload to your API layer.
- demos/vue-vuetify: the Vue and Vuetify adapters work in a real browser app, including visible Vuetify controls, blur validation, and normalized submit output.
To install and run them:
npm run demo:install
Then in separate terminals:
npm run demo:react
npm run demo:vue
Vite will print the local URLs it chose. If the default port is busy, it will pick the next open one automatically.
To run the browser smoke tests:
npx playwright install chromium
npm run test:demos
The Playwright coverage is intentionally small and concrete:
- the React demo validates through the RHF resolver and performs one final canonical schema submit
- the Vue demo validates through the Vue and Vuetify adapters and submits a normalized payload in a real browser runtime
Small troubleshooting notes:
- If a Vuetify control appears blank, make sure you ran the Vue demo's local install step. The demo now declares and imports the Material Design icon font explicitly.
- If Playwright complains about missing browsers, run npx playwright install chromium once from the repo root.
VeeValidate v5 Bridge
VeeValidate v5 accepts Standard Schema-compatible validators as validationSchema.
That means json-rest-schema does not need a heavy VeeValidate-specific runtime adapter. This package now ships a small bridge that wraps a schema instance in the Standard Schema interface VeeValidate already understands.
Import it like this:
import { useForm } from 'vee-validate'
import { createSchema } from 'json-rest-schema'
import { toVeeValidateSchema } from 'json-rest-schema/vee-validate'
Basic usage
const profileSchema = createSchema({
name: { type: 'string', required: true, minLength: 3 },
role: { type: 'string', defaultTo: 'guest' }
})
const { handleSubmit, errors, values } = useForm({
initialValues: {
name: ''
},
validationSchema: toVeeValidateSchema(profileSchema)
})
That default bridge uses create semantics.
So on successful submit:
- required fields are enforced
- normalized values are returned
- defaults are applied to the submitted output
If the user submits:
{
name: ' Alex '
}
then the validated submit payload is equivalent to:
{
name: 'Alex',
role: 'guest'
}
Edit forms and custom operations
If the form is editing an existing resource, pass the operation explicitly:
const { handleSubmit } = useForm({
initialValues,
validationSchema: toVeeValidateSchema(profileSchema, {
operation: 'patch'
})
})
Custom operations work too:
const { handleSubmit } = useForm({
initialValues,
validationSchema: toVeeValidateSchema(profileSchema, {
operation: 'upsert'
})
})
Important VeeValidate limitation: defaults do not initialize form state
This is important enough to say clearly:
- the bridge validates and normalizes the schema output
- VeeValidate still expects you to provide your own initialValues
- schema defaults do not automatically populate the form's starting state
So this is the intended split:
- initialValues controls the raw form state
- toVeeValidateSchema(...) controls validation and normalized submit output
If you want a default field visible in the UI before submit, put it in initialValues.
If you only want the normalized payload to contain the default when the user submits, let the schema apply it.
Error paths
The bridge turns the library's flat error map into Standard Schema issues with nested paths.
So an internal error like:
{
'roles.0.label': {
field: 'roles.0.label',
code: 'REQUIRED',
message: 'Field is required',
params: {}
}
}
becomes Standard Schema issues equivalent to:
[
{
message: 'Field is required',
path: ['roles', 0, 'label']
}
]
That is what lets VeeValidate map nested array/object errors back onto the right field state.
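The path conversion can be sketched as a small helper. This is a simplified illustration grounded in the before/after shapes shown above, not the bridge's source code.

```javascript
// Sketch: convert the library's flat error map into Standard Schema
// issues with nested paths. Illustration only, not the bridge's source.
function toStandardIssues(flatErrors) {
  return Object.values(flatErrors).map((err) => ({
    message: err.message,
    // 'roles.0.label' -> ['roles', 0, 'label']; numeric segments become indices
    path: err.field.split('.').map((seg) => (/^\d+$/.test(seg) ? Number(seg) : seg))
  }));
}

const issues = toStandardIssues({
  'roles.0.label': {
    field: 'roles.0.label',
    code: 'REQUIRED',
    message: 'Field is required',
    params: {}
  }
});
```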
Worked VeeValidate example
import { useForm } from 'vee-validate'
import { createSchema } from 'json-rest-schema'
import { toVeeValidateSchema } from 'json-rest-schema/vee-validate'
const workspaceSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true, minLength: 3 },
ownerUserId: { type: 'id', required: true }
})
const workspaceSchema = createSchema({
workspace: {
type: 'object',
required: true,
schema: workspaceSummarySchema
}
})
const { defineField, handleSubmit, errors } = useForm({
initialValues: {
workspace: {
slug: ''
}
},
validationSchema: toVeeValidateSchema(workspaceSchema, {
operation: 'patch'
})
})
const [slug, slugAttrs] = defineField('workspace.slug')
const saveWorkspace = handleSubmit((validatedObject) => {
return api.saveWorkspace(validatedObject)
})
<template>
<form @submit.prevent="saveWorkspace">
<input v-model="slug" v-bind="slugAttrs">
<span>{{ errors['workspace.slug'] }}</span>
<button type="submit">Save</button>
</form>
</template>
This keeps the responsibilities clean:
- VeeValidate owns touched/dirty/submit orchestration
- json-rest-schema owns validation and normalization
- your submit handler owns business logic
Transport JSON Schema Export
json-rest-schema can also export a transport-facing JSON Schema document for adapters that want pre-handler validation.
const userSchema = createSchema({
id: { type: 'id', required: true },
email: { type: 'string', required: true },
age: { type: 'number', min: 18, defaultTo: 18 },
status: { type: 'string', enum: ['draft', 'published'] }
})
const createTransportSchema = userSchema.toJsonSchema()
const patchTransportSchema = userSchema.toJsonSchema({ operation: 'patch' })
Key points:
- Draft: exports draft-07 JSON Schema.
- Operation-aware: operation: '<name>' controls the required list and whether defaultTo is emitted.
- Compatibility: mode: 'create' | 'replace' | 'patch' still works as shorthand for the built-in operations.
- Transport-facing: the export is intended for JSON/Ajv/Fastify-style request validation, not for reproducing every in-process coercion path.
- Strict field shape: additionalProperties defaults to false because runtime validation rejects unknown fields. Override it with toJsonSchema({ additionalProperties: true }) if needed.
- Nested export: schema-backed nested object contracts are hoisted into draft-07 definitions and referenced with $ref, so repeated and recursive graphs stay finite. Nested object fields inherit the active operation, while object schemas used as array items or object-map values are exported in replace mode.
- Recursive graph support: runtime validation and toJsonSchema() both support self-recursive schema graphs. Direct self-recursive object fields point back to #, while recursive nested contracts are emitted through definitions.
- Opaque bags stay opaque: type: 'object' plus additionalProperties: true exports as a permissive object field and does not invent child property rules.
- Single source of truth: only rules owned by json-rest-schema are exported. External metadata keys from other layers are ignored.
- Passive metadata preserved: schema-owned passive metadata such as precision, scale, unsigned, and temporalPrecision is preserved under x-json-rest-schema.metadata.
- Custom rules: if a custom type or validator needs transport export support, attach a toJsonSchema() hook to the handler. If you register a custom validator without that hook, export fails loudly instead of silently drifting.
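Based on those key points, the create-mode export for the userSchema defined earlier would look roughly like this. This is a hand-written sketch of the expected shape, not captured exporter output; exact keyword placement (and the id field's constraints) may differ in the real export.

```javascript
// Approximate draft-07 export for the userSchema above (sketch only;
// the real exporter's output may differ in detail).
const createShapeSketch = {
  $schema: 'http://json-schema.org/draft-07/schema#',
  type: 'object',
  additionalProperties: false,          // runtime rejects unknown fields
  required: ['id', 'email'],            // create semantics enforce required
  properties: {
    id: { type: 'integer', minimum: 1 },          // assumed shape for 'id'
    email: { type: 'string' },
    age: { type: 'number', minimum: 18, default: 18 },
    status: { type: 'string', enum: ['draft', 'published'] }
  }
};

// Under patch semantics, nothing is required and defaults are not emitted.
const patchShapeSketch = { ...createShapeSketch, required: [] };
```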
Worked nested export example
const schema = createSchema({
workspace: {
type: 'object',
required: true,
schema: workspaceSummarySchema
},
roles: {
type: 'array',
items: roleSchema
},
metadata: {
type: 'object',
additionalProperties: true
}
})
Exporting schema.toJsonSchema() gives you:
- workspace as a $ref to a definition whose required fields still inherit the active operation
- roles.items as a $ref to a nested object definition exported in replace mode
- metadata as { type: 'object', additionalProperties: true }
- object maps as { type: 'object', additionalProperties: <value schema> }
- passthrough nested objects as validated properties plus additionalProperties: true
That means the transport export stays aligned with runtime semantics:
- nested objects behave like nested contracts even when the runtime schema graph is recursive
- array object items behave like complete replacements
- opaque bags stay opaque instead of pretending to be structured
Worked recursive export example
Using the same recursive nodeSchema from the Recursive Schemas chapter above:
const transportSchema = nodeSchema.toJsonSchema()
Key recursive export behaviors:
- a direct self-recursive object field such as parent becomes a reference back to #
- recursive nested contracts reached through array items or dynamic object values are hoisted into definitions
- the exporter stays finite because it tracks schema graph nodes, not just call depth
For the parent field above, the transport shape is:
{
allOf: [
{
$ref: '#'
}
],
'x-json-rest-schema': {
castType: 'object'
}
}
For the children.items edge, the transport shape becomes a $ref into definitions, and that referenced definition points back to itself for deeper children.items recursion.
Common REST Recipes
This section is intentionally practical. These are the shapes you are likely to define in a real API.
Recipe: create payload
const createUserSchema = createSchema({
email: { type: 'string', required: true, notEmpty: true, lowercase: true },
displayName: { type: 'string', required: true, minLength: 2 },
role: { type: 'string', defaultTo: 'member' },
marketingOptIn: { type: 'boolean', defaultTo: false }
})
Use it like this:
const result = createUserSchema.create({
email: ' [email protected] ',
displayName: ' Alex '
})
Result:
{
validatedObject: {
email: '[email protected]',
displayName: 'Alex',
role: 'member',
marketingOptIn: false
},
errors: {}
}
Use this pattern when:
- the client is creating a new resource
- missing required fields should fail
- omitted defaults should be materialized
Recipe: patch payload
Use the same schema, but call patch():
const result = createUserSchema.patch({
displayName: ' Updated Name '
})
Result:
{
validatedObject: {
displayName: 'Updated Name'
},
errors: {}
}
Use this pattern when:
- the client is updating only a subset of fields
- missing required fields should not fail just because they were omitted
- defaults should not be invented during a patch
Recipe: nested detail response
This is a common “show one resource” response shape.
const userSummarySchema = createSchema({
id: { type: 'id', required: true },
email: { type: 'string', required: true }
})
const projectSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true }
})
const projectDetailSchema = createSchema({
project: {
type: 'object',
required: true,
schema: projectSummarySchema
},
owner: {
type: 'object',
required: true,
schema: userSummarySchema
},
permissions: {
type: 'array',
required: true,
items: { type: 'string', minLength: 1 }
}
})
Validate it with create() or replace() depending on your calling style:
const result = projectDetailSchema.create({
project: {
id: '10',
slug: ' api-redesign '
},
owner: {
id: '7',
email: '[email protected]'
},
permissions: ['read', 'write']
})
Result:
{
validatedObject: {
project: {
id: 10,
slug: 'api-redesign'
},
owner: {
id: 7,
email: '[email protected]'
},
permissions: ['read', 'write']
},
errors: {}
}
Recipe: list response envelope
This library validates objects, so for list endpoints the usual pattern is an envelope object instead of a top-level array.
const workspaceSummarySchema = createSchema({
id: { type: 'id', required: true },
slug: { type: 'string', required: true },
ownerUserId: { type: 'id', required: true }
})
const workspaceListSchema = createSchema({
items: {
type: 'array',
required: true,
items: workspaceSummarySchema
},
total: { type: 'integer', required: true, min: 0 }
})
Example:
const result = workspaceListSchema.create({
items: [
{ id: '1', slug: 'alpha', ownerUserId: '7' },
{ id: '2', slug: 'beta', ownerUserId: '9' }
],
total: '2'
})
Result:
{
validatedObject: {
items: [
{ id: 1, slug: 'alpha', ownerUserId: 7 },
{ id: 2, slug: 'beta', ownerUserId: 9 }
],
total: 2
},
errors: {}
}
Recipe: settings or metadata bag
When part of the payload belongs to another layer and should not be field-by-field validated here, use an opaque object bag.
const updatePreferencesSchema = createSchema({
userId: { type: 'id', required: true },
preferences: {
type: 'object',
additionalProperties: true
}
})
Example:
const result = updatePreferencesSchema.patch({
preferences: {
theme: 'dark',
shortcuts: {
save: 'cmd+s'
},
labs: ['new-sidebar']
}
})
Result:
{
validatedObject: {
preferences: {
theme: 'dark',
shortcuts: {
save: 'cmd+s'
},
labs: ['new-sidebar']
}
},
errors: {}
}
Use this only when you intentionally want:
- object-ness to be enforced
- inner keys and values to pass through untouched
- no nested validation contract owned by this library
Recipe: custom operation for an upsert-like boundary
Sometimes you want “validate the whole shape, apply defaults, but do not require every required field.”
const accountSchema = createSchema({
email: { type: 'string', required: true, lowercase: true },
role: { type: 'string', defaultTo: 'member' }
}, {
operations: {
upsert: {
targetFields: 'schema',
enforceRequired: false,
applyDefaults: true,
outputFields: 'validated'
}
}
})
Example:
accountSchema.upsert({})
Result:
{
validatedObject: {
role: 'member'
},
errors: {}
}
This is useful when the persistence layer or surrounding business logic decides whether the resource already exists, and the schema's job is only to normalize a shared contract.
Built-in Rules Reference
Here is a complete list of all types and validators available out of the box.
Built-in Types (Casting Rules)
A field's type defines how the input value will be converted before any other validation rules are run.
| Type Name | Description |
|---|---|
| string | Converts the input to a string. By default, it trims whitespace. Fails if the input is an object or array. |
| number | Converts the input to a finite number. Empty strings, whitespace-only strings, and non-finite values fail validation. |
| integer | Converts the input to a finite integer. Non-integer numeric values fail validation. |
| boolean | Converts the input to a boolean using explicit true/false tokens such as true, false, 1, 0, yes, no, on, and off. Unknown values fail validation. |
| array | Ensures the value is an array. If the input is not already an array, it will be wrapped in one (e.g., 'tag1' becomes ['tag1']). If items is present, every item is validated recursively. |
| id | Parses the value into a positive safe integer identifier. It rejects non-canonical forms such as leading zeroes or strings with junk suffixes. |
| date | Converts a valid date string or timestamp into a Date object normalized to midnight UTC for that calendar day. |
| dateTime | Converts a valid date string or timestamp into a Date object. MySQL-style YYYY-MM-DD HH:MM:SS strings are interpreted as UTC. |
| timestamp | Converts the input to a number, suitable for storing Unix timestamps. |
| time | Converts the input to a normalized HH:MM:SS string. |
| serialize | Converts any JavaScript value (including objects with circular references) into a single JSON-like string using flatted. |
| object | Requires a plain object value. With schema, it becomes a nested object contract. With values, it becomes a typed object map. With additionalProperties: true, it becomes either an opaque pass-through object bag or a passthrough nested object when combined with schema. Without any of those options, it is simply a validated plain object value with no child-field rules. |
| blob | Passes the value through unchanged. Intended for binary data like files that don't need casting. |
| file | Converts primitive file-handle-like values to strings and rejects objects or arrays. |
| none | The "identity" type. Passes the value through completely unchanged without any casting. |
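The boolean token behavior from the table can be restated as a plain function. This is a sketch of the documented tokens, not the library's cast code.

```javascript
// Sketch of the documented boolean tokens; not the library's implementation.
const TRUE_TOKENS = new Set(['true', '1', 'yes', 'on']);
const FALSE_TOKENS = new Set(['false', '0', 'no', 'off']);

function castBoolean(input) {
  if (typeof input === 'boolean') return input;
  const token = String(input).trim().toLowerCase();
  if (TRUE_TOKENS.has(token)) return true;
  if (FALSE_TOKENS.has(token)) return false;
  throw new Error('TYPE_CAST_FAILED'); // unknown tokens fail validation
}
```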
Built-in Validators (Validation Parameters)
Validators are rules that run after a value has been cast to its proper type.
| Parameter | Description |
|---|---|
| required: true | The field must be present in the input object. Fails if the key is undefined. |
| minLength: <number> | For string types, validates the minimum character length. |
| maxLength: <number> | For string types, validates the maximum character length. |
| min: <number> | For number types, validates the minimum value. |
| max: <number> | For number types, validates the maximum value. |
| enum: <array> | Restricts the field to one of the declared values. Exported as a standard JSON Schema enum. |
| notEmpty: true | The field cannot be an empty string (''). This is different from required, as an empty string is still a defined value. |
| length: <number> | For string types, it truncates the string to the specified length. For number types, it throws an error if the number of digits in the original input exceeds the specified length. |
| nullable: true | Allows the value for this field to be null. By default, null is not allowed. |
| nullOnEmpty: true | If the input value is an empty string (''), it will be cast to null before other validators run. |
| lowercase: true | Transforms the string to all lowercase. |
| uppercase: true | Transforms the string to all uppercase. |
| strictBoolean: true | Restricts a boolean field so the original input must already be a real boolean. |
| validator: <function> | Allows you to provide your own synchronous custom validation function for complex, one-off logic. |
| defaultTo: <value> | If the field is not present in the input object, this value will be used in validation modes that apply defaults. Can be a value or a function that returns a value. |
| unsigned: true | Passive schema metadata indicating non-negative numeric storage intent. Preserved in transport export metadata. |
| precision: <number> | Passive schema metadata for decimal total digits. Preserved in transport export metadata. |
| scale: <number> | Passive schema metadata for decimal fractional digits. Preserved in transport export metadata. |
| temporalPrecision: <number> | Passive schema metadata for time or datetime fractional-second precision. Preserved in transport export metadata. |
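To make the ordering of nullOnEmpty, nullable, and notEmpty concrete, here is a minimal pipeline sketch. It illustrates the documented order (empty string becomes null before validators run), not the engine's actual code.

```javascript
// Sketch: nullOnEmpty converts '' to null before validators run,
// and nullable then decides whether null is acceptable.
function runField(value, def) {
  if (def.nullOnEmpty && value === '') value = null;
  if (value === null) {
    if (def.nullable) return { value: null, error: null };
    return { value, error: 'NULL_NOT_ALLOWED' };
  }
  if (def.notEmpty && value === '') return { value, error: 'EMPTY_NOT_ALLOWED' };
  return { value, error: null };
}
```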
Extending the Library: Custom Rules
The real power of the library comes from its extensibility. You can easily add your own reusable types and validators. They must stay synchronous so the schema remains portable across environments. When you do this, you'll be passed a powerful context object.
The context Object
Every custom type and validator handler receives a context object as its only argument. This object is your toolbox, giving you all the information you need to perform complex logic. Here are its properties:
- value: The current value of the field being processed. Be aware that this value may have already been changed by the type handler or a previous validator.
- fieldName: A string containing the name of the field currently being validated (e.g., 'username').
- object: The entire object that is being validated. Its properties reflect the data after any casting or transformations have been applied up to this point. This is useful for cross-field validation.
- valueBeforeCast: The original, raw value for the field, exactly as it was in the input object before any type casting occurred.
- objectBeforeCast: The original, raw input object, before any modifications were made.
- definition: The schema definition object for the current field. For a field defined as { type: 'string', min: 5 }, this would be that exact object.
- parameterName: (For validators only) The name of the validation rule currently being executed (e.g., 'min').
- parameterValue: (For validators only) The value of the validation rule currently being executed (e.g., the 5 in min: 5).
- mode: The active validation contract. Preserved as a compatibility alias for operation.
- operation: The active validation contract name (for example 'create', 'patch', or a custom operation such as 'upsert').
- fieldPresent: A boolean indicating whether the field was explicitly present in the original input object.
- throwTypeError(): A function you can call to throw a standardized TYPE_CAST_FAILED error. This is the preferred way to report an error from within a type handler.
- throwParamError(code, message, params): A function you can call to throw a standardized validation error from within a validator. It accepts a custom error code, a message, and an optional params object.
Creating a Custom Validator
Let's say you frequently need to validate that a field is a URL-friendly "slug" (e.g., my-blog-post).
You can define a new validator once and use it anywhere.
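A slug check along those lines might look like this. Since the reusable-registration API isn't shown in this excerpt, the sketch sticks to the documented inline validator: parameter; the context shape follows the list above, and the slug pattern itself is an illustrative assumption.

```javascript
// Sketch: a slug validator written against the documented context object.
// The field definition below is illustrative, not from the library's docs.
const SLUG_PATTERN = /^[a-z0-9]+(?:-[a-z0-9]+)*$/;

function slugValidator(context) {
  if (!SLUG_PATTERN.test(String(context.value))) {
    context.throwParamError(
      'INVALID_SLUG',
      'Must be a URL-friendly slug like my-blog-post',
      { value: context.value }
    );
  }
}

// Illustrative usage in a field definition:
// slug: { type: 'string', required: true, validator: slugValidator }
```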