@ai-universe/mcp-server-utils v1.3.2

Shared MCP server utilities and TypeScript server launcher for AI Universe services
@ai-universe/mcp-server-utils

Comprehensive utilities for building and operating FastMCP servers in production. The package bundles HTTP proxy helpers, Express middleware wiring, lifecycle management, port discovery, CORS hardening, and ready-to-use tool scaffolding so client teams can focus on business logic instead of server plumbing.

Table of contents

  1. When to use this package
  2. Key capabilities
  3. Installation
  4. Quick start: ship an HTTP-accessible MCP server
  5. Package modules at a glance
  6. HTTP proxy helper
  7. Port discovery utilities
  8. Wildcard-aware CORS configuration
  9. Server factory & agent integration
  10. Server lifecycle management
  11. Tool helper catalog
  12. TypeScript Server Launcher
  13. Bundled Conversation MCP client
  14. Logging, error handling, and security notes
  15. TypeScript reference
  16. Testing your integration
  17. Troubleshooting & FAQs
  18. Requirements
  19. License

When to use this package

Choose @ai-universe/mcp-server-utils whenever you need:

  • A drop-in HTTP proxy so any FastMCP server can be consumed over HTTP(S) without clients speaking STDIO.
  • Safe CORS handling that supports wildcard domains (e.g., https://*.mydomain.com) while preventing regex injection.
  • Predictable port assignment across developer machines and containers.
  • Lifecycle orchestration that wraps startup/shutdown hooks and handles POSIX signals.
  • A library of battle-tested tool helpers for health checks, version reporting, and diagnostics.

Key capabilities

  • TypeScript Server Launcher - Generic framework for launching TypeScript servers locally with automated port management, process control, dependency installation, builds, and watch mode support.
  • HTTP proxy server with Express wiring and JSON HTTP stream transport defaults.
  • Port management with bounded search windows, randomized selection, and error reporting for exhaustion conditions.
  • Process management with PID tracking, graceful termination, port reclamation, and parent process detection.
  • Build orchestration with dependency checking, TypeScript compilation, and custom preparation hooks.
  • CORS configuration that normalizes origins, optionally allows null, and compiles wildcard expressions into safe regex patterns.
  • Server factory with agent/tool registration, initialization hooks, and health-check tooling.
  • Lifecycle manager that standardizes onStart, onStop, and onError workflows and wires graceful shutdown handlers.
  • Tool decorators to add logging and resilient error handling around existing FastMCP tools.
  • Latency tracker utilities for structured per-stage/model observability with pluggable loggers. Verbose latency logging is enabled by default; set MCP_VERBOSE_LATENCY_LOGGING=false to silence detailed metrics.
  • Bundled conversation client re-export so typed conversation helpers are available without pulling a second package.

Installation

npm install @ai-universe/mcp-server-utils fastmcp express cors

New in 1.0.2: The package now bundles the @ai-universe/mcp-conversation-client so you don't need to install it separately to consume conversation tooling helpers. Import it directly from @ai-universe/mcp-server-utils.

Optional: Install zod (or your validation library of choice) if you want schema-based parameter validation like the examples below.

Tip: The package assumes Node.js 18+ so you can rely on the built-in node:http and node:net modules. Install fastmcp 3.16.0 or later for HTTP stream transport support.

Quick start: ship an HTTP-accessible MCP server

import {
  createHealthCheckTool,
  createMCPServer,
  createWildcardCorsOptions,
  startFastMcpHttpProxy,
} from '@ai-universe/mcp-server-utils';

async function main() {
  // 1. Compose tools and agents with the server factory
  const server = await createMCPServer({
    name: 'sample-mcp',
    version: '1.2.0',
    instructions: 'Sample MCP server for demos',
    tools: [createHealthCheckTool('1.2.0')],
    onInitialize: async (mcp) => {
      // custom warm-up logic, logging, or registrations
    },
  });

  // 2. Configure CORS with wildcard origin support
  const corsOptions = createWildcardCorsOptions({
    allowedOrigins: [
      'https://*.mydomain.com',
      'http://localhost:*',
    ],
    allowNullOrigin: true,
  });

  // 3. Expose the FastMCP server via an HTTP proxy
  const { expressServer, mcpPort } = await startFastMcpHttpProxy({
    server,
    listenPort: 8080,
    host: '0.0.0.0',
    corsOptions,
    configureApp: (app) => {
      app.get('/health', (_req, res) => res.json({ status: 'ok' }));
    },
  });

  console.log(`MCP transport bound to localhost:${mcpPort}`);
  console.log('HTTP proxy listening on http://0.0.0.0:8080');

  // 4. Optional: wire graceful shutdown
  process.on('SIGINT', async () => {
    expressServer.close();
    await server.stop?.();
    process.exit(0);
  });
}

void main().catch((error) => {
  console.error(error);
  process.exitCode = 1;
});

This example:

  1. Uses createMCPServer to standardize tool and agent registration.
  2. Builds a strict yet flexible CORS policy with wildcards.
  3. Launches the HTTP proxy, which internally spins up a FastMCP HTTP stream on a reserved loopback port.
  4. Adds a custom Express health endpoint while keeping MCP traffic under /mcp (the default proxy path).

Package modules at a glance

| Module | Purpose | Highlights |
| ------ | ------- | ---------- |
| startFastMcpHttpProxy | Spin up Express + FastMCP HTTP transport | Configurable host/port, pluggable logger, overridable HTTP module for tests. |
| findAvailablePort | Determine a free TCP port | Searches up or down with hard bounds and informative errors. |
| createWildcardCorsOptions | Build wildcard-aware CORS options | Normalizes origins, supports null, compiles regex safely. |
| wildcardToSafeRegex | Convert * patterns into safe regex | Escapes all metacharacters and collapses repeated wildcards. |
| createMCPServer (ServerFactory) | Factory for FastMCP servers | Registers tools/agents, runs async initialization hooks. |
| ServerLifecycleManager | Manage start/stop hooks | Tracks running state, propagates errors, integrates graceful shutdown. |
| ToolHelpers | Ready-made tools & decorators | Echo, version, status, health check tools plus logging/error wrappers. |
| LatencyTracker | Capture and persist per-stage/model metrics | Structured logging hooks, snapshotting, and optional /tmp persistence for offline analysis. |
| ConversationMCPClient (re-export) | Typed MCP client for convo servers | Ships from @ai-universe/mcp-conversation-client for conversation CRUD, replies, and health checks. |

HTTP proxy helper

startFastMcpHttpProxy(options) launches two servers under the hood:

  1. FastMCP HTTP stream – started on the reserved loopback port (defaults to 45000). You can opt in to fallback scanning if the port is already in use.
  2. Express proxy – listens on your specified host/port and forwards /mcp traffic (or any custom proxyPath) to the internal transport.

Options

| Option | Type | Default | Description |
| ------ | ---- | ------- | ----------- |
| server | FastMCP | required | The MCP instance to expose. |
| listenPort | number | required | External Express port. |
| host | string | 'localhost' | Binding host/IP. |
| corsOptions | CorsOptions | undefined | CORS middleware configuration (use createWildcardCorsOptions). |
| proxyPath | string | '/mcp' | HTTP path that proxies to the MCP server. |
| logger | LoggerLike | undefined | Optional logger with info/error methods for startup and failure events. |
| startPort | number | 45000 | Seed port used when fallback scanning is enabled. |
| loopbackPort | number | process.env.FASTMCP_LOOPBACK_PORT \|\| 45000 | Reserved loopback port to attempt first. |
| allowLoopbackFallback | boolean | false | When true, falls back to scanning for a free port if the reserved port is busy. |
| portSearchDirection | 'up' \| 'down' | 'down' | Direction to look for free ports when fallback scanning is active. |
| httpModule | typeof http | node:http | Injectable for testing with undici or mocks. |
| configureApp | (app) => void | undefined | Callback to register custom Express middleware/routes before proxy wiring. |

Return value

A promise resolving to { app, expressServer, mcpPort }, allowing you to add further middleware, close the Express server, or inspect which internal port was allocated.

Error handling

  • Proxy failures emit a 502 JSON response with diagnostic details and log to logger.error if provided.
  • Request stream errors close the upstream proxy and invoke the logger for easier observability.
  • Express server errors bubble via the same logger so you can integrate with Datadog, Cloud Logging, etc.

Port discovery utilities

findAvailablePort(startPort, direction) searches deterministically for an open TCP port.

  • The search is bounded to avoid endless loops: it stops after 1000 attempts or reaching the min/max thresholds.
  • For downward searches, ports lower than 2000 short-circuit with a descriptive error; upward searches cap at startPort + 1000.
  • Ports are verified by opening a transient node:net server; a 10 ms delay ensures the OS releases the port before reuse.

Use it directly when orchestrating multiple MCP servers or when deploying inside container orchestrators that need deterministic port windows.
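For illustration, the probe-and-search behavior described above can be sketched with node:net alone. This is a simplified sketch of the idea, not the package's implementation, and isPortFree/findFreePortDown are hypothetical names:

```typescript
import net from 'node:net';

// Sketch: bind a transient server to the candidate port; if binding succeeds,
// the port is free. This mirrors the probe described above, not the package source.
function isPortFree(port: number): Promise<boolean> {
  return new Promise((resolve) => {
    const probe = net.createServer();
    probe.once('error', () => resolve(false)); // EADDRINUSE and friends mean busy
    probe.once('listening', () => probe.close(() => resolve(true)));
    probe.listen(port, '127.0.0.1');
  });
}

// Bounded downward search with a hard floor, as the bullets above describe.
async function findFreePortDown(startPort: number, floor = 2000): Promise<number> {
  for (let port = startPort; port >= floor; port--) {
    if (await isPortFree(port)) return port;
  }
  throw new Error(`No free port found between ${floor} and ${startPort}`);
}
```

The hard floor and decrementing loop are what keep the search deterministic and bounded rather than scanning the whole port space.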

Wildcard-aware CORS configuration

createWildcardCorsOptions(params) produces a cors middleware configuration with security-focused defaults.

  • Origins are trimmed and deduplicated, with trailing slashes removed for consistent comparisons.
  • Origins with * convert to safe regex via wildcardToSafeRegex, which escapes all regex metacharacters and collapses repeated wildcards to mitigate ReDoS risks.
  • Supports explicit method/header lists, credentials: true, caching (maxAge), and custom optionsSuccessStatus for legacy browsers.
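As a sketch of the escaping strategy (illustrative only; the library's exact output may differ, and wildcardToSafeRegexSketch is a hypothetical stand-in):

```typescript
// Sketch of the wildcard-to-regex idea described above: collapse repeated
// wildcards first (ReDoS guard), escape every regex metacharacter, then
// expand '*' last. In this sketch '*' matches a single dot-free segment;
// the real helper may be more permissive.
function wildcardToSafeRegexSketch(pattern: string): RegExp {
  const collapsed = pattern.replace(/\*+/g, '*');
  const escaped = collapsed.replace(/[.+?^${}()|[\]\\]/g, '\\$&');
  return new RegExp(`^${escaped.replace(/\*/g, '[^.]*')}$`);
}

const origin = wildcardToSafeRegexSketch('https://*.example.com');
origin.test('https://app.example.com');          // matches a subdomain
origin.test('https://evil.com/?x=.example.com'); // blocked: the dot is escaped
```

Escaping before expansion is the important ordering: an unescaped `.` in a user-supplied origin would otherwise match any character.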

Example: lock down staging origins

const corsOptions = createWildcardCorsOptions({
  allowedOrigins: [
    'https://*.staging.example.com',
    'https://portal.example.com',
  ],
  allowNullOrigin: false,
  methods: ['GET', 'POST'],
  allowedHeaders: ['Content-Type', 'X-Session-Id'],
});

Server factory & agent integration

createMCPServer(options) provides a declarative way to assemble your FastMCP server.

  • Accepts tools arrays directly; each tool is registered in the order provided.
  • Supports agents implementing a simple register(server) contract so teams can encapsulate their own tool bundles.
  • onInitialize allows asynchronous setup (e.g., database connections, vector store warm-up) after registration but before exposure.
  • Ships with createHealthCheckTool(version, additionalInfo) for standardized health responses including ISO timestamps and version metadata.

Agent interface sketch

class WeatherAgent implements Agent {
  constructor(private readonly tool: FastMCP.Tool) {}

  async register(server: FastMCP) {
    server.addTool(this.tool);
  }
}

Server lifecycle management

Use ServerLifecycleManager to coordinate startup/shutdown logic and ensure hooks are respected.

import { ServerLifecycleManager, setupGracefulShutdown } from '@ai-universe/mcp-server-utils';

const lifecycle = new ServerLifecycleManager(server, {
  onStart: async (svc) => svc.start({ transportType: 'stdio' }),
  onStop: async () => console.log('Stopped!'),
  onError: async (error) => console.error('Lifecycle error', error),
});

await lifecycle.start();
setupGracefulShutdown(lifecycle);

  • start() and stop() enforce idempotency, throwing if you double-start and short-circuiting if you double-stop.
  • Hooks execute sequentially and any thrown error is routed through onError before rethrow, letting you centralize alerting.
  • setupGracefulShutdown listens for SIGINT, SIGTERM, and SIGQUIT, logging progress and exiting with the proper code based on shutdown success.
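The signal wiring can be sketched as follows. This illustrates the behavior listed above; wireGracefulShutdown and its stop callback are stand-ins, not the exported API:

```typescript
// Sketch of graceful-shutdown wiring: one handler per signal, exit code 0 on a
// clean stop and 1 if shutdown itself throws. `stop` stands in for lifecycle.stop().
function wireGracefulShutdown(stop: () => Promise<void>): void {
  for (const signal of ['SIGINT', 'SIGTERM', 'SIGQUIT'] as const) {
    process.once(signal, async () => {
      try {
        await stop();
        process.exit(0);
      } catch (error) {
        console.error('Shutdown failed', error);
        process.exit(1);
      }
    });
  }
}
```

Using process.once rather than process.on keeps a second Ctrl-C from re-entering the shutdown path.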

TypeScript Server Launcher

The TypeScriptServerLauncher is a generic framework for launching TypeScript servers locally, with comprehensive automation and reliability features. It is well suited to development environments and CI/CD pipelines.

Features

  • Smart Port Management: Preferred port with automatic fallback, randomized search to avoid thundering herd, and forced port reclamation
  • Process Control: PID file tracking, graceful/forceful termination, parent process detection, and orphan cleanup
  • Build Automation: Dependency checking, TypeScript compilation, custom prepare scripts, and shared library preparation
  • Development Workflow: Watch mode with auto-restart, production mode for testing, retry logic for port conflicts (EADDRINUSE)
  • Lifecycle Hooks: Custom callbacks at build, start, and exit events

Quick Example

import { TypeScriptServerLauncher } from '@ai-universe/mcp-server-utils';

const launcher = new TypeScriptServerLauncher({
  projectRoot: '/path/to/project',
  serverDir: 'backend',
  devCommand: ['npm', 'run', 'dev'],
  startCommand: ['npm', 'run', 'start'],
  preferredPort: 2000,
  watchMode: true,
  onServerStarted: async (port, pid) => {
    console.log(`✅ Server started on port ${port} with PID ${pid}`);
  }
});

await launcher.launch();

Configuration Options

interface ServerLauncherConfig {
  // Required
  serverDir: string;              // Server directory (e.g., 'backend')
  devCommand: string[];           // Dev mode command (e.g., ['npm', 'run', 'dev'])
  startCommand: string[];         // Production command (e.g., ['npm', 'run', 'start'])

  // Optional - Paths
  projectRoot?: string;           // Project root (defaults to process.cwd())
  pidFile?: string;               // PID file path (defaults to '.local-server.pid')
  prepareScriptPath?: string;     // Custom prepare script

  // Optional - Port Configuration
  preferredPort?: number;         // Preferred port (defaults to 2000)
  minPort?: number;               // Min port for fallback (defaults to 2000)
  maxPort?: number;               // Max port for fallback (defaults to 2500)
  forcePort?: boolean;            // Force reclaim preferred port (defaults to false)

  // Optional - Build Configuration
  buildCommand?: string[];        // Build command (defaults to ['npm', 'run', 'build'])
  skipDependencyInstall?: boolean;// Skip npm install (defaults to false)
  skipBuild?: boolean;            // Skip build step (defaults to false)

  // Optional - Runtime
  watchMode?: boolean;            // Enable watch mode (defaults to true)
  env?: Record<string, string>;   // Custom environment variables
  maxRetries?: number;            // Max retries for EADDRINUSE (defaults to 3)

  // Optional - Lifecycle Hooks
  onBeforeBuild?: () => Promise<void> | void;
  onAfterBuild?: () => Promise<void> | void;
  onServerStarted?: (port: number, pid: number) => Promise<void> | void;
  onServerExit?: (code: number | null) => Promise<void> | void;
}

Usage Patterns

Basic Development Server

const launcher = new TypeScriptServerLauncher({
  projectRoot: __dirname,
  serverDir: 'backend',
  devCommand: ['npm', 'run', 'dev'],
  startCommand: ['npm', 'run', 'start'],
  watchMode: true
});

await launcher.launch();

Production-like Testing

const launcher = new TypeScriptServerLauncher({
  serverDir: 'backend',
  devCommand: ['npm', 'run', 'dev'],
  startCommand: ['npm', 'run', 'start'],
  watchMode: false,  // Disable auto-restart
  preferredPort: 8080,
  env: {
    NODE_ENV: 'production',
    LOG_LEVEL: 'info'
  }
});

await launcher.launch();

Custom Build Pipeline

const launcher = new TypeScriptServerLauncher({
  serverDir: 'backend',
  devCommand: ['npm', 'run', 'dev'],
  startCommand: ['npm', 'run', 'start'],
  prepareScriptPath: 'scripts/prepare-monorepo.mjs',
  onBeforeBuild: async () => {
    console.log('🔨 Running custom pre-build tasks...');
    // Custom validation, code generation, etc.
  },
  onAfterBuild: async () => {
    console.log('✅ Build completed, running post-build tasks...');
    // Asset optimization, cache warming, etc.
  }
});

await launcher.launch();

Port Conflict Handling

const launcher = new TypeScriptServerLauncher({
  serverDir: 'backend',
  devCommand: ['npm', 'run', 'dev'],
  startCommand: ['npm', 'run', 'start'],
  preferredPort: 2000,
  forcePort: true,  // Kill existing processes on port 2000
  maxRetries: 5,    // Retry up to 5 times on EADDRINUSE
  onServerStarted: async (port, pid) => {
    if (port !== 2000) {
      console.warn(`⚠️  Using fallback port ${port} instead of 2000`);
    }
  }
});

await launcher.launch();

Standalone Utilities

The launcher is built on modular utilities that can be used independently:

Enhanced Port Finder

import { findAvailablePort, isPortAvailable } from '@ai-universe/mcp-server-utils';

// Randomized search in range (avoids thundering herd)
const port = await findAvailablePort({
  minPort: 2000,
  maxPort: 2500
});

// Check specific port
const available = await isPortAvailable(2000);

Process Manager

import {
  terminateProcessByPid,
  reclaimPort,
  readPidFile,
  writePidFile
} from '@ai-universe/mcp-server-utils';

// Graceful/forceful termination
await terminateProcessByPid(12345);

// Reclaim port by killing processes
await reclaimPort(2000);

// PID file management
writePidFile('/tmp/server.pid', process.pid);
const pid = readPidFile('/tmp/server.pid');
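PID-file handling itself is simple enough to sketch with node:fs. This is a minimal illustration under stated assumptions; the package's helpers add more validation, and the Sketch-suffixed names are hypothetical:

```typescript
import { existsSync, readFileSync, writeFileSync } from 'node:fs';

// Sketch: store the PID as plain text so other tooling can read it.
function writePidSketch(path: string, pid: number): void {
  writeFileSync(path, String(pid), 'utf8');
}

// Sketch: return the recorded PID, or null if the file is missing or corrupt.
function readPidSketch(path: string): number | null {
  if (!existsSync(path)) return null;
  const pid = Number.parseInt(readFileSync(path, 'utf8').trim(), 10);
  return Number.isNaN(pid) ? null : pid;
}
```

Returning null for a missing or unparsable file lets callers treat "no tracked process" and "stale file" the same way.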

Build Orchestrator

import {
  checkAndInstallDependencies,
  buildTypeScript,
  runCustomPrepareScript
} from '@ai-universe/mcp-server-utils';

// Ensure dependencies are current
checkAndInstallDependencies('/path/to/backend');

// Build TypeScript project
buildTypeScript('/path/to/backend', ['npm', 'run', 'build']);

// Run custom preparation
runCustomPrepareScript('/path/to/script.mjs', '/path/to/workdir');

Environment Variables

The launcher respects these environment variables (can be overridden in config):

  • RUN_LOCAL_SERVER_PREFERRED_PORT - Override preferred port
  • RUN_LOCAL_SERVER_MIN_PORT - Override minimum port for fallback
  • RUN_LOCAL_SERVER_MAX_PORT - Override maximum port for fallback
  • CORS_ALLOWED_ORIGINS - CORS configuration for server
  • FASTMCP_LOOPBACK_PORT - MCP loopback port (if applicable)

Integration Example (AI Universe)

See scripts/run-local-server.ts for a real-world example of integrating TypeScriptServerLauncher with:

  • Shared library preparation (prepare-shared-libs.mjs)
  • Custom logging and lifecycle hooks
  • Environment-based configuration
  • macOS Terminal integration via wrapper script

Error Handling

The launcher includes comprehensive error handling:

  • Port Conflicts: Automatic retry with new port selection
  • Build Failures: Clear error messages with exit codes
  • Process Crashes: Graceful shutdown and cleanup
  • Signal Handling: Proper cleanup on SIGINT/SIGTERM

Best Practices

  1. Always use PID files for reliable process tracking across restarts
  2. Enable watch mode in development for fast iteration
  3. Use forcePort sparingly - it kills existing processes
  4. Implement lifecycle hooks for custom logging and monitoring
  5. Test with watchMode: false to verify production behavior

Tool helper catalog

| Helper | Description | Output |
| ------ | ----------- | ------ |
| createEchoTool() | Quick connectivity check; echoes the incoming message with a timestamp. | JSON string { echo, timestamp } |
| createVersionTool(version, info?) | Returns version metadata plus optional extra fields. | Pretty-printed JSON |
| createStatusTool(statusProvider) | Wraps any status provider function and adds timestamps. | Pretty-printed JSON |
| createHealthCheckTool(version, info?) | Standard health endpoint that pairs nicely with HTTP /health. | Pretty-printed JSON |
| withErrorHandling(tool, handler?) | Converts thrown errors into structured JSON payloads with timestamps; a custom handler can map errors to friendly messages. | JSON string { error: true, message, timestamp } |
| withLogging(tool, logger?) | Logs parameters, success duration, and failures using the provided logger or console.log. | Returns original tool result |

Compose them to instrument your own tools without rewriting boilerplate:

import { z } from 'zod';
import { withErrorHandling, withLogging } from '@ai-universe/mcp-server-utils';

const rawTool = {
  name: 'plan-trip',
  description: 'Generate itineraries',
  parameters: z.object({ destination: z.string() }),
  execute: async ({ destination }) => `Planning trip to ${destination}`,
};

const safeTool = withLogging(withErrorHandling(rawTool));
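The decorator pattern behind these wrappers can be sketched as follows. This is a simplified illustration of the { error: true, message, timestamp } contract from the table, not the package source:

```typescript
type ToolLike<P> = {
  name: string;
  description: string;
  execute: (params: P) => Promise<string> | string;
};

// Sketch: wrap execute so thrown errors come back as the structured JSON
// payload described above, instead of propagating to the client.
function withErrorHandlingSketch<P>(tool: ToolLike<P>): ToolLike<P> {
  return {
    ...tool,
    execute: async (params: P) => {
      try {
        return await tool.execute(params);
      } catch (err) {
        return JSON.stringify({
          error: true,
          message: err instanceof Error ? err.message : String(err),
          timestamp: new Date().toISOString(),
        });
      }
    },
  };
}
```

Because the wrapper returns an object with the same shape, decorators like this compose freely, which is why withLogging(withErrorHandling(rawTool)) works above.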

Bundled Conversation MCP client

@ai-universe/mcp-server-utils now re-exports every symbol from @ai-universe/mcp-conversation-client, letting you wire conversation-aware servers without installing a second package. The dependency is version-locked so your MCP tooling and client stay in sync.

import {
  ConversationMCPClient,
  ConversationAuthHandler,
} from '@ai-universe/mcp-server-utils';

const client = new ConversationMCPClient({
  serverUrl: 'https://ai-universe-convo-mcp.example.com',
  auth: new ConversationAuthHandler({
    getAccessToken: async () => process.env.CONVO_TOKEN!,
  }),
});

const { conversations } = await client.listConversations({ userId: 'user-123' });

All helpers, types, and error utilities from the dedicated conversation client package are available through this entrypoint. Existing imports from @ai-universe/mcp-conversation-client continue to work; the re-export simply removes the need for a separate installation step when you already depend on the server utilities bundle.

Logging, error handling, and security notes

  • Provide a logger with info/error when starting the proxy to surface port selection, startup milestones, and runtime failures to your observability stack.
  • Combine withErrorHandling and withLogging to guarantee client-friendly error payloads while still surfacing stack traces in logs.
  • CORS normalization plus wildcardToSafeRegex safeguards against unescaped regex characters in user-provided wildcard origins, mitigating injection vectors.
  • findAvailablePort enforces strict bounds, preventing runaway searches that could degrade host performance.
  • Toggle verbose latency instrumentation globally by setting MCP_VERBOSE_LATENCY_LOGGING=false. The flag defaults to true and takes precedence over the legacy SECOND_OPINION_VERBOSE_LOGGING toggle (which is only consulted when the new variable is unset). Persistence to /tmp/<repo>/<branch>/<flow>-<timestamp>.json artifacts continues regardless of verbosity settings.
  • Call LatencyTracker.logSummary(status, { includeContextKeys: [...] }) to emit the aggregated longest-stage/slowest-model report while allowlisting only non-sensitive context keys for log output.

import { setLatencyTrackerLogger } from '@ai-universe/mcp-server-utils';

setLatencyTrackerLogger(console);
// Set MCP_VERBOSE_LATENCY_LOGGING=false to disable structured latency logs

TypeScript reference

All exports are surfaced through the package root for ergonomic imports.

import {
  startFastMcpHttpProxy,
  findAvailablePort,
  createWildcardCorsOptions,
  wildcardToSafeRegex,
  createMCPServer,
  createHealthCheckTool,
  ServerLifecycleManager,
  setupGracefulShutdown,
  createEchoTool,
  createVersionTool,
  createStatusTool,
  withErrorHandling,
  withLogging,
} from '@ai-universe/mcp-server-utils';

Types such as StartFastMcpHttpProxyOptions, CreateServerOptions, ServerLifecycleHooks, and decorators’ parameter types are all exported for TypeScript projects to compose their own wrappers.

Testing your integration

While this package itself is framework-agnostic, recommended checks include:

  • Unit-test your agent registrations by asserting server.getTool('name') exists after calling createMCPServer.
  • Spin up the HTTP proxy with supertest against the returned Express app to verify routing and CORS responses.
  • Mock the httpModule option with node:http stubs if you need deterministic proxy behavior in unit tests.
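As a self-contained illustration of the HTTP-level check, here is a sketch using node:http in place of supertest; in a real test you would exercise the app returned by startFastMcpHttpProxy, and the stand-in server below only mirrors the /health route from the quick start:

```typescript
import http from 'node:http';

// Stand-in server with the /health route from the quick-start example.
const server = http.createServer((req, res) => {
  if (req.url === '/health') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok' }));
  } else {
    res.writeHead(404).end();
  }
});

// Minimal GET helper that resolves with the status code and raw body.
function getJson(port: number, path: string): Promise<{ status: number; body: string }> {
  return new Promise((resolve, reject) => {
    http.get({ host: '127.0.0.1', port, path }, (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => resolve({ status: res.statusCode ?? 0, body }));
    }).on('error', reject);
  });
}
```

Listening on port 0 in the test lets the OS pick a free port, so the check never collides with a running dev server.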

Troubleshooting & FAQs

The proxy fails with “Bad Gateway”. Ensure the internal FastMCP HTTP stream is reachable. The error payload includes the underlying message; pipe it into your logging pipeline via the provided logger.

My wildcard origin isn’t matching. The helper trims trailing slashes and expects your pattern without them (https://*.example.com, not https://*.example.com/). Verify the incoming origin after Express normalization.

How do I stop the server cleanly in containers? Instantiate ServerLifecycleManager and call setupGracefulShutdown. It captures SIGINT/SIGTERM/SIGQUIT and exits 0 on success, 1 on failure.

Can I use STDIO transport instead of HTTP? Yes—ServerLifecycleManager is transport-agnostic. The HTTP proxy helper is optional; you can still use the factory, lifecycle, and tool helpers when running STDIO-only services.

Requirements

  • Node.js 18+
  • FastMCP 3.16.0+
  • Express 4.18+
  • cors 2.8+

Optional: Bring your own validation library (e.g., zod) for tool schemas if you need runtime parameter validation.

License

MIT

This package is part of the AI Universe project. Refer to the main repository for contribution guidelines.