@llm-dev-ops/connector-hub-core

v0.1.0

Core interfaces, types, and models for LLM Connector Hub

@llm-connector-hub/core

Core interfaces, types, and utilities for the LLM Connector Hub.

Overview

This package provides the foundational building blocks for creating unified LLM provider integrations:

  • Interfaces: Define contracts for providers, middleware, cache, rate limiters, and circuit breakers
  • Models: Type-safe data models for requests, responses, messages, and configurations
  • Errors: Comprehensive error hierarchy with detailed error information
  • Validation: Zod-based schemas for runtime validation

Installation

npm install @llm-connector-hub/core

Features

Core Interfaces

  • IProvider: Unified interface for LLM providers (OpenAI, Anthropic, etc.)
  • IMiddleware: Middleware pipeline for cross-cutting concerns
  • ICache: Cache interface for response caching
  • IRateLimiter: Rate limiting to prevent API quota exhaustion
  • ICircuitBreaker: Circuit breaker pattern for fault tolerance
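To illustrate the kind of contract these interfaces define, here is a minimal in-memory cache sketch. The `ICache` shape below (method names `get`/`set`/`delete` and millisecond TTLs) is an assumption for illustration; the package's actual interface may differ.

```typescript
// Assumed ICache shape -- illustrative only, not the package's real interface.
interface ICache<T = unknown> {
  get(key: string): Promise<T | undefined>;
  set(key: string, value: T, ttlMs?: number): Promise<void>;
  delete(key: string): Promise<void>;
}

// A simple Map-backed implementation with per-entry expiry.
class InMemoryCache<T = unknown> implements ICache<T> {
  private store = new Map<string, { value: T; expiresAt: number }>();

  async get(key: string): Promise<T | undefined> {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      // Expired entries are evicted lazily on read.
      this.store.delete(key);
      return undefined;
    }
    return entry.value;
  }

  async set(key: string, value: T, ttlMs = 60_000): Promise<void> {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  async delete(key: string): Promise<void> {
    this.store.delete(key);
  }
}
```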

Data Models

  • CompletionRequest/Response: Normalized request/response formats
  • Message: Support for text, images, and tool calls
  • Config: Type-safe configuration management
  • Health: Health check and monitoring
  • Metrics: Performance and usage metrics
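A normalized message model that supports text, images, and tool calls is often expressed as a discriminated union. The sketch below shows that pattern; the field and variant names are illustrative assumptions, not the package's actual `Message` type.

```typescript
// Hypothetical message model as a discriminated union -- the real
// Message type in @llm-connector-hub/core may be shaped differently.
type MessagePart =
  | { type: 'text'; text: string }
  | { type: 'image'; url: string }
  | { type: 'tool_call'; name: string; arguments: Record<string, unknown> };

interface MessageShape {
  role: 'system' | 'user' | 'assistant' | 'tool';
  parts: MessagePart[];
}

// Convenience constructor for the common plain-text case.
function textMessage(role: MessageShape['role'], text: string): MessageShape {
  return { role, parts: [{ type: 'text', text }] };
}
```

Narrowing on the `type` field lets consumers handle each part kind exhaustively without casts.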

Error Handling

  • BaseError: Base class with rich error metadata
  • ProviderError: Provider-specific errors with retry logic
  • ValidationError: Field-level validation errors
  • RateLimitError: Rate limit information and retry timing
  • CircuitBreakerError: Circuit breaker state information
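Errors that carry retry metadata make generic retry loops straightforward. The sketch below uses a stand-in `RateLimitishError` class to show the idea; it is not the package's actual `RateLimitError`, whose API may differ.

```typescript
// Stand-in error carrying retry timing -- illustrative, not the real class.
class RateLimitishError extends Error {
  constructor(message: string, public readonly retryAfterMs: number) {
    super(message);
    this.name = 'RateLimitishError';
  }
}

// Retry a function, waiting out rate limits using the error's own timing.
async function withRetry<T>(fn: () => Promise<T>, maxAttempts = 3): Promise<T> {
  for (let attempt = 1; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (err instanceof RateLimitishError && attempt < maxAttempts) {
        await new Promise((resolve) => setTimeout(resolve, err.retryAfterMs));
        continue;
      }
      // Non-retryable error, or attempts exhausted: re-throw to the caller.
      throw err;
    }
  }
}
```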

Validation

Zod-based schemas for all models with:

  • Runtime type validation
  • Detailed error messages
  • Type inference

Usage

Creating a Completion Request

import { CompletionRequestBuilder, MessageBuilder } from '@llm-connector-hub/core';

const request = new CompletionRequestBuilder()
  .withModel('gpt-4')
  .addMessage(MessageBuilder.system('You are a helpful assistant'))
  .addMessage(MessageBuilder.user('Hello!'))
  .withTemperature(0.7)
  .withMaxTokens(100)
  .build();

Implementing a Provider

import { IProvider, CompletionRequest, CompletionResponse } from '@llm-connector-hub/core';

class MyProvider implements IProvider {
  readonly name = 'my-provider';
  readonly version = '1.0.0';
  readonly capabilities = {
    streaming: true,
    functionCalling: true,
    vision: false,
    embeddings: false,
    maxContextWindow: 8192,
    supportedModels: ['model-1', 'model-2'],
  };

  async complete(request: CompletionRequest): Promise<CompletionResponse> {
    // Call the underlying provider API here and map its result
    // to the normalized CompletionResponse format.
    throw new Error('Not implemented');
  }

  // ... other methods
}

Using Validation

import { validateCompletionRequest, ValidationError } from '@llm-connector-hub/core';

try {
  const validRequest = validateCompletionRequest(data);
  // Use validRequest
} catch (error) {
  if (error instanceof ValidationError) {
    console.error('Validation errors:', error.getFieldErrors());
  }
}

Error Handling

import { ProviderError, RateLimitError, CircuitBreakerError } from '@llm-connector-hub/core';

try {
  const response = await provider.complete(request);
} catch (error) {
  if (error instanceof RateLimitError) {
    const retryAfter = error.getRetryAfter();
    console.log(`Rate limited. Retry in ${retryAfter}ms`);
  } else if (error instanceof CircuitBreakerError) {
    console.log(`Circuit breaker open for ${error.getService()}`);
  } else if (error instanceof ProviderError) {
    console.log(`Provider error: ${error.message}`);
  }
}

TypeScript Support

This package is written in TypeScript with strict mode enabled and provides full type definitions.

All interfaces and types are exported for use in your TypeScript projects.

License

MIT