
@filipgorny/ai-connect

v0.0.1

A TypeScript library for connecting to Large Language Models (LLMs) with protocol-based communication and dependency injection support.

Features

  • Multiple LLM Providers: Built-in OpenAI support with an extensible provider system
  • Protocol-Based Communication: Define structured response formats using LlmProtocol
  • Dependency Injection: Built on @filipgorny/di for flexible architecture
  • Type-Safe: Full TypeScript support with decorators and metadata
  • Chat Interface: Simple chat API for conversational interactions
  • Structured Responses: Parse LLM responses into typed fields

Installation

npm install @filipgorny/ai-connect @filipgorny/di reflect-metadata

Quick Start

Basic Chat

import "reflect-metadata";
import { Container } from "@filipgorny/di";
import { Llm, OpenAIProvider } from "@filipgorny/ai-connect";

// Setup DI container
const container = new Container();
container.registerInstance(
  "LlmProvider",
  new OpenAIProvider("your-openai-api-key"),
);
container.register("Llm", Llm);

// Get LLM instance
const llm = container.get<Llm>("Llm");

// Create chat and send message
const chat = llm.createChat();
const response = await chat.message("Hello, how are you?");
console.log(response.message); // e.g. "I'm doing well, thank you for asking!"

Protocol-Based Communication

import { LlmProtocol, ChannelInput } from "@filipgorny/ai-connect";

// Define protocol
const protocol = new LlmProtocol()
  .defineInputField("task", "The task description")
  .defineOutputField("priority", "Task priority: low, medium, high")
  .defineOutputField("deadline", "Suggested deadline");

// Create channel
const channel = llm.createProtocolChannel(protocol);

// Send structured input
const input = new ChannelInput({ task: "Review the quarterly report" });
const response = await channel.send(input);

// Access parsed fields
if (response.valid) {
  console.log("Priority:", response.fields.get("priority"));
  console.log("Deadline:", response.fields.get("deadline"));
}

API Reference

Core Classes

Llm

Main entry point for LLM operations.

  • createChat(): Chat - Create a chat instance
  • createProtocolChannel(protocol: LlmProtocol): LlmCommunicationChannel - Create protocol-based channel

Chat

Simple conversational interface.

  • message(message: string): Promise<ChatResponse> - Send message and get response

ChatResponse

Response from chat interactions.

  • message: string - The response text
  • toString(): string - Returns the message

LlmProtocol

Defines input/output schemas for structured communication.

  • defineInputField(name: string, description: string): this
  • defineOutputField(name: string, description: string): this
  • getInputFields(): InputField[]
  • getOutputFields(): OutputField[]

ChannelInput

Input for protocol channels.

  • constructor(plainObject: Record<string, string>)
  • set(name: string, value: string): void
  • get(name: string): string | undefined
  • toObject(): Record<string, string>

LlmCommunicationChannel

Handles protocol-based LLM communication.

  • send(message: ChannelInput): Promise<ProtocolResponse> - Send input and get structured response

ProtocolResponse

Structured response from protocol channels.

  • valid: boolean - Whether the response was successfully parsed
  • fields: OutputFieldValueCollection - Parsed field values
  • isValid(): boolean - Check if fields are populated

OutputFieldValueCollection

Collection of parsed field values.

  • get(name: string): string | undefined - Get field value
  • getAll(): OutputFieldValue[] - Get all field values

Providers

OpenAIProvider

OpenAI GPT integration.

const provider = new OpenAIProvider("your-api-key");

MockProvider

Mock provider for testing.

const provider = new MockProvider();
provider.addResponse(new ProviderResponse("Mock response"));

Configuration

TypeScript Setup

Ensure your tsconfig.json includes:

{
  "compilerOptions": {
    "experimentalDecorators": true,
    "emitDecoratorMetadata": true,
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  }
}

Dependency Injection

The library uses @filipgorny/di for dependency injection. Register the LlmProvider in your container:

const container = new Container();
container.registerInstance("LlmProvider", new OpenAIProvider("api-key"));
container.register("Llm", Llm);

Testing

Run tests:

npm test

Run OpenAI integration test:

npm run open-ai-test YOUR_API_KEY

Building

npm run build

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Submit a pull request

License

MIT