
@genui-a3/providers · v0.0.5 · 439 downloads

@genui-a3/providers


LLM provider implementations for the A3 agentic framework.

Ships with AWS Bedrock and OpenAI providers out of the box. Both support blocking and streaming modes, model fallback, and structured output via Zod schemas.

Install

```sh
npm install @genui-a3/providers @genui-a3/core
```

@genui-a3/core is a peer dependency — it must be installed alongside this package.

Quick Start

AWS Bedrock

```ts
import { createBedrockProvider } from '@genui-a3/providers/bedrock'

const provider = createBedrockProvider({
  models: ['us.anthropic.claude-sonnet-4-5-20250929-v1:0'],
  region: 'us-east-1', // optional, defaults to AWS SDK default
})
```

OpenAI

```ts
import { createOpenAIProvider } from '@genui-a3/providers/openai'

const provider = createOpenAIProvider({
  models: ['gpt-4o', 'gpt-4o-mini'],
  apiKey: process.env.OPENAI_API_KEY, // optional, defaults to OPENAI_API_KEY env var
})
```

Use with A3

```ts
import { ChatSession, MemorySessionStore } from '@genui-a3/core'

const session = new ChatSession({
  sessionId: 'user-123',
  store: new MemorySessionStore(),
  initialAgentId: 'greeting',
  initialState: {},
  provider, // any provider from above
})

// Blocking
const response = await session.send({ message: 'Hello!' })

// Streaming
for await (const event of session.send({ message: 'Hello!', stream: true })) {
  console.log(event)
}
```

Providers

Bedrock — createBedrockProvider(config)

Communicates with AWS Bedrock via the Converse API.

| Option | Type | Required | Description |
|---|---|---|---|
| models | string[] | Yes | Model IDs in preference order (first = primary, rest = fallbacks) |
| region | string | No | AWS region. Defaults to the AWS SDK default |

Behaviour:

  • Uses tool-based JSON extraction (structuredResponse tool) for reliable structured output
  • Streaming yields text deltas in real-time, then emits a validated tool-call result at the end
  • Merges sequential same-role messages to satisfy Bedrock's alternating-role requirement
  • Prepends an initial user message ("Hi") so the conversation always starts with a user turn

Prerequisites: AWS credentials configured via environment variables, IAM role, or AWS profile — the same setup the AWS SDK expects.
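The role-merging step described above can be sketched roughly as follows. Note that `ChatMessage` and `mergeSameRoleMessages` here are simplified, hypothetical stand-ins for illustration, not the package's actual types or implementation:

```ts
// Simplified stand-in for the real message shape (an assumption,
// not the actual @genui-a3/core type).
interface ChatMessage {
  role: 'user' | 'assistant'
  content: string
}

// Collapse runs of same-role messages into one message each, so the
// resulting list strictly alternates roles as Bedrock's Converse API
// requires.
function mergeSameRoleMessages(messages: ChatMessage[]): ChatMessage[] {
  const merged: ChatMessage[] = []
  for (const msg of messages) {
    const last = merged[merged.length - 1]
    if (last && last.role === msg.role) {
      last.content += '\n' + msg.content
    } else {
      merged.push({ ...msg })
    }
  }
  // Bedrock requires the conversation to open with a user turn.
  if (merged.length > 0 && merged[0].role !== 'user') {
    merged.unshift({ role: 'user', content: 'Hi' })
  }
  return merged
}
```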


OpenAI — createOpenAIProvider(config)

Communicates with the OpenAI Chat Completions API using structured output (response_format: json_schema).

| Option | Type | Required | Description |
|---|---|---|---|
| models | string[] | Yes | Model IDs in preference order (first = primary, rest = fallbacks) |
| apiKey | string | No | API key. Defaults to OPENAI_API_KEY env var |
| baseURL | string | No | Custom base URL for Azure OpenAI or compatible endpoints |
| organization | string | No | OpenAI organization ID |

Behaviour:

  • Uses structured output (response_format with json_schema) — no tool calls required
  • Enforces strict schemas automatically (additionalProperties: false, all properties required)
  • Streaming extracts chatbotMessage text progressively from the JSON response via a character-level state machine, yielding text deltas in real-time
  • Detects truncated responses (finish_reason: length) and surfaces them as errors

Prerequisites: An OpenAI API key, either passed directly or set as OPENAI_API_KEY.
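The strict-schema tightening described above amounts to something like the following. This is a simplified illustration of the general technique, not the package's actual code, and it only handles plain object and array schemas:

```ts
// Recursively tighten a JSON Schema the way OpenAI's strict structured
// output mode requires: every object forbids extra keys and lists all
// of its properties as required.
function enforceStrictSchema(schema: any): any {
  if (schema === null || typeof schema !== 'object') return schema
  if (Array.isArray(schema)) return schema.map((s) => enforceStrictSchema(s))
  const out: any = { ...schema }
  if (out.type === 'object' && out.properties) {
    out.additionalProperties = false
    out.required = Object.keys(out.properties)
    out.properties = Object.fromEntries(
      Object.entries(out.properties).map(([k, v]) => [k, enforceStrictSchema(v)])
    )
  }
  if (out.items) out.items = enforceStrictSchema(out.items)
  return out
}
```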

Model Fallback

Both providers support automatic model fallback. List models in order of preference:

```ts
const provider = createBedrockProvider({
  models: [
    'us.anthropic.claude-sonnet-4-5-20250929-v1:0',  // primary
    'us.anthropic.claude-haiku-4-5-20251001-v1:0',   // fallback
  ],
})
```

If the primary model fails, the provider automatically retries with the next model in the list. If all models fail, the last error is thrown.

All providers include built-in resilience: automatic retries with exponential backoff, per-request and total timeouts, and model fallback. See the Resilience documentation for configuration options and defaults.
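The fallback loop reduces to something like the sketch below. `withModelFallback` is a hypothetical name for illustration; the shipped providers also layer retries with backoff and timeouts on each attempt:

```ts
// Try each model in order; on failure move to the next, and if every
// model fails rethrow the last error seen.
async function withModelFallback<T>(
  models: string[],
  attempt: (model: string) => Promise<T>
): Promise<T> {
  let lastError: unknown
  for (const model of models) {
    try {
      return await attempt(model)
    } catch (err) {
      lastError = err
    }
  }
  throw lastError
}
```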

Per-Agent Provider Override

Each agent can override the session-level provider:

```ts
import { createOpenAIProvider } from '@genui-a3/providers/openai'
import { createBedrockProvider } from '@genui-a3/providers/bedrock'

// Session uses Bedrock by default
const session = new ChatSession({
  provider: createBedrockProvider({ models: ['us.anthropic.claude-sonnet-4-5-20250929-v1:0'] }),
  // ...
})

// This agent uses OpenAI instead
const premiumAgent = {
  id: 'premium',
  provider: createOpenAIProvider({ models: ['gpt-4o'] }),
  // ...
}
```

Provider Interface

Both providers implement the Provider interface from @genui-a3/core:

| Member | Description |
|---|---|
| sendRequest(request) | Blocking request → Promise<ProviderResponse> |
| sendRequestStream(request) | Streaming request → AsyncGenerator<StreamEvent> |
| name | Human-readable name ('bedrock' or 'openai') |

To create a custom provider, implement this interface and pass it to ChatSession or an individual agent. See Creating a Custom Provider for a step-by-step guide to building your own.
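As a rough illustration of the shape involved, here is a toy provider that echoes the last message back. The interfaces below are simplified stand-ins for the real Provider types in @genui-a3/core, and `echoProvider` is a hypothetical example, not a shipped export:

```ts
// Simplified stand-ins for the real @genui-a3/core types (assumptions).
interface ProviderRequest { messages: { role: string; content: string }[] }
interface ProviderResponse { text: string }
type StreamEvent = { type: 'text-delta'; delta: string }

// A toy provider: replies by echoing the last message, and streams
// that reply word by word.
const echoProvider = {
  name: 'echo',
  async sendRequest(request: ProviderRequest): Promise<ProviderResponse> {
    const last = request.messages[request.messages.length - 1]
    return { text: `echo: ${last.content}` }
  },
  async *sendRequestStream(request: ProviderRequest): AsyncGenerator<StreamEvent> {
    const last = request.messages[request.messages.length - 1]
    for (const word of last.content.split(' ')) {
      yield { type: 'text-delta', delta: word }
    }
  },
}
```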

Exports

This package uses subpath exports. Import from the specific provider entry point:

```ts
// ✅ Correct
import { createBedrockProvider } from '@genui-a3/providers/bedrock'
import { createOpenAIProvider } from '@genui-a3/providers/openai'

// ❌ No bare import
import { ... } from '@genui-a3/providers'
```

| Entry point | Export | Description |
|---|---|---|
| @genui-a3/providers/bedrock | createBedrockProvider | Factory function returning a Bedrock Provider |
| @genui-a3/providers/bedrock | BedrockProviderConfig | TypeScript config interface |
| @genui-a3/providers/openai | createOpenAIProvider | Factory function returning an OpenAI Provider |
| @genui-a3/providers/openai | OpenAIProviderConfig | TypeScript config interface |

Requirements

  • Node.js 20.19.0+
  • TypeScript 5.9+
  • @genui-a3/core ≥ 0.1.5 (peer dependency)
  • Bedrock: AWS credentials configured in the environment
  • OpenAI: OPENAI_API_KEY environment variable or apiKey config option

License

ISC