
ai-providers

v2.1.3


Unified AI provider registry with Cloudflare AI Gateway support


ai-providers

Stop juggling API keys. Start building.

You're building AI features, not managing provider configurations. But every provider needs its own SDK, its own API key, its own quirks. OpenAI, Anthropic, Google, Meta, Mistral... each one is another dependency to install, another secret to manage, another authentication pattern to remember.

What if you could just say model('sonnet') and it worked?

The Problem

// Before: Provider chaos
import Anthropic from '@anthropic-ai/sdk'
import OpenAI from 'openai'
import { GoogleGenerativeAI } from '@google/generative-ai'

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY })
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY })
const google = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY)

// Different APIs, different patterns, different headaches

The Solution

// After: One import, any model
import { model } from 'ai-providers'
import { generateText } from 'ai'

const { text } = await generateText({
  model: await model('sonnet'),  // Just works
  prompt: 'Hello!'
})

// Switch models in seconds
await model('opus')        // Anthropic Claude Opus 4.5
await model('gpt-4o')      // OpenAI GPT-4o
await model('gemini')      // Google Gemini 2.5 Flash
await model('llama-70b')   // Meta Llama 3.3 70B
await model('deepseek')    // DeepSeek Chat
await model('mistral')     // Mistral Large

Quick Start

1. Install

pnpm add ai-providers ai

2. Configure (choose one)

Option A: Cloudflare AI Gateway (Recommended)

One gateway, all providers, zero API key management:

export AI_GATEWAY_URL=https://gateway.ai.cloudflare.com/v1/{account_id}/{gateway_name}
export AI_GATEWAY_TOKEN=your-gateway-auth-token

Option B: Direct API Keys

export OPENROUTER_API_KEY=sk-or-...  # Access 200+ models

3. Build

import { model } from 'ai-providers'
import { generateText } from 'ai'

const { text } = await generateText({
  model: await model('sonnet'),
  prompt: 'What is the meaning of life?'
})

That's it. No provider-specific SDKs. No authentication dance. Just AI.

How It Works

ai-providers is your guide through the AI provider landscape:

Your Code
    │
    ▼
┌─────────────────┐
│   ai-providers  │  Resolves aliases, routes intelligently
└────────┬────────┘
         │
    ┌────┴────┐
    ▼         ▼
┌───────┐ ┌──────────┐
│Direct │ │OpenRouter│
│SDK    │ │          │
└───────┘ └──────────┘
    │         │
    ▼         ▼
Anthropic   200+ models
OpenAI      from any
Google      provider

Smart routing gives you the best of both worlds:

  • Direct SDK access for OpenAI, Anthropic, and Google - enabling provider-specific features like MCP, extended thinking, and structured outputs
  • OpenRouter fallback for everything else - 200+ models with automatic failover
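The routing rule described above can be sketched as a simple check against the natively supported providers. This is an illustrative sketch, not the package's actual internals: the function name `resolveRoute` and the exact provider list are assumptions for the example.

```typescript
// Hypothetical sketch of the smart-routing rule: providers with
// first-party SDK support are routed directly; everything else
// falls back to OpenRouter.
const DIRECT_PROVIDERS = new Set(['openai', 'anthropic', 'google'])

function resolveRoute(modelId: string): 'direct' | 'openrouter' {
  // IDs look like 'provider/model' or 'provider:model'; bare aliases
  // (e.g. 'sonnet') are assumed to have been expanded to a full ID first.
  const provider = modelId.split(/[/:]/)[0]
  return DIRECT_PROVIDERS.has(provider) ? 'direct' : 'openrouter'
}
```

Keeping the decision in one place is what lets a model switch stay a one-line change: the caller never knows which path a request took.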

Model Aliases

Simple names that just work:

| Alias | Model |
|-------|-------|
| opus | Claude Opus 4.5 |
| sonnet | Claude Sonnet 4.5 |
| haiku | Claude Haiku 4.5 |
| gpt-4o | GPT-4o |
| o1, o3 | OpenAI o1, o3 |
| gemini | Gemini 2.5 Flash |
| llama | Llama 4 Maverick |
| deepseek, r1 | DeepSeek Chat, R1 |
| mistral | Mistral Large |
| qwen | Qwen3 235B |
| grok | Grok 3 |

Or use full model IDs:

await model('anthropic/claude-opus-4.5')
await model('mistralai/codestral-2501')
await model('meta-llama/llama-3.3-70b-instruct')
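Conceptually, alias resolution is just a lookup that expands a short name into a full model ID before routing. A minimal sketch, using an abridged table (the `sonnet` and `llama` target IDs here are guesses for illustration; only the `opus` ID appears verbatim above):

```typescript
// Hypothetical alias table, abridged from the README's list.
const ALIASES: Record<string, string> = {
  opus: 'anthropic/claude-opus-4.5',
  sonnet: 'anthropic/claude-sonnet-4.5',
  'gpt-4o': 'openai/gpt-4o',
  llama: 'meta-llama/llama-4-maverick',
}

// Full IDs (anything containing '/' or ':') pass through untouched,
// so both styles shown above work with the same call.
function expandAlias(name: string): string {
  if (name.includes('/') || name.includes(':')) return name
  return ALIASES[name] ?? name
}
```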

Embeddings

import { embeddingModel } from 'ai-providers'
import { embed } from 'ai'

const model = await embeddingModel('openai:text-embedding-3-small')
const { embedding } = await embed({ model, value: 'Hello world' })

// Or use Cloudflare Workers AI
const cfModel = await embeddingModel('cloudflare:@cf/baai/bge-m3')

Advanced Usage

Custom Registry

import { createRegistry } from 'ai-providers'

const registry = await createRegistry({
  gatewayUrl: 'https://gateway.ai.cloudflare.com/v1/...',
  gatewayToken: 'your-token'
})

const model = registry.languageModel('anthropic:claude-sonnet-4-5-20251101')

Direct Provider Access

When you need provider-specific features:

// Bedrock with bearer token auth
await model('bedrock:us.anthropic.claude-3-5-sonnet-20241022-v2:0')

// Direct provider routing
await model('openai:gpt-4o')
await model('anthropic:claude-sonnet-4-5-20251101')
await model('google:gemini-2.5-flash')

Why Cloudflare AI Gateway?

When configured with AI Gateway:

  1. One token authenticates everything - gateway injects provider keys from its secrets
  2. Unified logging - see all AI calls in one dashboard
  3. Rate limiting - protect your budget across providers
  4. Caching - reduce costs with intelligent response caching
  5. Fallback routing - automatic failover if a provider is down

No gateway? No problem. Set individual API keys and ai-providers works the same way.
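The gateway-or-direct fallback can be sketched as a small config check. The environment variable names come from the Quick Start above, but the resolution logic itself is illustrative, not the package's actual implementation:

```typescript
// Sketch of the configuration fallback: prefer the Cloudflare AI
// Gateway when both env vars are set, otherwise use individual
// provider keys (OPENROUTER_API_KEY, ANTHROPIC_API_KEY, ...).
type AuthMode =
  | { kind: 'gateway'; url: string; token: string }
  | { kind: 'direct' }

function selectAuth(env: Record<string, string | undefined>): AuthMode {
  if (env.AI_GATEWAY_URL && env.AI_GATEWAY_TOKEN) {
    return { kind: 'gateway', url: env.AI_GATEWAY_URL, token: env.AI_GATEWAY_TOKEN }
  }
  return { kind: 'direct' }
}
```

Either way, application code calls `model('sonnet')` identically; only the credentials behind it change.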

What You Get

With ai-providers, you can:

  • Ship faster - one import, any model, zero config
  • Stay flexible - switch providers without code changes
  • Build with confidence - production-ready with Cloudflare AI Gateway
  • Access everything - 200+ models through OpenRouter, native SDK features through direct routing

Stop wrestling with provider APIs. Start building AI features.