
pulse-ai-utils

v3.45.20

Utility functions and helpers for AI-powered applications

Pulse AI Utils

A powerful TypeScript library for AI-powered applications with multi-provider LLM support. Provides unified interfaces for OpenAI, Gemini, Claude, and 100+ models via OpenRouter.

✨ Features

  • 🤖 Multi-Provider Support: OpenAI, Gemini, Claude, and 100+ models
  • 🔄 Unified Interface: Same API for all providers
  • 🔑 BYOK Support: Bring Your Own Key for provider-specific APIs
  • 📊 Structured Data: Built-in Zod schema validation
  • 🌐 Web Search: AI-powered web queries with caching
  • 🎯 Type-Safe: Full TypeScript support with proper types

Installation

npm install pulse-ai-utils

🚀 Quick Start

import { OpenAIHelper, OpenRouter } from 'pulse-ai-utils';

// Auto-loads API keys from your .env file
const openai = new OpenAIHelper();  // Uses OPENAI_API_KEY from .env

// Auto-loads keys for different providers
const gemini = OpenRouter.forGemini();  // Uses OPENROUTER_API_KEY + GEMINI_API_KEY from .env
const claude = OpenRouter.forClaude();  // Uses OPENROUTER_API_KEY + CLAUDE_API_KEY from .env

// Use remote config for dynamic model selection (recommended)
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching();

// Or pass keys explicitly if needed
const customOpenai = new OpenAIHelper('your-openai-key');
const customGemini = OpenRouter.forGemini('openrouter-key', 'gemini-key');

Environment Configuration

The library automatically reads API keys from your project's root .env file. Simply create a .env file in your project root:

# .env file in your project root
OPENAI_API_KEY=your-openai-key
OPENROUTER_API_KEY=your-openrouter-key
GEMINI_API_KEY=your-gemini-key
CLAUDE_API_KEY=your-claude-key

No need to call dotenv.config() - the library handles this automatically!
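The auto-loading behaves like a minimal dotenv: the library finds the .env file at your project root and merges its entries into process.env. A rough sketch of that mechanism (illustrative only, not the library's actual code):

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Minimal dotenv-style loader: parses KEY=value lines and merges them
// into process.env without overwriting values that are already set.
function loadEnvFile(dir: string = process.cwd()): Record<string, string> {
  const envPath = path.join(dir, '.env');
  const parsed: Record<string, string> = {};
  if (!fs.existsSync(envPath)) return parsed;

  for (const line of fs.readFileSync(envPath, 'utf8').split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks and comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // skip malformed lines
    const key = trimmed.slice(0, eq).trim();
    const value = trimmed.slice(eq + 1).trim();
    parsed[key] = value;
    if (!(key in process.env)) process.env[key] = value;
  }
  return parsed;
}
```

Because existing process.env values are never overwritten, keys set in your deployment environment still win over the .env file.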

Remote Configuration (Firebase Remote Config)

The library also supports dynamic model selection via Firebase Remote Config. This allows you to change models without code deployments:

# Firebase Remote Config Parameters (optional)
pulse-ai-util-openai-model=gpt-4o-mini                    # OpenAI model selection
pulse-ai-util-openrouter-model=google/gemini-2.0-flash-exp  # OpenRouter model
pulse-ai-util-gemini-model=google/gemini-2.0-flash-exp      # Gemini-specific model
pulse-ai-web-openrouter-model=claude-3-5-sonnet-20241022    # Optimized for web fetching

Remote config values take precedence over environment variables, so you can switch models at runtime without redeploying.

Required Environment Variables

At least one of these is required:

OPENAI_API_KEY=your-openai-key                    # For direct OpenAI access
OPENROUTER_API_KEY=your-openrouter-key            # For multi-provider access via OpenRouter

Optional Environment Variables

# Provider-specific keys for BYOK (Bring Your Own Key)
GEMINI_API_KEY=your-gemini-key                    # Google AI Studio key
CLAUDE_API_KEY=your-claude-key                    # Anthropic Claude key

# Model Selection (optional - smart defaults provided)
OPENROUTER_MODEL=google/gemini-2.0-flash-exp      # Default OpenRouter model
GEMINI_MODEL=google/gemini-2.0-flash-exp          # Default Gemini model

# Optional Configuration
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1  # Custom OpenRouter URL
LLM_CACHE_DB_ID=llmCache                          # Firestore cache collection

Usage

LLMQueryHandler

Fetch arbitrary structured data using web search.

import { LLMQueryHandler } from 'pulse-ai-utils';

const queryHandler = new LLMQueryHandler('your-api-key');

// Use in an Express route
app.post('/llm-query', (req, res) => queryHandler.query(req, res));

🤖 LLM Providers

OpenAI Helper - Direct OpenAI API

import { OpenAIHelper } from 'pulse-ai-utils';

// Auto-loads from OPENAI_API_KEY env var, or pass explicitly
const openai = new OpenAIHelper(undefined, undefined, 'gpt-4o-mini');
// Or with explicit key: new OpenAIHelper('your-api-key', undefined, 'gpt-4o-mini');

// 🌟 Recommended: Use remote config for dynamic model selection
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();

// Update model from remote config for long-running processes
await openai.updateModelFromRemoteConfig();

// Fetch structured data from the web  
const result = await openai.fetchStructuredDataFromWeb({
  prompt: 'Find upcoming tech events in San Francisco',
  zodSchema: yourZodSchema,
  userLocation: { 
    type: 'approximate',
    country: 'US', 
    region: 'CA',
    city: 'San Francisco'
  },
  locationGranularity: 'city',
});

// Get available OpenAI models
const models = await openai.getAvailableModels();
// Returns: ['gpt-4o-mini', 'gpt-4', 'gpt-3.5-turbo', ...]

OpenRouter - Universal Multi-Provider Access

Access 100+ models from multiple providers through a unified interface:

import { OpenRouter } from 'pulse-ai-utils';

// Auto-loads from environment variables (.env file)
const router = new OpenRouter();

// Factory methods auto-load from .env - no keys needed!
const gemini = OpenRouter.forGemini();  // Uses OPENROUTER_API_KEY + GEMINI_API_KEY
const claude = OpenRouter.forClaude();  // Uses OPENROUTER_API_KEY + CLAUDE_API_KEY  
const gpt = OpenRouter.forGPT();        // Uses OPENROUTER_API_KEY

// 🌟 Recommended: Use remote config for dynamic model selection
const smartRouter = await OpenRouter.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching(); // Uses pulse-ai-web-openrouter-model

// Update model from remote config for long-running processes
await router.updateModelFromRemoteConfig();
await router.updateModelFromRemoteConfig(true); // Use web-optimized model

// Or pass keys explicitly if needed
const customGemini = OpenRouter.forGemini('openrouter-key', 'gemini-key');
const customClaude = OpenRouter.forClaude('openrouter-key', 'claude-key');

// Get available models with provider info
const models = await router.getAvailableModels();
// Returns: [
//   { id: 'google/gemini-2.0-flash-exp', name: 'Gemini 2.0 Flash', provider: 'Google' },
//   { id: 'anthropic/claude-3.5-sonnet', name: 'Claude 3.5 Sonnet', provider: 'Anthropic' },
//   ...
// ]

Model Selection Guide

// 🌟 Best: Remote config with dynamic model selection (recommended)
const smartOpenai = await OpenAIHelper.createWithRemoteConfig();
const smartGemini = await OpenRouter.forGeminiWithRemoteConfig();
const webOptimized = await OpenRouter.createForWebFetching(); // Special web-optimized model

// ✅ Good: Environment variables (auto-loads from .env)
const openai = new OpenAIHelper();
const gemini = OpenRouter.forGemini();
const claude = OpenRouter.forClaude();

// ✅ Fallback: Explicit configuration
const customGemini = OpenRouter.forGemini(undefined, undefined, 'google/gemini-2.0-flash-exp');
const customClaude = OpenRouter.forClaude(undefined, undefined, 'anthropic/claude-3.5-sonnet');

Remote Config Priority Order

  1. Firebase Remote Config (highest priority) - pulse-ai-util-*-model
  2. Environment Variables (.env file) - OPENAI_MODEL, OPENROUTER_MODEL, etc.
  3. Default Values (fallback) - gpt-4o-mini, google/gemini-2.0-flash-exp, etc.
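The priority order above can be sketched as a simple resolver (a hypothetical helper, not part of the library's public API; getRemoteConfigValue stands in for a Firebase Remote Config lookup):

```typescript
// Hypothetical resolver illustrating the documented priority order:
// remote config > environment variable > hard-coded default.
type RemoteLookup = (param: string) => string | undefined;

function resolveModel(
  remoteParam: string, // e.g. 'pulse-ai-util-openai-model'
  envVar: string,      // e.g. 'OPENAI_MODEL'
  fallback: string,    // e.g. 'gpt-4o-mini'
  getRemoteConfigValue: RemoteLookup,
  env: Record<string, string | undefined> = process.env,
): string {
  return getRemoteConfigValue(remoteParam) ?? env[envVar] ?? fallback;
}
```

The nullish-coalescing chain means an empty remote config simply falls through to the environment, and an unset environment falls through to the default.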

Utility Functions

import { 
  getSchemaByCategory, 
  sanitizeId, 
  zodToJsonSchema 
} from 'pulse-ai-utils';

// Get schema for a category
const schema = getSchemaByCategory('events');

// Sanitize an ID
const cleanId = sanitizeId('https://example.com/path/');
// Result: 'example.com-path'

// Convert Zod schema to JSON schema
const jsonSchema = zodToJsonSchema(myZodSchema);
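Judging from the example above, sanitizeId strips the URL scheme and trailing slash and replaces path separators with hyphens. A plausible reimplementation of that behavior (illustrative, not the library's source):

```typescript
// Illustrative reimplementation of the documented sanitizeId behavior:
// drop the scheme, trim trailing slashes, and turn '/' into '-'.
function sanitizeIdSketch(input: string): string {
  return input
    .replace(/^[a-z][a-z0-9+.-]*:\/\//i, '') // strip 'https://', 'http://', etc.
    .replace(/\/+$/, '')                     // drop trailing slashes
    .replace(/\//g, '-');                    // path separators become hyphens
}

sanitizeIdSketch('https://example.com/path/'); // 'example.com-path'
```

This yields IDs that are safe to use as Firestore document keys, which contain no forward slashes.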

License

0BSD

Versioning and Publishing

To release a new version of this package:

  1. Open package.json in this directory and update the version field to the desired version tag (for example, "3.4.1").
  2. In your terminal, ensure you're in this directory:
    cd lib
  3. Build and publish to npm:
    npm run build
    npm publish

The new version will be published under the latest dist-tag on npm.