@future-explorer/lib

v2.0.2

Published

Shared utilities and clients for Future Explorer projects

Future Explorer Lib

Shared utilities and clients for Future Explorer projects.

Installation

npm install @future-explorer/lib

GrokAiClient

AI client for interacting with Grok (xAI) models with structured output support.

Basic Usage

import { GrokAiClient } from '@future-explorer/lib';

const client = new GrokAiClient({
  apiKey: 'your-api-key', // or set XAI_API_KEY env var
  temperature: 0.1,
  maxTokens: 4096,
  logger: console, // optional
});

interface PersonInfo {
  name: string;
  age: number;
}

const response = await client.getGenericStructuredResponse<PersonInfo>({
  model: 'grok-2-latest',
  messages: [
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Extract data from this text...' },
  ],
  tools: [
    {
      type: 'function',
      name: 'extract_info',
      description: 'Extract structured information',
      parameters: {
        type: 'object',
        properties: {
          name: { type: 'string' },
          age: { type: 'number' },
        },
        required: ['name', 'age'],
      },
    },
  ],
});

if (response) {
  console.log(response.args); // { name: '...', age: ... }
  console.log(response.functionName); // 'extract_info'
}

Constructor Options

  • apiKey (optional): xAI API key. Falls back to XAI_API_KEY env var
  • temperature (optional): Default temperature for requests (default: 0.1)
  • maxTokens (optional): Default max tokens (default: 4096)
  • logger (optional): Logger instance with warn and error methods
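The API-key fallback described above can be sketched as follows. This is a hypothetical helper for illustration, not part of the package; the real client resolves the key internally. The environment is passed in explicitly to keep the sketch self-contained.

```typescript
// Illustrates the documented resolution order: an explicit apiKey
// option wins, otherwise the XAI_API_KEY environment variable is used.
function resolveApiKey(
  options: { apiKey?: string },
  env: Record<string, string | undefined>,
): string {
  const key = options.apiKey ?? env.XAI_API_KEY;
  if (!key) {
    throw new Error('Missing xAI API key: pass apiKey or set XAI_API_KEY');
  }
  return key;
}
```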

UnifiedAiClient

Multi-provider AI client supporting OpenAI, XAI (Grok), and Google Gemini with Zod schema-based structured outputs.

Basic Usage

import { UnifiedAiClient, Provider } from '@future-explorer/lib';
import { z } from 'zod';

// Create client with desired provider
const client = new UnifiedAiClient(Provider.OpenAI);
// or Provider.XAI, Provider.Gemini

// Define a Zod schema for the response
const SentimentSchema = z.object({
  sentiment: z.enum(['positive', 'negative', 'neutral']),
  confidence: z.number().min(0).max(1),
  summary: z.string(),
});

// Generate structured response
const result = await client.generateStructuredResponse(SentimentSchema, {
  prompt: 'I absolutely love this product!',
  system: 'You are a sentiment analysis expert.',
});

console.log(result.sentiment); // 'positive'
console.log(result.confidence); // 0.95
console.log(result.summary); // '...'

Providers

| Provider      | Enum Value        | Required Environment Variables               |
| ------------- | ----------------- | -------------------------------------------- |
| OpenAI        | Provider.OpenAI   | OPENAI_API_KEY, MODEL_OPEN_AI                |
| XAI (Grok)    | Provider.XAI      | XAI_API_KEY, MODEL_XAI                       |
| Google Gemini | Provider.Gemini   | GOOGLE_GENERATIVE_AI_API_KEY, MODEL_GEMINI   |
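The per-provider requirements in the table can be checked up front before constructing a client. This is a hypothetical sketch: the local Provider enum values and the missingEnvVars helper are illustrations, not exports of the package (which exports its own Provider enum).

```typescript
// Assumed enum values, mirroring the table above for illustration only.
enum Provider {
  OpenAI = 'openai',
  XAI = 'xai',
  Gemini = 'gemini',
}

const requiredEnvVars: Record<Provider, string[]> = {
  [Provider.OpenAI]: ['OPENAI_API_KEY', 'MODEL_OPEN_AI'],
  [Provider.XAI]: ['XAI_API_KEY', 'MODEL_XAI'],
  [Provider.Gemini]: ['GOOGLE_GENERATIVE_AI_API_KEY', 'MODEL_GEMINI'],
};

// Returns the names of required variables missing from `env`,
// so callers can fail fast with a clear message.
function missingEnvVars(
  provider: Provider,
  env: Record<string, string | undefined>,
): string[] {
  return requiredEnvVars[provider].filter((name) => !env[name]);
}
```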

Methods

  • generateStructuredResponse<T>(schema, params, usageTracker?): Generates a structured response matching the Zod schema. params accepts prompt and system, plus any additional generation options. Optionally pass an AiUsageTracker to record usage.
  • getModel(): Returns the underlying LanguageModel instance.

AiUsageTracker

Tracks and accumulates AI usage costs across multiple generateStructuredResponse() calls. Useful for calculating the total cost of processing a single input that requires multiple AI calls.

Basic Usage

import { UnifiedAiClient, AiUsageTracker, Provider } from '@future-explorer/lib';
import { z } from 'zod';

const client = new UnifiedAiClient(Provider.XAI);
const tracker = new AiUsageTracker();

const Schema = z.object({ summary: z.string() });

// Each call automatically records usage in the tracker
await client.generateStructuredResponse(Schema, { prompt: 'First call...' }, tracker);
await client.generateStructuredResponse(Schema, { prompt: 'Second call...' }, tracker);

// Get accumulated results
console.log(tracker.getSummary());
// AI Usage: 2 call(s), $0.001234 estimated
//   Tokens - input: 500, output: 200, reasoning: 0, total: 700
//   [xai/grok-4-1-fast-reasoning] in=250 out=100 $0.000617 3200ms
//   [xai/grok-4-1-fast-reasoning] in=250 out=100 $0.000617 2800ms

console.log(tracker.getTotalCost()); // 0.001234
console.log(tracker.getTotalTokens()); // { input: 500, output: 200, reasoning: 0, total: 700 }
console.log(tracker.getCalls()); // AiCallUsage[]

Supported Models

Built-in pricing for cost estimation (easily extensible):

| Provider | Models                                                                         |
| -------- | ------------------------------------------------------------------------------ |
| xAI      | grok-4-1-fast, grok-4-fast, grok-4, grok-3, grok-3-mini, grok-2                |
| OpenAI   | gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-4o, gpt-4o-mini, o3, o3-mini, o4-mini |
| Gemini   | gemini-2.5-pro, gemini-2.5-flash, gemini-2.5-flash-lite, gemini-2.0-flash      |

Model IDs are matched by substring, so grok-4-1-fast-reasoning matches the grok-4-1-fast pricing entry.
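The substring matching described above can be sketched like this. The helper, table layout, and pricing numbers are all placeholders for illustration, not the package's real rates or internals; ordering more specific keys first keeps grok-4-1-fast from falling through to the grok-4 entry.

```typescript
interface ModelPricing {
  inputPerMTok: number;
  outputPerMTok: number;
}

// Placeholder rates; more specific keys listed before their prefixes
// so 'grok-4-1-fast' is checked before 'grok-4'.
const pricingTable: Array<[string, ModelPricing]> = [
  ['grok-4-1-fast', { inputPerMTok: 0.2, outputPerMTok: 0.5 }],
  ['grok-4-fast', { inputPerMTok: 0.2, outputPerMTok: 0.5 }],
  ['grok-4', { inputPerMTok: 3.0, outputPerMTok: 15.0 }],
];

// A model ID matches the first entry whose key it contains, so
// 'grok-4-1-fast-reasoning' resolves to the 'grok-4-1-fast' entry.
function findPricing(modelId: string): ModelPricing | undefined {
  const entry = pricingTable.find(([key]) => modelId.includes(key));
  return entry?.[1];
}
```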

Development

Build

npm run build

Watch Mode

npm run watch

Lint

npm run lint
npm run lint:fix

Local Development

Link the package locally for testing in other projects:

./scripts/link-local.sh

Then in your project:

npm link @future-explorer/lib

Publishing

Manual Publish

npm run build
npm publish

Using Script

./scripts/publish.sh [patch|minor|major]

Changelog

1.0.13

  • generateStructuredResponse() now logs per-call usage to console

1.0.12

  • Added AiUsageTracker for tracking AI usage costs across multiple calls
  • generateStructuredResponse() now accepts optional usageTracker parameter

1.0.11

  • Moved schema out of GenerateObjectParams into a separate parameter
  • Renamed userPrompt/systemMessage to prompt/system

1.0.10

  • Refactored generateStructuredResponse() to accept a single GenerateObjectParams<T> object

1.0.7

  • Updated peer dependency to zod 4.2.x

1.0.6

  • Added UnifiedAiClient with multi-provider support (OpenAI, XAI, Gemini)

License

ISC