
@craftily/ai-sdk

v0.0.2


AI Model Client SDK

A unified TypeScript/JavaScript SDK for interacting with multiple AI model providers, including OpenAI, Anthropic, Cohere, Gemini, Mistral, DeepSeek, Llama, and XAI. This SDK provides a consistent interface for generating text and working with various AI models, making it easy to integrate state-of-the-art language models into your applications.

Features

  • Unified API for multiple AI providers
  • Easy switching between models and providers
  • TypeScript support with rich typings
  • Extensible and customizable
  • Supports custom system prompts, temperature, max tokens, and more

Installation

npm install @craftily/ai-sdk
# or
yarn add @craftily/ai-sdk

Usage: General AIClient

Note: The value of response returned by client.chat(...) depends on the provider. It may be a string, or it may be an object (e.g., with a .text property or other fields).

import AIClient from '@craftily/ai-sdk';

const client = new AIClient({
  modelId: 'gpt-4.1', // or any supported modelId
  provider: 'openai', // one of "openai", "google", "anthropic", "cohere", "meta", "mistral", "xai", "deepseek", "custom"
  apiKey: 'YOUR_API_KEY',
  maxTokens: 256,      // optional
  temperature: 0.7,    // optional
});

const response = await client.chat("What are the top tourist attractions in Tokyo, Japan?");

// Handle both string and object responses
if (typeof response === 'string') {
  console.log(response);
} else if (response && typeof response === 'object' && 'text' in response) {
  console.log(response.text);
} else {
  console.log(response); // fallback for other shapes
}
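The branching above can be wrapped in a small helper. Note this is our own utility for illustration, not an SDK export:

```typescript
// Hypothetical helper (our own, not part of the SDK): collapse a chat
// response that may be a plain string or an object carrying a `.text`
// field into a single string.
function toText(response: unknown): string {
  if (typeof response === 'string') return response;
  if (response && typeof response === 'object' && 'text' in response) {
    return String((response as { text: unknown }).text);
  }
  return JSON.stringify(response); // fallback for other shapes
}
```

With this in place, `console.log(toText(response))` handles every case in one line.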

Usage: Individual Clients

import { OpenAIClient, AnthropicClient, CohereClient /*, ... */ } from '@craftily/ai-sdk';
import type { AIRequest } from '@craftily/ai-sdk';

// Example: Using OpenAIClient
const client = new OpenAIClient({ apiKey: 'YOUR_OPENAI_API_KEY' });

const request: AIRequest = {
  prompt: "What are the top tourist attractions in Tokyo, Japan?",
  modelId: "gpt-4.1", // or any supported model
  temperature: 0.7,
  maxTokens: 256,
};

const response = await client.generate(request);

// Handle both string and object responses
if (typeof response === 'string') {
  console.log(response);
} else if (response && typeof response === 'object' && 'text' in response) {
  console.log(response.text);
} else {
  console.log(response); // fallback for other shapes
}

This approach lets you switch between providers and models dynamically: change the provider and modelId in the AIClient configuration, or instantiate a different provider client directly.
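One way to make that switch explicit is to centralize the constructor options in a single config helper. The option shape below mirrors the AIClient constructor shown earlier; the helper and its defaults are our own sketch, not an SDK export:

```typescript
// Sketch only: build the options object passed to `new AIClient(...)`.
// ProviderName mirrors the provider strings listed earlier.
type ProviderName =
  | 'openai' | 'google' | 'anthropic' | 'cohere'
  | 'meta' | 'mistral' | 'xai' | 'deepseek' | 'custom';

interface ClientConfig {
  provider: ProviderName;
  modelId: string;
  apiKey: string;
  maxTokens?: number;
  temperature?: number;
}

// Defaults here (256 tokens, temperature 0.7) are our own choices.
function buildConfig(provider: ProviderName, modelId: string, apiKey: string): ClientConfig {
  return { provider, modelId, apiKey, maxTokens: 256, temperature: 0.7 };
}

// Switching providers is then a one-argument change, e.g.:
// const client = new AIClient(buildConfig('anthropic', 'YOUR_MODEL_ID', apiKey));
```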

Supported Providers & Models

  • OpenAI: GPT-4, GPT-3.5, etc.
  • Anthropic: Claude models
  • Cohere: Command models
  • Google Gemini: Gemini models
  • Mistral: Mistral models
  • DeepSeek: DeepSeek models
  • Llama: Llama models
  • XAI: XAI models

API Reference

AIRequest

| Field       | Type   | Description                                 |
| ----------- | ------ | ------------------------------------------- |
| prompt      | string | The prompt/question for the model           |
| modelId     | string | The model identifier (see supported models) |
| temperature | number | (Optional) Sampling temperature             |
| maxTokens   | number | (Optional) Maximum tokens in the response   |
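In TypeScript the table translates to a simple shape. The interface and runtime guard below are our own approximation; the package's actual type may carry additional fields:

```typescript
// Approximation of AIRequest, reconstructed from the field table above.
interface AIRequest {
  prompt: string;
  modelId: string;
  temperature?: number;
  maxTokens?: number;
}

// Runtime guard (our own helper) for validating untyped input
// before handing it to a client.
function isAIRequest(value: unknown): value is AIRequest {
  if (!value || typeof value !== 'object') return false;
  const v = value as Record<string, unknown>;
  return typeof v.prompt === 'string' && typeof v.modelId === 'string';
}
```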

Client Classes

Each provider has its own client class (e.g., OpenAIClient, AnthropicClient, etc.), all exposing a .generate(request: AIRequest) method that returns an AIResponse.
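Because every client shares the .generate(request) contract, provider-agnostic helpers are easy to write. The interface and types below are local approximations for illustration (the SDK may export its own):

```typescript
// Local approximations of the SDK's request/response types.
interface AIRequest { prompt: string; modelId: string; temperature?: number; maxTokens?: number; }
interface AIResponse { text: string; [field: string]: unknown; }

// The shared contract each provider client fulfills.
interface GenerativeClient {
  generate(request: AIRequest): Promise<AIResponse>;
}

// A helper written once against the contract works with any client,
// whether OpenAIClient, AnthropicClient, or a test double.
async function ask(client: GenerativeClient, prompt: string, modelId: string): Promise<string> {
  const response = await client.generate({ prompt, modelId });
  return response.text;
}
```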

AIResponse

| Field | Type   | Description                    |
| ----- | ------ | ------------------------------ |
| text  | string | The generated text             |
| ...   | ...    | Other provider-specific fields |

Contributing

  1. Fork the repository
  2. Create a new branch
  3. Make your changes
  4. Submit a pull request

License

MIT