
superprompts v0.0.4
Superprompts

A TypeScript/JavaScript library for fetching and using prompts from the Superprompts API, with built-in caching to avoid redundant requests. For more information, visit https://superprompts.app.

Features

  • 🚀 Fast: Built-in 5-second caching to reduce API calls
  • 🔧 TypeScript: Full TypeScript support with type definitions
  • 📦 Lightweight: Zero dependencies, minimal bundle size
  • 🎯 Simple: Easy-to-use API for prompt management

Installation

npm install superprompts
# or
yarn add superprompts
# or
pnpm add superprompts

Quick Start

import { createPromptClient } from 'superprompts';

const promptClient = createPromptClient({
    apiKey: 'your-superprompts-api-key'
});

// Fetch a prompt by ID
const prompt = await promptClient('your-prompt-id');
console.log(prompt.prompt); // The actual prompt text

API Reference

createPromptClient({ apiKey })

Creates a prompt client instance with caching enabled.

Parameters:

  • apiKey (string): Your Superprompts API key

Returns:

  • A function that takes a promptId and returns a Promise with the prompt data

promptClient(promptId)

Fetches a prompt from the Superprompts API.

Parameters:

  • promptId (string): The ID of the prompt to fetch

Returns:

  • Promise containing the prompt data with the following structure:
    {
        prompt: string; // The actual prompt text
        // ... other metadata from the API
    }
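Taken together, the two calls above amount to roughly the following TypeScript shape. This is a sketch for orientation, not the library's published type definitions; fields beyond `prompt` are deliberately left open because the API's extra metadata isn't specified here.

```typescript
// Rough shape of the API surface described above. `PromptData` keeps the
// metadata open-ended because only `prompt` is documented here.
interface PromptData {
    prompt: string;            // the actual prompt text
    [key: string]: unknown;    // other metadata from the API
}

type PromptClient = (promptId: string) => Promise<PromptData>;
type CreatePromptClient = (options: { apiKey: string }) => PromptClient;
```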

Examples

Basic Usage

import { createPromptClient } from 'superprompts';

const promptClient = createPromptClient({
    apiKey: process.env.SUPERPROMPTS_API_KEY
});

async function generateContent() {
    try {
        const promptData = await promptClient('my-prompt-id');
        console.log('Retrieved prompt:', promptData.prompt);
    } catch (error) {
        console.error('Failed to fetch prompt:', error);
    }
}

Using with OpenAI npm

import { createPromptClient } from 'superprompts';
import OpenAI from 'openai';

const openai = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY
});

const promptClient = createPromptClient({
    apiKey: process.env.SUPERPROMPTS_API_KEY
});

async function generateWithOpenAI() {
    try {
        // Fetch the prompt from Superprompts
        const { prompt } = await promptClient('code-review-prompt');

        // Use the prompt with OpenAI
        const completion = await openai.chat.completions.create({
            model: 'gpt-4',
            messages: [
                {
                    role: 'system',
                    content: prompt
                },
                {
                    role: 'user',
                    content: 'Please review this code: function add(a, b) { return a + b; }'
                }
            ],
            temperature: 0.7
        });

        console.log('AI Response:', completion.choices[0].message.content);
    } catch (error) {
        console.error('Error:', error);
    }
}

generateWithOpenAI();

Using with AI SDK

import { createPromptClient } from 'superprompts';
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const promptClient = createPromptClient({
    apiKey: process.env.SUPERPROMPTS_API_KEY
});

async function generateWithAISDK() {
    try {
        // Fetch the prompt from Superprompts
        const { prompt } = await promptClient('creative-writing-prompt');

        // Use the prompt with AI SDK
        const { text } = await generateText({
            model: openai('gpt-4'),
            system: prompt,
            prompt: 'Write a short story about a robot learning to paint.',
            temperature: 0.8
        });

        console.log('Generated story:', text);
    } catch (error) {
        console.error('Error:', error);
    }
}

generateWithAISDK();

Caching Example

import { createPromptClient } from 'superprompts';

const promptClient = createPromptClient({
    apiKey: process.env.SUPERPROMPTS_API_KEY
});

async function demonstrateCaching() {
    const promptId = 'my-prompt-id';

    console.time('First fetch');
    const prompt1 = await promptClient(promptId);
    console.timeEnd('First fetch'); // ~200ms (API call)

    console.time('Second fetch');
    const prompt2 = await promptClient(promptId);
    console.timeEnd('Second fetch'); // ~1ms (cached)

    console.log('Same prompt?', prompt1.prompt === prompt2.prompt); // true

    // Wait 6 seconds for cache to expire
    await new Promise((resolve) => setTimeout(resolve, 6000));

    console.time('Third fetch');
    const prompt3 = await promptClient(promptId);
    console.timeEnd('Third fetch'); // ~200ms (cache expired, new API call)
}

demonstrateCaching();

Caching

The library includes a built-in caching mechanism with a 5-second TTL (Time To Live). This means:

  • First request to a prompt ID: Fetches from API
  • Subsequent requests within 5 seconds: Returns cached result
  • After 5 seconds: Cache expires, next request fetches fresh data from API

This caching strategy balances performance with data freshness, ensuring you get fast responses while keeping prompts reasonably up-to-date.

Error Handling

The library throws errors for various scenarios:

try {
    const prompt = await promptClient('invalid-prompt-id');
} catch (error) {
    if (error.message.includes('Failed to retrieve prompt')) {
        console.error('API Error:', error.message);
        // Handle API errors (404, 401, 500, etc.)
    }
}

Environment Variables

Make sure to set your Superprompts API key:

# .env
SUPERPROMPTS_API_KEY=your-api-key-here
OPENAI_API_KEY=your-openai-api-key-here
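Note that Node.js does not read `.env` files on its own; a loader such as dotenv (or `node --env-file=.env` on Node 20.6+) must populate `process.env` first. A minimal fail-fast guard before creating the client might look like this (`requireEnv` is a helper invented here, not part of the library):

```typescript
// Fail fast on a missing key rather than sending unauthenticated requests.
// Assumes a .env loader has already populated process.env.
function requireEnv(name: string): string {
    const value = process.env[name];
    if (!value) {
        throw new Error(`Missing required environment variable: ${name}`);
    }
    return value;
}

// Intended use:
// const promptClient = createPromptClient({
//     apiKey: requireEnv('SUPERPROMPTS_API_KEY')
// });
```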

TypeScript Support

The library is written in TypeScript and includes full type definitions:

import { createPromptClient } from 'superprompts';

// TypeScript will infer the return type
const promptClient = createPromptClient({
    apiKey: 'your-key'
});

// promptData is properly typed
const promptData = await promptClient('prompt-id');
console.log(promptData.prompt); // string

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT © Superprompts, Inc.
