

Promptly AI

A universal template-based prompt management system for LLM applications. Promptly AI separates prompts from business logic, making them easier to maintain, version, and modify across different projects.

Features

  • 🎯 Template-based prompts - Separate prompts from code
  • 🔄 Multi-turn conversations - Support for dialogue flows
  • Conditional prompts - Include/exclude prompts based on context
  • 🧠 Extended thinking - Support for reasoning tokens
  • 🔌 Extensible adapters - Easy integration with different LLM providers
  • 📝 Handlebars templating - Powerful variable interpolation and logic
  • Type-safe - Full TypeScript support
  • Performance - Built-in caching for templates
  • 🛡️ Validation - Context validation with custom rules

Installation

npm install promptly-ai

Peer Dependencies

Install the LLM SDKs you plan to use:

# For OpenAI
npm install openai

# For Anthropic
npm install @anthropic-ai/sdk

Quick Start

1. Setup

import { createPromptly, OpenAIAdapter, AnthropicAdapter } from 'promptly-ai';

const promptly = createPromptly({
  templatesPath: './prompts',
  defaultModel: 'gpt-4-turbo',
  adapters: [
    new OpenAIAdapter(), // Uses OPENAI_API_KEY env var
    new AnthropicAdapter(), // Uses ANTHROPIC_API_KEY env var
  ],
});

2. Create a Template

Create ./prompts/greeting.md:

---
name: "greeting"
description: "Generate personalized greetings"
model: "gpt-4-turbo"
temperature: 0.7
maxTokens: 100
---

# System Prompt

You are a friendly assistant that creates personalized greetings.

# User Prompt

Create a {{style}} greeting for {{name}}.

{{#if occasion}}
The occasion is: {{occasion}}
{{/if}}

# User Prompt 2 [if includeWeather]

Also mention that it's a {{weather}} day.

3. Use the Template

const result = await promptly.generate('greeting', {
  name: 'Alice',
  style: 'warm',
  occasion: 'birthday',
  includeWeather: true,
  weather: 'sunny'
});

console.log(result.content);
// Output: A warm birthday greeting for Alice mentioning the sunny weather
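Besides content, the result returned by generate should also carry usage metadata along the lines of what adapters produce (see LLMUsageResult further down); the exact field names on the result object are an assumption here, not documented behavior:

// Assumed fields mirroring LLMUsageResult - verify against your installed version
console.log(result.model);                                      // e.g. 'gpt-4-turbo'
console.log(result.inputTokens, result.outputTokens);           // token usage, if reported
console.log(result.costUsd);                                    // estimated cost, if reported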

Template Format

Templates use Markdown with YAML frontmatter:

---
name: "template-name"
description: "Template description"
model: "gpt-4-turbo"
temperature: 0.8
maxTokens: 500
maxThinkingTokens: 2000
version: "1.0.0"
---

# System Prompt
Your system instructions here...

# User Prompt
Your user message with {{variables}}

# Assistant Prompt [if needsResponse]
Assistant response (conditional)

# User Prompt 2
Follow-up user message
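To check how a template resolves for a given context without making an LLM call, you can use the renderTemplate method shown later in the API reference. A minimal sketch (the shape of the rendered value is not documented here, so it is simply logged):

const rendered = await promptly.renderTemplate('template-name', {
  variables: 'whatever the template interpolates', // supply the template's own variables
  needsResponse: true,                             // enables the conditional assistant prompt
});

console.log(rendered); // inspect the resolved prompts without spending tokens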

Conditional Prompts

Control which prompts are included using conditions:

# User Prompt [if hasContext]
Additional context: {{context}}

# Assistant Prompt [if (eq mode "detailed")]
I'll provide a detailed response.

# User Prompt 2 [if (and hasExamples (ne format "simple"))]
Here are examples: {{examples}}

Supported Conditions

  • Simple variables: [if variableName]
  • Equality: [if (eq value "expected")]
  • Inequality: [if (ne value "unwanted")]
  • Logical AND: [if (and condition1 condition2)]
  • Logical OR: [if (or condition1 condition2)]
  • Negation: [if (not condition)]
  • Comparisons: [if (gt number 5)], [if (lte score 100)]
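For reference, a context object like the following would satisfy the conditional prompts shown above (variable names mirror those examples; the template name is hypothetical):

const context = {
  hasContext: true,
  context: 'The user is a returning customer.',
  mode: 'detailed',               // matches [if (eq mode "detailed")]
  hasExamples: true,
  format: 'full',                 // not "simple", so (ne format "simple") holds
  examples: 'Example A, Example B',
};

const result = await promptly.generate('my-template', context);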

API Reference

Core Classes

Promptly

Main class for managing templates and generating content.

const promptly = new Promptly({
  templatesPath: './prompts',
  adapters: [new OpenAIAdapter()],
  defaultModel: 'gpt-4-turbo',
  cache: true
});

// Generate content
const result = await promptly.generate('template-name', context);

// Just render template (without LLM call)
const rendered = await promptly.renderTemplate('template-name', context);

// Utility methods
const templates = await promptly.getAvailableTemplates();
const exists = await promptly.templateExists('template-name');
promptly.clearCache();

LLM Adapters

OpenAIAdapter

import { OpenAIAdapter } from 'promptly-ai';

// With environment variable OPENAI_API_KEY
const adapter = new OpenAIAdapter();

// With explicit API key
const adapter = new OpenAIAdapter('your-api-key');

// Supports models matching patterns: /^gpt-/i, /^o\d+/i
// Examples: gpt-4, gpt-4-turbo, gpt-3.5-turbo, o1, o3, etc.

AnthropicAdapter

import { AnthropicAdapter } from 'promptly-ai';

// With environment variable ANTHROPIC_API_KEY
const adapter = new AnthropicAdapter();

// With explicit API key
const adapter = new AnthropicAdapter('your-api-key');

// Supports models matching pattern: /^claude-/i
// Examples: claude-3-5-sonnet, claude-opus, etc.

Custom Adapters

Create custom adapters for other LLM providers:

import { LLMAdapter, LLMConfig, LLMUsageResult } from 'promptly-ai';

class CustomAdapter implements LLMAdapter {
  name = 'custom';

  supportsModel(model: string): boolean {
    return model.startsWith('custom-');
  }

  async generate(config: LLMConfig): Promise<LLMUsageResult> {
    // Your implementation here
    return {
      content: 'Generated content',
      provider: this.name,
      model: config.model,
      inputTokens: 100,
      outputTokens: 50,
      costUsd: 0.001
    };
  }
}

promptly.addAdapter(new CustomAdapter());

Validation

Add validation rules for template contexts:

promptly.setValidationSchema({
  'greeting': [
    { field: 'name', required: true, type: 'string' },
    { field: 'style', required: true, type: 'string' },
    { field: 'age', type: 'number', validator: (age) => age > 0 }
  ]
});
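How validation failures surface is not documented here; assuming generate rejects when a required field is missing, a defensive sketch looks like this:

try {
  // 'style' is required by the schema above but deliberately omitted
  await promptly.generate('greeting', { name: 'Alice' });
} catch (error) {
  // Log and fall back rather than sending an incomplete prompt to the model
  console.error('Validation or generation failed:', error);
}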

Handlebars Helpers

Built-in helpers for templates:

{{! Comparison }}
{{#if (eq status "active")}}Active user{{/if}}
{{#if (gt score 80)}}High score!{{/if}}

{{! Logic }}
{{#if (and isLoggedIn hasPermission)}}Welcome{{/if}}
{{#if (or isAdmin isModerator)}}Admin panel{{/if}}

{{! String manipulation }}
{{uppercase name}}
{{capitalize title}}
{{lowercase email}}

{{! Arrays and loops }}
{{#if (gt (length items) 0)}}
  Items: {{join items ", "}}
{{/if}}

{{#each items}}
  Item {{@index}}: {{this}}
  Item {{add @index 1}}: {{this}} {{! 1-based numbering }}
{{/each}}

{{! Math operations }}
{{add 5 3}} {{! Returns 8 }}
{{subtract 10 4}} {{! Returns 6 }}
{{multiply 3 7}} {{! Returns 21 }}
{{divide 15 3}} {{! Returns 5 }}

{{! Type checking }}
{{#if (isString value)}}String value{{/if}}
{{#if (isArray items)}}Array of items{{/if}}

{{! Utilities }}
{{default value "fallback"}}

Advanced Loop Examples

{{! Loop with 1-based indexing }}
{{#each sections}}
  Section {{add @index 1}}: {{this.title}}
  Content: {{this.content}}
{{/each}}

{{! Nested object access in loops }}
{{#each sectionContext.previousSections}}
  [Previous Section {{add @index 1}}]: {{this}}
{{/each}}

{{! Conditional content in loops }}
{{#each users}}
  {{#if (eq @index 0)}}First user: {{/if}}
  {{name}} ({{add @index 1}} of {{length ../users}})
{{/each}}

Project Integration

Directory Structure

your-project/
├── prompts/
│   ├── greeting.md
│   ├── summarization.md
│   └── analysis.md
├── src/
│   └── llm-service.ts
└── package.json

Service Integration

// src/llm-service.ts
import { createPromptly, OpenAIAdapter } from 'promptly-ai';
import path from 'path';

export class LLMService {
  private promptly;

  constructor() {
    this.promptly = createPromptly({
      templatesPath: path.join(process.cwd(), 'prompts'),
      defaultModel: 'gpt-4-turbo',
      adapters: [new OpenAIAdapter()],
    });
  }

  async generateGreeting(name: string, style: string) {
    return this.promptly.generate('greeting', { name, style });
  }

  async summarizeText(text: string, maxLength: number) {
    return this.promptly.generate('summarization', { text, maxLength });
  }
}
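Using the service elsewhere in the application is then a one-liner per template:

const llm = new LLMService();

const greeting = await llm.generateGreeting('Alice', 'warm');
console.log(greeting.content);

const summary = await llm.summarizeText('Long article text...', 200);
console.log(summary.content);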

Environment Variables

Set up your API keys:

# .env
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
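The adapters read these variables from process.env, so they must be set before createPromptly runs. If you load them with dotenv (a separate dependency, not part of promptly-ai), a minimal sketch:

// Load .env before constructing adapters so OPENAI_API_KEY / ANTHROPIC_API_KEY are set
import 'dotenv/config';
import { createPromptly, OpenAIAdapter, AnthropicAdapter } from 'promptly-ai';

const promptly = createPromptly({
  templatesPath: './prompts',
  adapters: [new OpenAIAdapter(), new AnthropicAdapter()],
});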

Best Practices

  1. Organize Templates: Group related templates in subdirectories
  2. Version Templates: Use the version field in frontmatter
  3. Validate Context: Define validation schemas for important templates
  4. Cache Management: Clear cache during development, enable in production (see the sketch after this list)
  5. Error Handling: Wrap LLM calls in try-catch blocks
  6. Model Selection: Specify models in templates for consistency
  7. Testing: Test templates with various context combinations
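One way to follow practice 4 above is to key the cache option off the environment and clear the cache on demand while editing templates; this uses only the cache option and clearCache method from the API reference:

import { createPromptly, OpenAIAdapter } from 'promptly-ai';

const promptly = createPromptly({
  templatesPath: './prompts',
  adapters: [new OpenAIAdapter()],
  // Cache templates in production; re-read from disk while iterating locally
  cache: process.env.NODE_ENV === 'production',
});

// Or clear the cache explicitly, e.g. after editing a template during development
promptly.clearCache();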

Examples

Multi-turn Conversation

---
name: "interview"
model: "gpt-4-turbo"
---

# System Prompt
You are conducting a job interview.

# User Prompt
I'm interviewing for a {{position}} role.

# Assistant Prompt [if needsIntroduction]
Welcome! Let's start with some basic questions.

# User Prompt 2
Tell me about your experience with {{technology}}.

# Assistant Prompt 2 [if (eq difficulty "advanced")]
Let's dive into some advanced topics.

# User Prompt 3 [if hasAdvancedQuestions]
{{advancedQuestion}}
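Calling this template works like any other generate call; the context below simply supplies the variables and condition flags the template declares:

const interview = await promptly.generate('interview', {
  position: 'backend engineer',
  technology: 'TypeScript',
  needsIntroduction: true,
  difficulty: 'advanced',          // enables the second assistant prompt
  hasAdvancedQuestions: true,
  advancedQuestion: 'How would you design a rate limiter?',
});

console.log(interview.content);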

Content Generation Pipeline

class ContentPipeline {
  constructor(private promptly: Promptly) {}

  async generateArticle(topic: string) {
    // Step 1: Generate outline
    const outline = await this.promptly.generate('outline', { topic });
    
    // Step 2: Generate introduction
    const intro = await this.promptly.generate('introduction', { 
      topic, 
      outline: outline.content 
    });
    
    // Step 3: Generate sections
    const sections = await this.promptly.generate('sections', {
      topic,
      outline: outline.content,
      introduction: intro.content
    });
    
    return {
      outline: outline.content,
      introduction: intro.content,
      sections: sections.content
    };
  }
}
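This assumes 'outline', 'introduction', and 'sections' templates exist under your templates path. Usage is then:

const pipeline = new ContentPipeline(promptly);
const article = await pipeline.generateArticle('Prompt management for LLM applications');

console.log(article.outline);
console.log(article.introduction);
console.log(article.sections);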

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes
  4. Add tests
  5. Run linting: npm run lint
  6. Submit a pull request

License

MIT License - see LICENSE file for details.

Support