promptly-ai v1.3.1
# Promptly AI
A universal template-based prompt management system for LLM applications. Promptly AI separates prompts from business logic, making them easier to maintain, version, and modify across different projects.
## Features
- 🎯 Template-based prompts - Separate prompts from code
- 🔄 Multi-turn conversations - Support for dialogue flows
- ⚡ Conditional prompts - Include/exclude prompts based on context
- 🧠 Extended thinking - Support for reasoning tokens
- 🔌 Extensible adapters - Easy integration with different LLM providers
- 📝 Handlebars templating - Powerful variable interpolation and logic
- ✅ Type-safe - Full TypeScript support
- 🚀 Performance - Built-in caching for templates
- 🛡️ Validation - Context validation with custom rules
## Installation
```bash
npm install promptly-ai
```

### Peer Dependencies
Install the LLM SDKs you plan to use:
```bash
# For OpenAI
npm install openai

# For Anthropic
npm install @anthropic-ai/sdk
```

## Quick Start
### 1. Setup
```typescript
import { createPromptly, OpenAIAdapter, AnthropicAdapter } from 'promptly-ai';

const promptly = createPromptly({
  templatesPath: './prompts',
  defaultModel: 'gpt-4-turbo',
  adapters: [
    new OpenAIAdapter(), // Uses OPENAI_API_KEY env var
    new AnthropicAdapter(), // Uses ANTHROPIC_API_KEY env var
  ],
});
```

### 2. Create a Template
Create `./prompts/greeting.md`:
```markdown
---
name: "greeting"
description: "Generate personalized greetings"
model: "gpt-4-turbo"
temperature: 0.7
maxTokens: 100
---

# System Prompt
You are a friendly assistant that creates personalized greetings.

# User Prompt
Create a {{style}} greeting for {{name}}.
{{#if occasion}}
The occasion is: {{occasion}}
{{/if}}

# User Prompt 2 [if includeWeather]
Also mention that it's a {{weather}} day.
```

### 3. Use the Template
```typescript
const result = await promptly.generate('greeting', {
  name: 'Alice',
  style: 'warm',
  occasion: 'birthday',
  includeWeather: true,
  weather: 'sunny'
});

console.log(result.content);
// Output: A warm birthday greeting for Alice mentioning the sunny weather
```

## Template Format
Templates use Markdown with YAML frontmatter:
```markdown
---
name: "template-name"
description: "Template description"
model: "gpt-4-turbo"
temperature: 0.8
maxTokens: 500
maxThinkingTokens: 2000
version: "1.0.0"
---

# System Prompt
Your system instructions here...

# User Prompt
Your user message with {{variables}}

# Assistant Prompt [if needsResponse]
Assistant response (conditional)

# User Prompt 2
Follow-up user message
```

## Conditional Prompts
Control which prompts are included using conditions:
```markdown
# User Prompt [if hasContext]
Additional context: {{context}}

# Assistant Prompt [if (eq mode "detailed")]
I'll provide a detailed response.

# User Prompt 2 [if (and hasExamples (ne format "simple"))]
Here are examples: {{examples}}
```

### Supported Conditions
- Simple variables: `[if variableName]`
- Equality: `[if (eq value "expected")]`
- Inequality: `[if (ne value "unwanted")]`
- Logical AND: `[if (and condition1 condition2)]`
- Logical OR: `[if (or condition1 condition2)]`
- Negation: `[if (not condition)]`
- Comparisons: `[if (gt number 5)]`, `[if (lte score 100)]`
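The condition forms above can be read as plain boolean expressions over the template context. The following sketch illustrates their semantics in TypeScript; it is an illustration of the documented behavior, not Promptly's actual parser:

```typescript
// Illustrative semantics for the condition helpers, evaluated against a
// context object. Not the library's internal implementation.
type Ctx = Record<string, unknown>;

const eq = (a: unknown, b: unknown) => a === b;
const ne = (a: unknown, b: unknown) => a !== b;
const and = (...args: unknown[]) => args.every(Boolean);
const or = (...args: unknown[]) => args.some(Boolean);
const not = (a: unknown) => !a;
const gt = (a: number, b: number) => a > b;
const lte = (a: number, b: number) => a <= b;

// [if (and hasExamples (ne format "simple"))], resolved against a context:
function shouldIncludeExamples(ctx: Ctx): boolean {
  return and(ctx.hasExamples, ne(ctx.format, 'simple'));
}
```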
## API Reference

### Core Classes

#### Promptly
Main class for managing templates and generating content.
```typescript
const promptly = new Promptly({
  templatesPath: './prompts',
  adapters: [new OpenAIAdapter()],
  defaultModel: 'gpt-4-turbo',
  cache: true
});

// Generate content
const result = await promptly.generate('template-name', context);

// Just render the template (without an LLM call)
const rendered = await promptly.renderTemplate('template-name', context);

// Utility methods
const templates = await promptly.getAvailableTemplates();
const exists = await promptly.templateExists('template-name');
promptly.clearCache();
```

### LLM Adapters
#### OpenAIAdapter
```typescript
import { OpenAIAdapter } from 'promptly-ai';

// With environment variable OPENAI_API_KEY
const adapter = new OpenAIAdapter();

// With an explicit API key
const adapterWithKey = new OpenAIAdapter('your-api-key');

// Supports models matching the patterns /^gpt-/i and /^o\d+/i
// Examples: gpt-4, gpt-4-turbo, gpt-3.5-turbo, o1, o3, etc.
```

#### AnthropicAdapter
```typescript
import { AnthropicAdapter } from 'promptly-ai';

// With environment variable ANTHROPIC_API_KEY
const adapter = new AnthropicAdapter();

// With an explicit API key
const adapterWithKey = new AnthropicAdapter('your-api-key');

// Supports models matching the pattern /^claude-/i
// Examples: claude-3-5-sonnet, claude-opus, etc.
```

#### Custom Adapters
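To see where a custom adapter fits, it helps to picture how a model is routed: each registered adapter is asked `supportsModel()`, and the first match handles the request. A simplified sketch of that selection (assumed behavior for illustration, not the library's source):

```typescript
// Simplified model-to-adapter routing: the first adapter whose pattern
// matches the requested model wins. Patterns mirror the ones listed above.
interface AdapterLike {
  name: string;
  supportsModel(model: string): boolean;
}

const registered: AdapterLike[] = [
  { name: 'openai', supportsModel: (m) => /^gpt-/i.test(m) || /^o\d+/i.test(m) },
  { name: 'anthropic', supportsModel: (m) => /^claude-/i.test(m) },
];

function pickAdapter(model: string): AdapterLike {
  const match = registered.find((a) => a.supportsModel(model));
  if (!match) throw new Error(`No adapter supports model "${model}"`);
  return match;
}
```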
Create custom adapters for other LLM providers:
```typescript
import { LLMAdapter, LLMConfig, LLMUsageResult } from 'promptly-ai';

class CustomAdapter implements LLMAdapter {
  name = 'custom';

  supportsModel(model: string): boolean {
    return model.startsWith('custom-');
  }

  async generate(config: LLMConfig): Promise<LLMUsageResult> {
    // Your implementation here
    return {
      content: 'Generated content',
      provider: this.name,
      model: config.model,
      inputTokens: 100,
      outputTokens: 50,
      costUsd: 0.001
    };
  }
}

promptly.addAdapter(new CustomAdapter());
```

## Validation
Add validation rules for template contexts:
```typescript
promptly.setValidationSchema({
  'greeting': [
    { field: 'name', required: true, type: 'string' },
    { field: 'style', required: true, type: 'string' },
    { field: 'age', type: 'number', validator: (age) => age > 0 }
  ]
});
```

## Handlebars Helpers
Built-in helpers for templates:
```handlebars
{{! Comparison }}
{{#if (eq status "active")}}Active user{{/if}}
{{#if (gt score 80)}}High score!{{/if}}

{{! Logic }}
{{#if (and isLoggedIn hasPermission)}}Welcome{{/if}}
{{#if (or isAdmin isModerator)}}Admin panel{{/if}}

{{! String manipulation }}
{{uppercase name}}
{{capitalize title}}
{{lowercase email}}

{{! Arrays and loops }}
{{#if (gt (length items) 0)}}
Items: {{join items ", "}}
{{/if}}

{{#each items}}
Item {{@index}}: {{this}}
Item {{add @index 1}}: {{this}} {{! 1-based numbering }}
{{/each}}

{{! Math operations }}
{{add 5 3}} {{! Returns 8 }}
{{subtract 10 4}} {{! Returns 6 }}
{{multiply 3 7}} {{! Returns 21 }}
{{divide 15 3}} {{! Returns 5 }}

{{! Type checking }}
{{#if (isString value)}}String value{{/if}}
{{#if (isArray items)}}Array of items{{/if}}

{{! Utilities }}
{{default value "fallback"}}
```

### Advanced Loop Examples
```handlebars
{{! Loop with 1-based indexing }}
{{#each sections}}
Section {{add @index 1}}: {{this.title}}
Content: {{this.content}}
{{/each}}

{{! Nested object access in loops }}
{{#each sectionContext.previousSections}}
[Previous Section {{add @index 1}}]: {{this}}
{{/each}}

{{! Conditional content in loops }}
{{#each users}}
{{#if (eq @index 0)}}First user: {{/if}}
{{name}} ({{add @index 1}} of {{length ../users}})
{{/each}}
```

## Project Integration
### Directory Structure
```
your-project/
├── prompts/
│   ├── greeting.md
│   ├── summarization.md
│   └── analysis.md
├── src/
│   └── llm-service.ts
└── package.json
```

### Service Integration
```typescript
// src/llm-service.ts
import { createPromptly, OpenAIAdapter } from 'promptly-ai';
import path from 'path';

export class LLMService {
  private promptly;

  constructor() {
    this.promptly = createPromptly({
      templatesPath: path.join(process.cwd(), 'prompts'),
      defaultModel: 'gpt-4-turbo',
      adapters: [new OpenAIAdapter()],
    });
  }

  async generateGreeting(name: string, style: string) {
    return this.promptly.generate('greeting', { name, style });
  }

  async summarizeText(text: string, maxLength: number) {
    return this.promptly.generate('summarization', { text, maxLength });
  }
}
```

## Environment Variables
Set up your API keys:
```bash
# .env
OPENAI_API_KEY=your-openai-key
ANTHROPIC_API_KEY=your-anthropic-key
```

## Best Practices
- Organize Templates: Group related templates in subdirectories
- Version Templates: Use the `version` field in frontmatter
- Validate Context: Define validation schemas for important templates
- Cache Management: Clear the cache during development; enable it in production
- Error Handling: Wrap LLM calls in try-catch blocks
- Model Selection: Specify models in templates for consistency
- Testing: Test templates with various context combinations
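For the error-handling practice above, a minimal wrapper sketch; the stubbed `promptly` object here only stands in for a real Promptly instance so the example is self-contained:

```typescript
// Sketch of defensive error handling around a generate call. The stub below
// is a placeholder, not the real Promptly class.
const promptly = {
  async generate(template: string, ctx: Record<string, unknown>) {
    if (typeof ctx.name !== 'string') throw new Error('context validation failed');
    return { content: `Hello, ${ctx.name}!` };
  },
};

async function safeGenerate(
  template: string,
  ctx: Record<string, unknown>,
): Promise<string | null> {
  try {
    const result = await promptly.generate(template, ctx);
    return result.content;
  } catch (err) {
    // Log and return a fallback instead of crashing the caller.
    console.error(`Template "${template}" failed:`, err);
    return null;
  }
}
```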
## Examples

### Multi-turn Conversation
```markdown
---
name: "interview"
model: "gpt-4-turbo"
---

# System Prompt
You are conducting a job interview.

# User Prompt
I'm interviewing for a {{position}} role.

# Assistant Prompt [if needsIntroduction]
Welcome! Let's start with some basic questions.

# User Prompt 2
Tell me about your experience with {{technology}}.

# Assistant Prompt 2 [if (eq difficulty "advanced")]
Let's dive into some advanced topics.

# User Prompt 3 [if hasAdvancedQuestions]
{{advancedQuestion}}
```

### Content Generation Pipeline
```typescript
class ContentPipeline {
  constructor(private promptly: Promptly) {}

  async generateArticle(topic: string) {
    // Step 1: Generate an outline
    const outline = await this.promptly.generate('outline', { topic });

    // Step 2: Generate the introduction
    const intro = await this.promptly.generate('introduction', {
      topic,
      outline: outline.content
    });

    // Step 3: Generate the sections
    const sections = await this.promptly.generate('sections', {
      topic,
      outline: outline.content,
      introduction: intro.content
    });

    return {
      outline: outline.content,
      introduction: intro.content,
      sections: sections.content
    };
  }
}
```

## Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests
- Run linting: `npm run lint`
- Submit a pull request
## License
MIT License - see LICENSE file for details.
