textprompts
So simple it's not even worth vibe coding, yet it just makes so much sense.
TypeScript/JavaScript companion to textprompts for loading and formatting prompt files.
Are you tired of vendors trying to sell you fancy UIs for prompt management that just make your system more confusing and harder to debug? Isn't it nice to just have your prompts next to your code?
But then you worry: Did my formatter change my prompt? Are those spaces at the beginning actually part of the prompt or just indentation?
textprompts solves this elegantly: treat your prompts as text files and keep your linters and formatters away from them. And you get prompt metadata headers for free!
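For example, if you keep prompts in a prompts/ directory and use Prettier (an assumption; other formatters have equivalent ignore files), you can exclude them from formatting:
# .prettierignore (hypothetical setup; adapt to your formatter)
prompts/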
Why textprompts?
- ✅ Prompts live next to your code - no external systems to manage
- ✅ Git is your version control - diff, branch, and experiment with ease
- ✅ No formatter headaches - your prompts stay exactly as you wrote them
- ✅ Minimal markup - just TOML front-matter when you need metadata (or no metadata if you prefer!)
- ✅ Lightweight dependencies - minimal footprint with just fast-glob and a TOML parser
- ✅ Safe formatting - catch missing variables before they cause problems
- ✅ Works with everything - OpenAI, Anthropic, local models, function calls
- ✅ Node.js & Bun compatible - works seamlessly with both runtimes
- ✅ Dual ESM/CJS build support - works with both module systems
Installation
# With npm
npm install textprompts
# With Bun
bun add textprompts
# With pnpm
pnpm add textprompts
Quick Start
Super simple by default - TextPrompts just loads text files with optional metadata:
Loading from Files
- Create a prompt file (greeting.txt):
---
title = "Customer Greeting"
version = "1.0.0"
description = "Friendly greeting for customer support"
---
Hello {customer_name}!
Welcome to {company_name}. We're here to help you with {issue_type}.
Best regards,
{agent_name}
- Load and use it (no configuration needed):
import { loadPrompt, Prompt } from "textprompts";
// Just load it - works with or without metadata
const prompt = await loadPrompt("greeting.txt");
// Or use the static method
const alt = await Prompt.fromPath("greeting.txt");
// Use it safely - all placeholders must be provided
const message = prompt.prompt.format({
customer_name: "Alice",
company_name: "ACME Corp",
issue_type: "billing question",
agent_name: "Sarah"
});
console.log(message);
// Or use partial formatting when needed
const partial = prompt.prompt.format(
{ customer_name: "Alice", company_name: "ACME Corp" },
{ skipValidation: true }
);
// Result: "Hello Alice!\n\nWelcome to ACME Corp. We're here to help you with {issue_type}.\n\nBest regards,\n{agent_name}"
// Prompt objects expose `.meta` and `.prompt`.
// Use `prompt.prompt.format()` for safe formatting or `String(prompt)` for raw text.
Even simpler - no metadata required:
// simple_prompt.txt contains just: "Analyze this data: {data}"
const prompt = await loadPrompt("simple_prompt.txt"); // Just works!
const result = prompt.prompt.format({ data: "sales figures" });
Loading from Strings (for Bundlers)
Problem: Modern bundlers (Vite, Webpack, Rollup) often don't include .txt files in your bundle by default.
Solution: Load prompts directly from strings using Prompt.fromString():
import { Prompt } from "textprompts";
// Vite: Use ?raw suffix to import as string
import greetingContent from "./greeting.txt?raw";
// Or with Webpack using raw-loader
// import greetingContent from "raw-loader!./greeting.txt";
// Load from the string content
const prompt = Prompt.fromString(greetingContent);
// Works identically to file-based loading
const message = prompt.format({
customer_name: "Alice",
company_name: "ACME Corp",
issue_type: "billing question",
agent_name: "Sarah"
});
With metadata support:
import promptContent from "./system-prompt.txt?raw";
// The ?raw import includes TOML front-matter if present
const prompt = Prompt.fromString(promptContent, {
meta: "allow", // or MetadataMode.ALLOW
path: "system-prompt.txt" // Optional: for better error messages
});
console.log(prompt.meta?.title); // Access metadata
console.log(prompt.meta?.version); // Works like fromPath
When to use fromString vs fromPath:
- Use fromPath() for Node.js/Bun server-side code
- Use fromString() for bundled frontend code (Vite, Webpack, etc.)
- Use fromString() when loading prompts from APIs or databases (see the sketch below)
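A minimal sketch of that last case, loading prompt text over HTTP (the URL is hypothetical):
import { Prompt } from "textprompts";

// Fetch raw prompt text from a hypothetical endpoint
const res = await fetch("https://example.com/prompts/greeting.txt");
const prompt = Prompt.fromString(await res.text(), { path: "greeting.txt" });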
Core Features
Safe String Formatting
Never ship a prompt with missing variables again:
import { PromptString } from "textprompts";
const template = new PromptString("Hello {name}, your order {order_id} is {status}");
// ✅ Strict formatting - all placeholders must be provided
const result = template.format({ name: "Alice", order_id: "12345", status: "shipped" });
// ❌ This catches the error by default
try {
template.format({ name: "Alice" }); // Missing order_id and status
} catch (error) {
console.error(error.message); // Missing format variables: ["order_id", "status"]
}
// ✅ Partial formatting - replace only what you have
const partial = template.format(
{ name: "Alice" },
{ skipValidation: true }
);
console.log(partial); // "Hello Alice, your order {order_id} is {status}"
Bulk Loading
Load entire directories of prompts:
import { loadPrompts } from "textprompts";
// Load all prompts from a directory
const prompts = await loadPrompts("prompts/", { recursive: true });
// Create a lookup
const promptMap = new Map(
prompts.map(p => [p.meta?.title ?? 'Untitled', p])
);
const greeting = promptMap.get("Customer Greeting");
Simple & Flexible Metadata Handling
TextPrompts is designed to be super simple by default - just load text files with optional metadata when available. No configuration needed!
import { loadPrompt, setMetadata, MetadataMode } from "textprompts";
// Default behavior: load metadata if available, otherwise just use the file content
const prompt = await loadPrompt("my_prompt.txt"); // Just works!
// Three modes available for different use cases:
// 1. ALLOW (default): Load metadata if present, don't worry if it's incomplete
setMetadata(MetadataMode.ALLOW); // Flexible metadata loading (default)
const flexible = await loadPrompt("prompt.txt"); // Loads any metadata found
// 2. IGNORE: Treat as simple text file, use filename as title
setMetadata(MetadataMode.IGNORE); // Super simple file loading
const simple = await loadPrompt("prompt.txt"); // No metadata parsing
console.log(simple.meta?.title); // "prompt" (from filename)
// 3. STRICT: Require complete metadata for production use
setMetadata(MetadataMode.STRICT); // Prevent errors in production
const strict = await loadPrompt("prompt.txt"); // Must have title, description, version
// Override per prompt when needed
const override = await loadPrompt("prompt.txt", { meta: "strict" });
Why this design?
- Default = Flexible: Parse metadata if present, no friction if absent
- No configuration needed: Just load files and it works
- Production-Safe: Use strict mode to catch missing metadata before deployment
Real-World Examples
OpenAI Integration
import OpenAI from "openai";
import { loadPrompt } from "textprompts";
const systemPrompt = await loadPrompt("prompts/system.txt");
const client = new OpenAI();
const response = await client.chat.completions.create({
model: "gpt-5-mini",
messages: [
{
role: "system",
content: systemPrompt.prompt.format({
company_name: "ACME Corp",
tone: "professional"
})
},
{ role: "user", content: "Hello!" }
]
});
Vercel AI SDK Integration
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { loadPrompt } from "textprompts";
const systemPrompt = await loadPrompt("prompts/system.txt");
const result = streamText({
model: openai('gpt-5-mini'),
messages: [
{
role: 'system',
content: systemPrompt.prompt.format({
company_name: "ACME Corp",
tone: "friendly"
})
},
{ role: 'user', content: 'Hello!' }
]
});
for await (const delta of result.textStream) {
process.stdout.write(delta);
}
Anthropic Claude Integration
import Anthropic from "@anthropic-ai/sdk";
import { loadPrompt } from "textprompts";
const systemPrompt = await loadPrompt("prompts/system.txt");
const anthropic = new Anthropic();
const message = await anthropic.messages.create({
model: "claude-3-5-sonnet-20241022",
max_tokens: 1024,
system: systemPrompt.prompt.format({
company_name: "ACME Corp",
tone: "professional"
}),
messages: [
{ role: "user", content: "Hello!" }
]
});
Environment-Specific Prompts
import { loadPrompt } from "textprompts";
const env = process.env.NODE_ENV || "development";
const systemPrompt = await loadPrompt(`prompts/${env}/system.txt`);
// prompts/development/system.txt - verbose logging
// prompts/production/system.txt - concise responses
Prompt Versioning & Experimentation
import { loadPrompt } from "textprompts";
// Easy A/B testing
const promptVersion = "v2"; // or "v1", "experimental", etc.
const prompt = await loadPrompt(`prompts/${promptVersion}/system.txt`);
// Git handles the rest:
// git checkout experiment-branch
// git diff main -- prompts/
File Format
TextPrompts uses TOML front-matter (optional) followed by your prompt content:
---
title = "My Prompt"
version = "1.0.0"
author = "Your Name"
description = "What this prompt does"
created = "2024-01-15"
---
Your prompt content goes here.
Use {variables} for templating.
Metadata Modes
Choose the right level of strictness for your use case:
- ALLOW (default) - Load metadata if present, don't worry about completeness
- IGNORE - Simple text file loading, filename becomes title
- STRICT - Require complete metadata (title, description, version) for production safety
You can also set the environment variable TEXTPROMPTS_METADATA_MODE to one of
strict, allow, or ignore before importing the library to configure the
default mode.
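For example (a shell sketch; app.js stands in for your entry point):
TEXTPROMPTS_METADATA_MODE=strict node app.js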
import { setMetadata, MetadataMode } from "textprompts";
// Set globally
setMetadata(MetadataMode.ALLOW); // Default: flexible metadata loading
setMetadata(MetadataMode.IGNORE); // Simple: no metadata parsing
setMetadata(MetadataMode.STRICT); // Production: require complete metadata
// Or override per prompt
const prompt = await loadPrompt("file.txt", { meta: "strict" });API Reference
loadPrompt(path, options?)
Load a single prompt file.
async function loadPrompt(
path: string,
options?: {
meta?: MetadataMode | string | null;
}
): Promise<Prompt>
- path: Path to the prompt file
- meta: Metadata handling mode - MetadataMode.STRICT, MetadataMode.ALLOW, MetadataMode.IGNORE, or string equivalents. null uses the global config.
Returns a Prompt object with:
- prompt.meta: Metadata from TOML front-matter (always present)
- prompt.prompt: The prompt content as a PromptString
- prompt.path: Path to the original file
loadPrompts(paths, options?)
Load multiple prompts from files or directories.
async function loadPrompts(
paths: string | string[],
options?: {
recursive?: boolean;
glob?: string;
meta?: MetadataMode | string | null;
maxFiles?: number | null;
}
): Promise<Prompt[]>
- paths: File or directory path(s) to load
- recursive: Search directories recursively (default: false)
- glob: File pattern to match (default: "*.txt")
- meta: Metadata handling mode
- maxFiles: Maximum files to process (default: 1000)
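A sketch combining these options (the paths are illustrative):
import { loadPrompts } from "textprompts";

// Mix a directory and a single file, search subdirectories,
// and cap the number of files processed
const prompts = await loadPrompts(["prompts/", "extra/system.txt"], {
  recursive: true,
  glob: "*.txt",
  maxFiles: 100,
});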
setMetadata(mode) / getMetadata()
Set or get the global metadata handling mode.
function setMetadata(mode: MetadataMode | string): void
function getMetadata(): MetadataMode
- mode: MetadataMode.STRICT, MetadataMode.ALLOW, MetadataMode.IGNORE, or string equivalents
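For instance, reading the global mode back after setting it:
import { setMetadata, getMetadata, MetadataMode } from "textprompts";

setMetadata(MetadataMode.STRICT);
console.log(getMetadata()); // MetadataMode.STRICT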
savePrompt(path, content)
Save a prompt to a file.
async function savePrompt(
path: string,
content: string | Prompt
): Promise<void>
- path: Path to save the prompt file
- content: Either a string (creates a template with required fields) or a Prompt object
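A short sketch of both call forms (file names are illustrative):
import { loadPrompt, savePrompt } from "textprompts";

// From a raw string - per the docs, a template with required fields is created
await savePrompt("prompts/draft.txt", "Summarize the following: {text}");

// From an existing Prompt object
const prompt = await loadPrompt("prompts/draft.txt");
await savePrompt("prompts/draft-copy.txt", prompt);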
PromptString
A string wrapper that validates format() calls:
class PromptString {
readonly value: string;
readonly placeholders: Set<string>;
constructor(value: string);
format(options?: FormatOptions): string;
format(args: unknown[], kwargs?: Record<string, unknown>, options?: FormatCallOptions): string;
toString(): string;
valueOf(): string;
strip(): string;
slice(start?: number, end?: number): string;
get length(): number;
}
interface FormatOptions {
args?: unknown[];
kwargs?: Record<string, unknown>;
skipValidation?: boolean;
}
Examples:
import { PromptString } from "textprompts";
const template = new PromptString("Hello {name}, you are {role}");
// Strict formatting (default) - all placeholders required
const result = template.format({ name: "Alice", role: "admin" }); // ✅ Works
// template.format({ name: "Alice" }); // ❌ Throws Error
// Partial formatting - replace only available placeholders
const partial = template.format(
{ name: "Alice" },
{ skipValidation: true }
); // ✅ "Hello Alice, you are {role}"
// Access placeholder information
console.log([...template.placeholders]); // ['name', 'role']
Prompt
The main prompt object:
class Prompt {
readonly path: string;
readonly meta: PromptMeta | null;
readonly prompt: PromptString;
static async fromPath(path: string, options?: { meta?: MetadataMode | string | null }): Promise<Prompt>;
static fromString(content: string, options?: { path?: string; meta?: MetadataMode | string | null }): Prompt;
toString(): string;
valueOf(): string;
strip(): string;
format(options?: FormatOptions): string;
format(args: unknown[], kwargs?: Record<string, unknown>, options?: FormatCallOptions): string;
get length(): number;
slice(start?: number, end?: number): string;
}
interface PromptMeta {
title?: string | null;
version?: string | null;
author?: string | null;
created?: string | null;
description?: string | null;
}
Prompt.fromString(content, options?)
Load a prompt from a string (useful for bundlers):
static fromString(
content: string,
options?: {
path?: string; // Optional path for metadata/error messages (default: "<string>")
meta?: MetadataMode | string | null; // Metadata mode (default: global config)
}
): Prompt
- content: String containing the prompt (may include TOML front-matter)
- path: Optional path for better error messages and metadata extraction (defaults to "<string>")
- meta: Metadata handling mode (same as fromPath)
Returns a Prompt object with the same structure as fromPath.
Examples:
import { Prompt, MetadataMode } from "textprompts";
// Vite raw import of the file content
import content from "./prompt.txt?raw";

// Simple usage
const simple = Prompt.fromString("Hello {name}!");
// Pass the original path for clearer error messages
const fromFile = Prompt.fromString(content, { path: "prompt.txt" });
// With strict metadata validation
const strict = Prompt.fromString(content, { meta: MetadataMode.STRICT });
Error Handling
TextPrompts provides specific exception types:
import {
TextPromptsError, // Base exception
FileMissingError, // File not found
MissingMetadataError, // No TOML front-matter when required
InvalidMetadataError, // Invalid TOML syntax
MalformedHeaderError, // Malformed front-matter structure
} from "textprompts";Best Practices
Best Practices
Organize by purpose: Group related prompts in folders
prompts/
├── customer-support/
├── content-generation/
└── code-review/
Use semantic versioning: Version your prompts like code
version = "1.2.0" # major.minor.patch
Document your variables: List expected variables in descriptions
description = "Requires: customer_name, issue_type, agent_name"
Test your prompts: Write unit tests for critical prompts
import { test, expect } from "bun:test";
import { loadPrompt } from "textprompts";

test("greeting prompt formats correctly", async () => {
  const prompt = await loadPrompt("greeting.txt");
  const result = prompt.prompt.format({
    customer_name: "Test",
    company_name: "Test Corp",
    issue_type: "test",
    agent_name: "Bot"
  });
  expect(result).toContain("Test");
});
Use environment-specific prompts: Different prompts for dev/prod
const env = process.env.NODE_ENV || "development";
const prompt = await loadPrompt(`prompts/${env}/system.txt`);
Why Not Just Use Template Strings?
You could, but then you lose:
- Metadata tracking (versions, authors, descriptions)
- Safe formatting (catch missing variables)
- Organized storage (searchable, documentable)
- Version control benefits (proper diffs, blame, history)
- Tooling support (CLI, validation, testing)
Examples
See the examples/ directory for complete, runnable examples:
- basic-usage.ts - Core functionality demo
- fromstring-example.ts - Loading from strings for bundlers
- simple-format-demo.ts - PromptString features
- openai-example.ts - OpenAI integration
- aisdk-example.ts - Vercel AI SDK streaming chat
Run them with:
bun examples/basic-usage.ts
bun examples/fromstring-example.ts
bun examples/openai-example.ts
bun examples/aisdk-example.ts
Documentation
Full documentation is available in the docs/ directory.
License
MIT License - see LICENSE for details.
textprompts - Because your prompts deserve better than being buried in code strings. 🚀
