npm package discovery and stats viewer.

Discover Tips

  • General search

    [free text search, go nuts!]

  • Package details

    pkg:[package-name]

  • User packages

    @[username]

Sponsor

Optimize Toolset

I’ve always been into building performant and accessible sites, but lately I’ve been taking it extremely seriously. So much so that I’ve been building a tool to help me optimize and monitor the sites that I build to make sure that I’m making an attempt to offer the best experience to those who visit them. If you’re into performant, accessible and SEO friendly sites, you might like it too! You can check it out at Optimize Toolset.

About

Hi, 👋, I’m Ryan Hefner  and I built this site for me, and you! The goal of this site was to provide an easy way for me to check the stats on my npm packages, both for prioritizing issues and updates, and to give me a little kick in the pants to keep up on stuff.

As I was building it, I realized that I was actually using the tool to build the tool, and figured I might as well put this out there and hopefully others will find it to be a fast and useful way to search and browse npm packages as I have.

If you’re interested in other things I’m working on, follow me on Twitter or check out the open source projects I’ve been publishing on GitHub.

I am also working on a Twitter bot for this site to tweet the most popular, newest, random packages from npm. Please follow that account now and it will start sending out packages soon–ish.

Open Software & Tools

This site wouldn’t be possible without the immense generosity and tireless efforts from the people who make contributions to the world and share their work via open source initiatives. Thank you 🙏

© 2026 – Pkg Stats / Ryan Hefner

type-prompt

v1.0.0

Published

A TypeScript library for LLM prompt templating with support for chat messages, function calling, and vision models

Readme

Type Prompt 🎭

A powerful TypeScript library for LLM prompt templating, inspired by Banks

npm version License: MIT PRs Welcome


🚀 Features

  • 📝 Template rendering - Create prompts using the Nunjucks templating engine
  • 💬 Chat messages - Easily generate chat-based prompts for modern LLMs
  • 🔧 Filters and extensions - Apply transformations to your prompt content
  • Caching - Efficiently render prompts by avoiding redundant processing
  • 🛠️ Tool calling - First-class support for function calling in LLMs
  • 🖼️ Vision support - Add images to prompts for multimodal models

📦 Installation

npm install type-prompt

🎯 Quick Start

Basic Prompt

import { Prompt } from "type-prompt";

const p = new Prompt("Write a 500-word blog post on {{ topic }}.");
console.log(p.text({ topic: "AI frameworks" }));

📚 Examples

Chat Messages

import { Prompt } from "type-prompt";

const p = new Prompt(`
{% chat role="system" %}
You are a {{ persona }}.
{% endchat %}

{% chat role="user" %}
Hello, how are you?
{% endchat %}
`);

const messages = p.chatMessages({ persona: "helpful assistant" });
// Output:
// [
//   { role: 'system', content: 'You are a helpful assistant.' },
//   { role: 'user', content: 'Hello, how are you?' }
// ]

Prompt Caching (for Anthropic)

import { Prompt } from "type-prompt";

const p = new Prompt(`
{% chat role="user" %}
Analyze this book:

{{ book | cache_control("ephemeral") }}

What is the title of this book? Only output the title.
{% endchat %}
`);

const messages = p.chatMessages({ book: "This is a short book!" });
// The book content will be wrapped in a special content block with cache_control

Function Calling

import { Prompt } from "type-prompt";

function getLaptopInfo() {
  /**
   * Get information about the user laptop.
   */
  return "MacBook Pro, macOS 12.3";
}

const p = new Prompt(`
{% chat role="user" %}
{{ query }}
{{ getLaptopInfo | tool }}
{% endchat %}
`);

const messages = p.chatMessages({
  query: "Can you guess the name of my laptop?",
  getLaptopInfo,
});
// The tool will be properly formatted for LLM function calling

�� API Reference

Prompt

The main class for creating and rendering prompts.

new Prompt(template: string, options?: {
  name?: string;
  version?: string;
  metadata?: Record<string, any>;
  canaryWord?: string;
  renderCache?: RenderCache;
})

Methods

  • text(data?: Record<string, any>): Render the prompt as plain text
  • chatMessages(data?: Record<string, any>): Render the prompt as an array of chat messages
  • canaryLeaked(text: string): Check if a canary word has leaked

AsyncPrompt

An asynchronous version of the Prompt class with the same API but providing Promise-based methods.

new AsyncPrompt(template: string, options?: {
  name?: string;
  version?: string;
  metadata?: Record<string, any>;
  canaryWord?: string;
  renderCache?: RenderCache;
})

Methods

  • text(data?: Record<string, any>): Returns a Promise that resolves to the rendered text
  • chatMessages(data?: Record<string, any>): Returns a Promise that resolves to an array of chat messages
  • canaryLeaked(text: string): Check if a canary word has leaked

Filters

  • cache_control(text: string, cacheType: string = "ephemeral"): Mark text for caching
  • image(source: string): Include an image in the prompt
  • tool(function: Function): Convert a function to a tool for function calling

Extensions

  • chat: Define a chat message block
  • completion: Generate text using an LLM during template rendering

🤝 Contributing

Contributions, issues, and feature requests are welcome! Feel free to check the issues page.

📝 License

This project is MIT licensed.