
prompt-debug

v1.0.7 · Published

🔬 Open-source Prompt Debugger for LLMs — token counter, cost tracker (₹/$), streaming, multi-provider (Groq, OpenAI, Together AI)

Downloads: 644

Readme

⬡ prompt-debug

🔬 Chrome DevTools, but for AI Prompts — Debug, Analyze & Compare LLM prompts in real-time



🤔 The Problem

Every AI developer faces this:

```
Write prompt → Run → Output is wrong
       ↓
"Why did this happen?" — No idea
       ↓
Guess → Change → Run again → Still wrong
       ↓
Wasted time + Wasted money + Frustration 😤
```

There are no proper debugging tools for LLM prompts.

  • You don't know how many tokens your prompt uses
  • You don't know how much it costs per request
  • You can't compare two prompt versions side by side
  • You can't track your experiments over a session

prompt-debug solves all of this.


✨ Features

| Feature | Description |
|---|---|
| ⚡ Real-time Streaming | See responses word-by-word, just like ChatGPT |
| 🔀 Token Counter | Exact input + output token counts per request |
| 💸 Cost Tracker | Cost in both ₹ INR and $ USD per request |
| ⏱ Latency Tracker | Response time in milliseconds |
| ⇌ Prompt Compare | A/B test two prompts side by side |
| 🔥 Prompt Heatmap | Visualize which words have the most impact |
| 📋 Session History | All your runs saved with full details |
| 🌍 Multi-Provider | Groq, OpenAI, Together AI — switch instantly |
| 🔌 WebSocket | Real-time connection, no page refresh needed |
| 🔐 Secure | API keys stored locally in browser, never on any server |
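Exact token counts come from the provider's API response (OpenAI-compatible APIs report them in a `usage` field). If you want a rough client-side estimate before sending, a common heuristic is about 4 characters per token for English text. This helper is an illustrative sketch, not prompt-debug's actual tokenizer:

```javascript
// Rough client-side token estimate: ~4 characters per token for English text.
// This is a heuristic, NOT the tokenizer any provider actually uses; exact
// counts come back in the API response's `usage` field.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

console.log(estimateTokens("Explain React hooks in simple terms")); // → 9
```

The heuristic is only good for ballpark budgeting; always trust the counts the provider returns.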


🚀 Quick Start

```bash
# Install globally
npm install -g prompt-debug

# Start the debugger
prompt-debug
```

Browser opens automatically at http://localhost:3000 🎉

That's it. No config files. No setup. Just run and debug.


📦 Installation

Global (Recommended)

```bash
npm install -g prompt-debug
```

Local (Project-specific)

```bash
npm install prompt-debug
npx prompt-debug
```

Requirements

  • Node.js >= 18.0.0
  • A free API key from any supported provider

🎯 Usage

```bash
# Start on default port 3000
prompt-debug

# Start on custom port
prompt-debug --port 8080

# Start without opening browser
prompt-debug --no-open

# Show version
prompt-debug --version

# Show help
prompt-debug --help
```

Use as a Node.js Module

```javascript
import { startServer } from "prompt-debug";

startServer({
  port: 3000,
  openBrowser: true,
});
```

🔑 Supported Providers

| Provider | Free Tier | Get API Key |
|---|---|---|
| ⚡ Groq | ✅ Yes — Very generous | console.groq.com/keys |
| ◎ OpenAI | ❌ Paid | platform.openai.com/api-keys |
| ⬡ Together AI | ✅ Yes — $25 free credits | api.together.ai |

💡 Recommended for beginners: Start with Groq — it's completely free and blazing fast!
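Instant provider switching is possible because all three providers expose OpenAI-compatible chat-completions endpoints. A minimal sketch of how a backend might route a request (the helper is my illustration of the technique, not prompt-debug's actual internals):

```javascript
// Map each provider to its OpenAI-compatible chat-completions endpoint.
// The URLs are the providers' public base endpoints; the routing helper
// itself is an illustrative sketch, not prompt-debug's real code.
const ENDPOINTS = {
  groq: "https://api.groq.com/openai/v1/chat/completions",
  openai: "https://api.openai.com/v1/chat/completions",
  together: "https://api.together.xyz/v1/chat/completions",
};

function buildRequest(provider, apiKey, model, prompt) {
  return {
    url: ENDPOINTS[provider],
    options: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model,
        messages: [{ role: "user", content: prompt }],
        stream: true, // stream tokens back instead of waiting for the full reply
      }),
    },
  };
}
```

Because the request shape is identical across providers, switching is just a different URL and API key.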


🤖 Supported Models

Groq (Free ⚡)

  • llama-3.3-70b-versatile — Best quality
  • llama-3.1-8b-instant — Fastest
  • gemma2-9b-it — Google's Gemma
  • mixtral-8x7b-32768 — Great for code

OpenAI

  • gpt-4o — Most capable
  • gpt-4o-mini — Fast & cheap
  • gpt-3.5-turbo — Classic
  • o1, o1-mini — Reasoning models

Together AI (Free tier)

  • meta-llama/Llama-3-70b-chat-hf
  • mistralai/Mixtral-8x7B-Instruct-v0.1
  • 100+ open-source models

🔄 Models are auto-fetched from your account when you enter your API key — you always get the latest available models!
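On OpenAI-compatible APIs, the model list typically comes from a `GET /models` request that returns `{ "data": [{ "id": "…" }, …] }`. How prompt-debug does this internally is an assumption on my part, but extracting the ids from such a response looks like this:

```javascript
// Extract model ids from an OpenAI-style `GET /models` response,
// e.g. { object: "list", data: [{ id: "gpt-4o" }, ...] }.
// How prompt-debug fetches models internally is assumed, not confirmed.
function extractModelIds(listResponse) {
  return (listResponse.data || []).map((m) => m.id).sort();
}

// Example payload shaped like the OpenAI-compatible list endpoint:
const sample = {
  object: "list",
  data: [{ id: "llama-3.3-70b-versatile" }, { id: "llama-3.1-8b-instant" }],
};
console.log(extractModelIds(sample));
// → ["llama-3.1-8b-instant", "llama-3.3-70b-versatile"]
```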


💰 Cost Tracking

prompt-debug shows you the exact cost of every request:

```
Prompt: "Explain React hooks in simple terms"
Model:  llama-3.3-70b-versatile (Groq)

Input tokens:  12      → ₹0.0000
Output tokens: 284     → ₹0.0019
Total cost:    ₹0.0019 ($0.000023)
Latency:       342ms
```

Monthly estimate example:

```
10 users × 50 requests/day × 30 days = 15,000 requests
Average cost per request: ₹0.002
Monthly total: ₹30 (~$0.36)
```

Now you can budget your AI features properly! 💡
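The monthly estimate above is plain multiplication, so it's easy to reproduce for your own traffic numbers (the ₹83-per-USD exchange rate here is my assumption):

```javascript
// Reproduce the monthly-estimate arithmetic from the example above.
const users = 10;
const requestsPerUserPerDay = 50;
const days = 30;
const costPerRequestINR = 0.002; // average cost per request, from the example
const inrPerUsd = 83;            // assumed exchange rate

const totalRequests = users * requestsPerUserPerDay * days; // 15,000
const monthlyINR = totalRequests * costPerRequestINR;       // ₹30
const monthlyUSD = monthlyINR / inrPerUsd;                  // ~$0.36

console.log(totalRequests, monthlyINR.toFixed(2), monthlyUSD.toFixed(2));
// → 15000 30.00 0.36
```

Swap in your own user counts and per-request averages to budget a feature before shipping it.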


🏗️ Architecture

```
Browser (React UI)
      ↕  WebSocket (real-time)
Node.js Backend (Express)
      ↕  HTTPS
AI Provider (Groq / OpenAI / Together)
```

Why WebSocket?

| Regular HTTP | WebSocket |
|---|---|
| Wait for full response | Stream word by word |
| One request, one response | Persistent connection |
| Slow feel | Real-time feel ✅ |
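Over the WebSocket, a streamed response typically arrives as many small messages that the UI appends as they come. Here is a sketch of that client-side accumulation, using a hypothetical `{ type: "token", text }` message shape (the actual wire format used by prompt-debug may differ):

```javascript
// Accumulate a stream of token messages into the full response text.
// The { type: "token", text } message shape is a hypothetical example,
// not necessarily prompt-debug's real WebSocket protocol.
function accumulate(messages) {
  let full = "";
  let tokenCount = 0;
  for (const msg of messages) {
    if (msg.type === "token") {
      full += msg.text;
      tokenCount += 1;
    }
  }
  return { full, tokenCount };
}

const stream = [
  { type: "token", text: "React " },
  { type: "token", text: "hooks " },
  { type: "token", text: "let you..." },
  { type: "done" }, // non-token messages (e.g. completion signal) are skipped
];
console.log(accumulate(stream));
// → { full: "React hooks let you...", tokenCount: 3 }
```

This is why streaming "feels" fast: the first words render long before the full response is finished.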


📁 Project Structure

```
prompt-debug/
├── bin/
│   └── cli.js          ← CLI entry point (prompt-debug command)
├── index.js            ← Server + WebSocket + All providers
├── package.json
└── README.md
```

🛠️ Development

```bash
# Clone the repo
git clone https://github.com/yourusername/prompt-debug
cd prompt-debug

# Install dependencies
npm install

# Start in dev mode (auto-restart on changes)
npm run dev

# Build frontend
cd frontend && npm run build
```

🀝 Contributing

Contributions are welcome! Here's how:

  1. Fork the repository
  2. Create a feature branch β€” git checkout -b feature/amazing-feature
  3. Commit your changes β€” git commit -m "Add amazing feature"
  4. Push to the branch β€” git push origin feature/amazing-feature
  5. Open a Pull Request

Ideas for Contributions

  • [ ] Anthropic Claude support
  • [ ] Cohere support
  • [ ] Export history as CSV/JSON
  • [ ] Real token heatmap using logprobs
  • [ ] Dark/Light theme toggle
  • [ ] Prompt templates library
  • [ ] Monthly cost dashboard with charts
  • [ ] Docker support

🐛 Bug Reports

Found a bug? Open an issue with:

  • Your Node.js version (node --version)
  • Your OS (Windows/Mac/Linux)
  • Steps to reproduce
  • Expected vs actual behavior

📊 Comparison with Other Tools

| Feature | prompt-debug | LangSmith | OpenAI Playground | Postman |
|---|---|---|---|---|
| Free | ✅ 100% Free | ❌ Paid | ❌ Limited | ❌ Limited |
| Open Source | ✅ | ❌ | ❌ | ❌ |
| Groq Support | ✅ | ❌ | ❌ | ✅ |
| INR Cost | ✅ | ❌ | ❌ | ❌ |
| Streaming | ✅ | ✅ | ✅ | ❌ |
| Prompt Compare | ✅ | ✅ | ❌ | ❌ |
| Local Install | ✅ | ❌ | ❌ | ✅ |
| Setup Time | 30 sec | 10 min | Instant | 5 min |


📜 Changelog

v1.0.0 (2024)

  • 🎉 Initial release
  • ✅ Groq, OpenAI, Together AI support
  • ✅ Real-time WebSocket streaming
  • ✅ Token counter + Cost tracker (₹ + $)
  • ✅ Prompt A/B comparison
  • ✅ Session history
  • ✅ Prompt heatmap

📄 License

MIT © Ankur Ojha

Free to use, modify, and distribute. See LICENSE for details.


⭐ Support

If this project helped you, please consider:

  • ⭐ Starring the repo on GitHub
  • 🐦 Sharing on Twitter/LinkedIn
  • 🐛 Reporting bugs and issues
  • 🤝 Contributing new features

👨‍💻 Author

Built with ❤️ by Ankur Ojha

"Built an open-source Prompt Debugger for LLMs with real-time token analysis, INR cost tracking, WebSocket streaming, and multi-provider support (Groq/OpenAI/Together AI)"