promptmd-cli v1.2.0
# Promptmd
Promptmd is a CLI tool for running, chaining, and looping prompts defined in markdown files with real-time streaming support.
## Features
- 🔗 Chain prompts together with simple syntax
- 📝 Markdown-based prompt definitions with frontmatter
- 🔄 Loop execution with exit conditions
- 🎯 Variable substitution with `{{variable}}` syntax
- 📊 Structured output via YAML frontmatter
- ⚡ Real-time streaming output
- 🔌 Multiple backends: OpenCode, subprocess, or custom implementations
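As a rough illustration of the substitution feature, a `{{variable}}` expander (with dotted paths like `{{input.temperature}}`) might look like this. This is a minimal sketch, not promptmd's actual implementation; the `render` helper is hypothetical:

```python
import re

def render(template: str, variables: dict) -> str:
    """Replace {{name}} and dotted {{name.field}} placeholders.

    Hypothetical helper for illustration only; promptmd's real
    substitution rules may differ.
    """
    def lookup(match: re.Match) -> str:
        # Walk dotted paths through nested dicts, e.g. input.temperature.
        value = variables
        for part in match.group(1).split("."):
            value = value[part]
        return str(value)

    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

print(render("Check the weather in {{city}}", {"city": "Hamburg"}))
# Check the weather in Hamburg
```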
## Quick Start

Install globally:

```bash
npm install -g promptmd-cli
```

### Simple prompt
```bash
echo "Check the weather in Berlin" > weather.md
promd weather
```

### Chaining prompts
```bash
echo "Check the weather in Berlin" > weather.md
echo "Suggest activities in Berlin based on the provided weather forecast: {{input}}" > plan-activities.md
promd weather plan-activities
```

### Use arguments
```bash
echo "Check the weather in {{city}}" > weather.md
echo "Suggest activities in {{city}} based on the provided weather forecast: {{input}}" > plan-activities.md
promd weather plan-activities --city Hamburg
```

### Structured output via frontmatter
`weather.md`:

```markdown
---
output:
  temperature: "the forecasted temperature"
  rain: "will it rain?"
---
Check the weather in Berlin
```

```bash
echo "Suggest activities in Berlin. Temperature: {{input.temperature}} Rain: {{input.rain}}" > plan-activities.md
promd weather plan-activities
```

### Chain all prompts in a directory
```bash
promd .
```

### Run in a loop

```bash
promd loop --count 10 --exitOn "Complete" ./my-workflow
```

## Backends
### OpenCode Backend (Recommended)
Use OpenCode for AI-powered prompt execution with streaming:
```yaml
# .promd
backend: opencode
opencode:
  model: anthropic/claude-sonnet-4
  thinking: true
```

See OPENCODE_BACKEND.md for details.
### Subprocess Backend
Integrate with external tools/scripts:
```yaml
# .promd
backend: subprocess
subprocess:
  command: python
  args: ['run_llm.py']
```

See STREAMING.md for details.
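The exact protocol between promd and the subprocess is documented in STREAMING.md. Purely as an illustration, a script like the `run_llm.py` above might read the prompt from stdin and write its reply to stdout; this is a hypothetical sketch assuming a plain stdin/stdout contract, and the `respond` function is a stand-in for a real model call:

```python
import sys

def respond(prompt: str) -> str:
    """Hypothetical stand-in for a real model or tool call."""
    return f"You asked: {prompt.strip()}"

def main() -> None:
    # Read the full prompt from stdin, write the reply to stdout,
    # flushing so the caller can stream output as it arrives.
    prompt = sys.stdin.read()
    sys.stdout.write(respond(prompt) + "\n")
    sys.stdout.flush()

if __name__ == "__main__":
    main()
```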
### Mock Backend (Default)
For testing and development:
```yaml
# .promd
backend: mock
```

## Documentation
- DEVELOPMENT.md - Setup and development guide
- OPENCODE_BACKEND.md - OpenCode integration
- STREAMING.md - Streaming and custom backends
- examples/ - Example prompts and configurations
## Configuration

Create `~/.promd` or `./.promd`:
```yaml
backend: opencode
opencode:
  model: anthropic/claude-sonnet-4
  format: default
```

Config files are loaded hierarchically:

1. `~/.promd` (global)
2. `../../.promd`, `../.promd` (parent directories)
3. `./.promd` (current directory, highest priority)
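Conceptually, this hierarchy amounts to a merge where later (more local) files win. A sketch of that precedence, illustrative only and not promptmd's actual loader:

```python
def merge_configs(*configs: dict) -> dict:
    """Merge config dicts left to right; later (more local) keys win.

    Illustrative shallow merge; promptmd's real loader may merge
    nested keys differently.
    """
    merged: dict = {}
    for config in configs:
        merged.update(config)
    return merged

global_cfg = {"backend": "mock", "format": "default"}  # e.g. ~/.promd
local_cfg = {"backend": "opencode"}                    # e.g. ./.promd
print(merge_configs(global_cfg, local_cfg))
# {'backend': 'opencode', 'format': 'default'}
```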
## Examples
See the examples/ directory for:
- Basic prompt files
- Chained workflows
- Structured output examples
- Backend integration examples
- Configuration examples
