`reddit-demand-kit` v0.2.2 — MCP-first demand intelligence for Reddit. Discover pain points, track competitors, find what's worth building.
# Reddit Demand Kit

> "The gap isn't finding threads. It's knowing what to say when you get there. Most tools stop at step 1." — r/SaaS, 2026

RDK is the step-2 tool: MCP-native demand intelligence that lets your AI agent do the Reddit research, classification, and outreach planning — without opening another dashboard tab.
## Why This Exists
In November 2025, GummySearch — the category king with 135,000 users and 10,000 paying customers — died overnight when Reddit revoked their commercial API access. Four years of work, gone in a weekend. Every replacement tool since has been built on the same fragile foundation: a Reddit API key that can be pulled at any time.
RDK takes the opposite bet: RSS feeds. A 20-year open standard that needs no authentication, no API keys, no approval, and cannot be revoked. The same architecture that let blogs survive every platform shift is now the safest way to build on Reddit.
Four more things that make RDK different from the 30+ tools fighting over GummySearch's corpse:
- MCP-native, not a dashboard. Your AI agent runs RDK inside the conversation you're already having about your product. Zero context switches.
- Code does plumbing. Prompts do thinking. Tools fetch raw data. Analysis frameworks — demand scoring, competitor sentiment, market scans, outreach playbooks — are MCP prompts your agent executes with your context. No keyword-matching pseudo-AI.
- Free tier that actually works. RSS gives you content. Pro ($9/mo) adds real upvote scores, comment counts, and full comments via the Tap browser bridge — a second structural moat competitors would need years to build.
- Proprietary, on purpose. Code is compiled and obfuscated (the prompts are the real IP). Methodology is published freely. Read the long-form posts to learn how to think; install the tool when you want your agent to do it automatically.
## Install

```bash
# Node / npm
npm install -g reddit-demand-kit

# Deno 2.0+
deno install -g --allow-all -n rdk npm:reddit-demand-kit
```

Requires Node.js 16+ or Deno 2.0+. Deno users install from the same npm package — Deno's npm compatibility layer resolves the platform binary automatically; no separate registry needed.
## MCP Setup

Add to your Claude Desktop / Claude Code config:

```json
{
  "mcpServers": {
    "rdk": {
      "command": "rdk",
      "args": ["mcp"]
    }
  }
}
```

Then ask your AI: "Search Reddit for pain points about CRM tools and score which ones are worth building for."
## CLI

```bash
rdk mcp              # Start MCP server (stdio)
rdk status           # Show tier + data source
rdk activate <key>   # Activate Pro license
rdk upgrade          # Show Pro features + pricing
```

## MCP Tools — Code Does Plumbing
Tools fetch raw Reddit data with objective filters only (`must_contain`, `exclude`). Subjective analysis is done by prompts, not by keyword matching.
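Conceptually, the objective filters are deterministic substring checks over fetched posts — no scoring, no NLP. A minimal illustrative sketch (not RDK's actual code; the types and field names are assumptions):

```typescript
// Illustrative only: deterministic filtering of fetched posts.
// Subjective judgment is deliberately left to the MCP prompts.
interface Post {
  title: string;
  body: string;
}

function objectiveFilter(
  posts: Post[],
  mustContain: string[],
  exclude: string[],
): Post[] {
  return posts.filter((p) => {
    const text = `${p.title} ${p.body}`.toLowerCase();
    return mustContain.every((t) => text.includes(t.toLowerCase())) &&
      exclude.every((t) => !text.includes(t.toLowerCase()));
  });
}

const posts: Post[] = [
  { title: "CRM pricing is killing us", body: "Looking for alternatives" },
  { title: "Hiring a CRM admin", body: "Job posting" },
];
const hits = objectiveFilter(posts, ["crm"], ["hiring"]);
// hits keeps only the pricing complaint, not the job posting
```

Because the filters are objective, a run is reproducible: the same query always yields the same rows, which is what makes the later pipe compilation step possible.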
### reddit_search

Search Reddit posts with exact-phrase filtering and term exclusion. The Free tier returns content; Pro adds real upvote scores and comment counts.

### reddit_post

Fetch a single post with its comments. Use for comment archaeology — the highest-signal layer for demand analysis.

### reddit_subreddit

Fetch subreddit metadata and recent posts. Survey a community before engaging.

### reddit_discover

Find subreddits related to a topic or audience. Use before deep research to confirm your target users actually hang out there.
### reddit_account_health (Pro)

Diagnose a Reddit account's standing with Reddit's site-wide anti-spam system before you promote anything from it. Returns a verdict (healthy / mod-heavy / shadow-filtered / platform-flagged / suspended) plus hard metrics: post removal rate split by `removed_by_category=reddit` vs. moderator removal, comment score distribution, karma, and account age. It reads signals that exist only in Reddit's authenticated API, so there is no Free fallback. Run this before posting_strategy or outreach_playbook — a flagged account gets no value from a strategy, because Reddit will silently filter any post you plan.
## MCP Prompts — Prompts Do Thinking
Prompts encode analysis frameworks. Your AI agent executes them with full context of your product and goals. This is where the subjective judgment lives — and where RDK's methodology is.
### demand_analysis

5-dimension demand scoring (pain intensity, engagement, willingness to pay, competitor weakness, recurrence) plus L2–L5 demand-ladder classification.

### competitor_analysis

Aggregate competitor sentiment, extract weakness patterns, identify switching signals.

### market_scan

Subreddit opportunity scanning: pain-point identification, tool-request clustering, untapped-niche flagging.

### audience_discovery

Find target communities and evaluate their quality as an audience.

### posting_strategy

Posting-time analysis, rule compliance, Modmail outreach, anti-spam safety rules.
### outreach_playbook

Target prioritization and 3-channel outreach strategy. Includes post classification (help-seeking / confessional / method-sharing / tool-recommendation) and decision trees for DM vs. public reply.
## Compiled Pipes — Zero-Cost Analysis
Prompts are flexible but LLM-expensive. Once an analysis stabilizes, RDK compiles it into a pipe — a pre-forged DAG workflow that executes via the tap runtime with zero LLM tokens per invocation.
```bash
rdk install-pipes                                    # one-time: register compiled pipes with tap
rdk compile market-scan --subreddit SaaS --limit 10
```

### The three levels of thinking
| Level | Mechanism | Cost per run | Latency | When to use |
|-------|-----------|--------------|---------|-------------|
| L1 Tools | Raw data fetch | $0 | 1–2s | Exploratory one-off queries |
| L2 Prompts | LLM executes framework | ~$0.15 | 25–60s | Novel analyses you haven't compiled yet |
| L3 Pipes | Pre-compiled DAG | $0 | 0.8ms–3s | High-frequency repeat workflows |
### Why this matters
Every other Reddit research tool charges for each AI analysis. RDK lets you pay the reasoning cost once (when the pipe is forged) and reuse it infinitely. A market-scan pipe running daily against 10 subreddits costs $45/month in LLM tokens via the prompt version. As a compiled pipe: $0.
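The arithmetic behind that claim, as a quick sanity check (using the ~$0.15-per-run figure from the table above):

```typescript
// Sanity check on the prompt-vs-pipe cost claim.
const costPerPromptRun = 0.15; // USD of LLM tokens per L2 prompt run (from the table)
const subreddits = 10;
const daysPerMonth = 30;

// Prompt path: the reasoning cost is paid on every single run.
const promptMonthly = costPerPromptRun * subreddits * daysPerMonth;

// Compiled pipe: reasoning is paid once at forge time, $0 per run after.
const pipeMonthly = 0;

console.log(promptMonthly.toFixed(0), pipeMonthly); // "45" 0
```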
### Two execution paths

RDK runs compiled pipes two ways. The CLI uses the first by default; the second is a programmatic API for products that embed RDK inside their own Deno process.
- Subprocess — shells out to `tap rdk <name> --json`. Universal: works for every pipe, including browser-dependent ones like `market-scan`. ~80–230ms fork overhead.
- In-process — imports `runTap` directly from `@taprun/executor` and runs the pipe inside the caller's Deno process. Measured at 0.8ms vs. 233ms subprocess on `demo-transform`, a 293× speedup. Currently works for pure-transform pipes (filter/sort/limit over pre-fetched rows); browser-dependent pipes still use the subprocess path until RDK ships its own daemon-backed `RpcSend`.
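A hypothetical sketch of how an embedder might dispatch between the two paths. `runInProcess` and `runSubprocess` are stand-ins for the real `runTap` import and the `tap rdk <name> --json` shell-out; the names and types here are illustrative, not RDK's API:

```typescript
// Illustrative dispatcher: pure-transform pipes take the fast in-process
// path; anything browser-dependent falls back to the universal subprocess.
type PipeKind = "pure-transform" | "browser-dependent";

interface Pipe {
  name: string;
  kind: PipeKind;
}

function runSubprocess(pipe: Pipe): string {
  // Stand-in for forking `tap rdk <name> --json` (~80–230ms overhead).
  return `subprocess:${pipe.name}`;
}

function runInProcess(pipe: Pipe): string {
  // Stand-in for calling runTap() inside the caller's Deno process.
  return `in-process:${pipe.name}`;
}

function execute(pipe: Pipe): string {
  return pipe.kind === "pure-transform"
    ? runInProcess(pipe)
    : runSubprocess(pipe);
}

console.log(execute({ name: "demo-transform", kind: "pure-transform" }));
console.log(execute({ name: "market-scan", kind: "browser-dependent" }));
```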
The forged pipes ship with the RDK binary. Users install them with a single `rdk install-pipes` call. Current pipes:
### market-scan (pipe)

Replaces the market_scan prompt. Runs reddit/sub-intel, reddit/pain-points, and reddit/hot in parallel via tap.pipe's DAG scheduler, filters for pain-point rows, sorts by engagement, and returns structured data. The break-even note in the file header: "one real run pays back the entire forge cost."
### demo-transform (pipe)

A pure-transform dogfood pipe that proves the in-process executor path end-to-end. Takes a literal rows array, applies filter + sort + limit, and returns the top N. Used by `src/test/compile_test.ts` to benchmark subprocess vs. in-process wall-clock time (the test fails if in-process isn't at least 2× faster — a regression guard for the executor package).
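The filter → sort → limit chain that a pure-transform pipe like demo-transform performs can be sketched in a few lines, with no I/O and no LLM involved. Row shape and field names below are illustrative:

```typescript
// Illustrative pure-transform stage chain over pre-fetched rows.
interface Row {
  title: string;
  engagement: number;
  isPainPoint: boolean;
}

function transform(rows: Row[], limit: number): Row[] {
  return rows
    .filter((r) => r.isPainPoint)            // keep pain-point rows only
    .sort((a, b) => b.engagement - a.engagement) // highest engagement first
    .slice(0, limit);                        // top N
}

const rows: Row[] = [
  { title: "A", engagement: 5, isPainPoint: true },
  { title: "B", engagement: 9, isPainPoint: true },
  { title: "C", engagement: 7, isPainPoint: false },
];
console.log(transform(rows, 2).map((r) => r.title)); // ["B", "A"]
```

Because every stage is a deterministic function of its input rows, this kind of pipe is exactly what can run in-process at sub-millisecond latency.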
### Forging new pipes
Pipes are forged using Tap's forge lifecycle. When you identify a repeat workflow that's burning LLM tokens, compile it: the reasoning is done once, the output is deterministic forever.
## Pro

Free tier uses RSS data. Pro ($9/mo) adds real upvote scores, comment counts, full comments, and subscriber counts.

```bash
rdk activate <your-license-key>
```

Get your license: https://rdk.taprun.dev
## Development

```bash
deno task dev        # Run CLI in dev mode
deno task mcp        # Start MCP server
deno task build      # Compile to single binary
deno task test       # Run tests
deno task test:e2e   # Run E2E tests (hits real Reddit)
```

### Build & Publish

```bash
deno task bundle       # esbuild + obfuscate → dist/rdk-bundled.js
deno task build:all    # Bundle + compile for all 4 platforms
deno task npm:dry-run  # Validate npm publish
deno task npm:publish  # Publish all 5 npm packages
```

### Deploy Landing Page

```bash
npx wrangler pages deploy site --project-name rdk
```

Live at: https://rdk.taprun.dev
## CI/CD

- `release.yml` — push tag `v*` → bundle + compile 4 platforms → GitHub Release
- `npm-publish.yml` — manual trigger → build + publish 5 npm packages
## Limitations
- No upvote scores (Free): RSS doesn't include scores. Demand scoring uses content analysis instead.
- ~25 results per query: RSS feed limit. Use multiple targeted queries for broader coverage.
- Read-only: Intelligence tool only. No posting or commenting.
## License
Proprietary. All rights reserved.
