docusaurus-plugin-multi-rss
A powerful Docusaurus plugin for aggregating and displaying multiple RSS feeds with category filtering, batch processing, and full TypeScript support.
Features
- Multi-Feed Aggregation: Fetch and combine multiple RSS feeds from different sources
- Category Organization: Organize feeds by categories (cyber, osint, ai, tech, etc.)
- Batch Processing: Concurrent feed fetching with configurable batch sizes
- Smart Caching: Generates optimized JSON files at build time for fast client-side access
- TypeScript Support: Full type definitions included
- Error Handling: Graceful failure handling with detailed error reporting
- Customizable: Flexible configuration options for timeouts, concurrency, and item limits
- Statistics: Built-in feed statistics and health monitoring
- Security: Built-in URL sanitization to prevent XSS attacks from malicious RSS feeds
Installation
npm install docusaurus-plugin-multi-rss
# or
yarn add docusaurus-plugin-multi-rss
# or
pnpm add docusaurus-plugin-multi-rss
Quick Start
1. Create Feed Configuration
Create a rss-feeds.config.ts file in your project root:
export interface RSSFeedConfig {
url: string;
category: string;
title: string;
}
export const rssFeeds = {
'krebs-security': {
url: 'https://krebsonsecurity.com/feed/',
category: 'cyber',
title: 'Krebs on Security'
},
'hacker-news': {
url: 'https://hnrss.org/frontpage',
category: 'tech',
title: 'Hacker News'
},
// Add more feeds...
};
export const rssPluginOptions = {
maxItemsPerFeed: 20,
concurrency: 4,
enableSeparateFiles: true,
timeout: 15000,
};
2. Configure Docusaurus
Add the plugin to your docusaurus.config.ts:
import { rssFeeds, rssPluginOptions } from './rss-feeds.config';
const config = {
// ... other config
plugins: [
[
'docusaurus-plugin-multi-rss',
{
...rssPluginOptions,
feeds: rssFeeds
}
]
]
};
export default config;
3. Access RSS Data
The plugin generates several JSON files in .docusaurus/docusaurus-plugin-multi-rss/default/:
- rss-data.json - Complete RSS data with all feeds
- feed-{feedKey}.json - Individual feed data
- category-{category}.json - Category-grouped items
- latest-items.json - 50 most recent items across all feeds
- rss-stats.json - Feed statistics and health status
Usage in Components
import React from 'react';
function MyRSSPage() {
// Import generated RSS data
const rssData = require('@site/.docusaurus/docusaurus-plugin-multi-rss/default/rss-data.json');
return (
<div>
<h1>Latest News</h1>
{rssData.allItems.slice(0, 10).map((item) => (
<article key={item.guid}>
<h2>
{item.link ? (
<a href={item.link} target="_blank" rel="noopener noreferrer">
{item.title}
</a>
) : (
item.title
)}
</h2>
<p>{item.summary}</p>
<small>
{item.feedTitle} • {new Date(item.publishedDate).toLocaleDateString()}
</small>
</article>
))}
</div>
);
}
export default MyRSSPage;
Configuration Options
Plugin Options
| Option | Type | Default | Description |
|--------|------|---------|-------------|
| feeds | Record<string, FeedConfig> | {} | RSS feed definitions |
| maxItemsPerFeed | number | 20 | Maximum items to fetch per feed |
| concurrency | number | 5 | Number of feeds to fetch concurrently |
| enableSeparateFiles | boolean | true | Generate separate JSON files for feeds/categories |
| timeout | number | 10000 | Request timeout in milliseconds |
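For reference, here is a plugin entry with every option spelled out at its documented default (values taken from the table above; the empty feed map is just a placeholder):
// docusaurus.config.ts (plugins array): every option at its default
[
  'docusaurus-plugin-multi-rss',
  {
    feeds: {},                  // your feed definitions go here
    maxItemsPerFeed: 20,        // maximum items kept per feed
    concurrency: 5,             // feeds fetched in parallel
    enableSeparateFiles: true,  // also emit per-feed and per-category JSON
    timeout: 10000,             // per-request timeout in milliseconds
  },
]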
Feed Configuration
Each feed can be configured with:
{
url: string; // RSS feed URL (required)
category?: string; // Category for organization (default: 'general')
title?: string; // Custom title override (defaults to feed's title)
}
Generated Data Structure
RSSData
interface RSSData {
feeds: Record<string, ProcessedFeed>; // All feeds by key
categories: Record<string, RSSItem[]>; // Items grouped by category
allItems: RSSItem[]; // All items sorted by date
lastUpdated: string; // ISO timestamp
stats: {
totalFeeds: number;
successfulFeeds: number;
failedFeeds: number;
totalItems: number;
categoryCounts: Record<string, number>;
};
}
RSSItem
interface RSSItem {
title?: string;
link?: string;
pubDate?: string;
description?: string;
content?: string;
author?: string;
categories?: string[];
// Enhanced fields added by plugin
feedKey?: string;
feedTitle?: string;
category?: string;
cleanTitle?: string;
publishedDate?: Date;
summary?: string;
}
Examples
Basic Example
See the examples/basic directory for a complete working example.
Filtering by Category
// Load only cyber security feeds
const cyberFeeds = require('@site/.docusaurus/docusaurus-plugin-multi-rss/default/category-cyber.json');
function CyberNews() {
return (
<div>
<h1>Cybersecurity News</h1>
{cyberFeeds.items.map(item => (
<article key={item.guid}>
<h2>{item.title}</h2>
<p>{item.summary}</p>
</article>
))}
</div>
);
}
Display Latest Items
const latestItems = require('@site/.docusaurus/docusaurus-plugin-multi-rss/default/latest-items.json');
function LatestNews() {
return (
<div>
<h1>Latest from All Feeds</h1>
{latestItems.map(item => (
<article key={item.guid}>
<h2><a href={item.link}>{item.title}</a></h2>
<small>{item.feedTitle} • {item.category}</small>
</article>
))}
</div>
);
}
Feed Statistics
const stats = require('@site/.docusaurus/docusaurus-plugin-multi-rss/default/rss-stats.json');
function FeedStats() {
return (
<div>
<h2>Feed Statistics</h2>
<p>Total Feeds: {stats.totalFeeds}</p>
<p>Successful: {stats.successfulFeeds}</p>
<p>Failed: {stats.failedFeeds}</p>
<p>Total Items: {stats.totalItems}</p>
<h3>By Category</h3>
<ul>
{Object.entries(stats.categoryCounts).map(([cat, count]) => (
<li key={cat}>{cat}: {count} items</li>
))}
</ul>
</div>
);
}
Keeping Feeds Fresh
This plugin fetches RSS feeds at build time. To keep content fresh, you have several options:
Option 1: Scheduled Rebuilds with GitHub Actions (Recommended)
Create .github/workflows/update-rss-feeds.yml:
name: Update RSS Feeds
on:
  schedule:
    - cron: '0 7 * * *'
  # Allow manual triggering from Actions tab
  workflow_dispatch:
  # Also run on push to main (for immediate updates after config changes)
  push:
    branches:
      - main
    paths:
      - 'rss-feeds.config.ts'
      - 'docusaurus.config.ts'
      - '.github/workflows/update-rss-feeds.yml'
jobs:
  update-and-deploy:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pages: write
      id-token: write
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0 # Full history for git operations
      - name: Checkout Intel Codex Vault
        uses: actions/checkout@v4
        with:
          repository: gl0bal01/intel-codex
          path: .temp-vault
          fetch-depth: 1
      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'
          cache: 'npm'
      - name: Install dependencies
        run: npm ci
      - name: Build site (fetches fresh RSS feeds)
        run: npm run build
        env:
          NODE_ENV: production
      - name: Upload artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: ./build
      - name: Deploy to GitHub Pages
        id: deployment
        uses: actions/deploy-pages@v4
      - name: Notify on failure
        if: failure()
        run: |
          echo "RSS feed update failed at $(date)"
          # Optional: Add notification service here (Discord webhook, email, etc.)

Pros: Simple, works with GitHub Pages, no backend needed
Cons: Limited to scheduled rebuilds (not real-time), uses GitHub Actions minutes
Option 2: Client-Side Refresh
Add a refresh button using RSS proxy services:
import { useState } from 'react';

// Inside a React component: buildTimeData is the JSON generated at build time
// and feedUrl is the RSS feed you want to refresh on the client.
const [items, setItems] = useState(buildTimeData);

const refreshFeeds = async () => {
  const proxy = 'https://api.rss2json.com/v1/api.json?rss_url=';
  const response = await fetch(proxy + encodeURIComponent(feedUrl));
  const data = await response.json();
  setItems(data.items);
};

See docs/client-side-fetching.md for more details.
Option 3: Serverless API
Deploy a serverless function (Vercel/Netlify/Cloudflare) for real-time updates.
See docs/server-side-api.md for implementation guide.
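As a rough illustration of this option, here is a minimal sketch of a Vercel-style endpoint that re-fetches a single feed on demand. It assumes the rss-parser and @vercel/node packages are installed in the serverless project; the api/feed.ts path and the url query parameter are illustrative choices, not part of this plugin.
// api/feed.ts: illustrative Vercel-style endpoint (not part of this plugin).
import Parser from 'rss-parser';
import type { VercelRequest, VercelResponse } from '@vercel/node';

const parser = new Parser();

export default async function handler(req: VercelRequest, res: VercelResponse) {
  const url = req.query.url;
  if (typeof url !== 'string' || !url.startsWith('https://')) {
    res.status(400).json({ error: 'Pass a single https feed URL as ?url=' });
    return;
  }
  const feed = await parser.parseURL(url);
  // Cache at the edge so repeated visitors do not re-fetch the upstream feed.
  res.setHeader('Cache-Control', 's-maxage=900, stale-while-revalidate');
  res.status(200).json({ title: feed.title, items: feed.items.slice(0, 20) });
}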
Security
This plugin includes built-in security measures to protect against XSS attacks from malicious RSS feeds:
URL Sanitization
All URLs from RSS feeds (item links, feed links, and enclosure URLs) are automatically sanitized at build time to prevent XSS attacks. The sanitization:
- Blocks dangerous protocols: javascript:, data:, vbscript:, file:, about:
- Allows safe protocols: http:, https:, mailto:, ftp:
- Logs warnings: when dangerous URLs are detected and blocked
Blocked URLs are set to undefined, so always check if a link exists before rendering:
{item.link ? (
<a href={item.link} target="_blank" rel="noopener noreferrer">
{item.title}
</a>
) : (
<span>{item.title}</span>
)}
Additional Client-Side Protection (Recommended)
For defense-in-depth, consider adding client-side URL sanitization as well. See the example file for a complete implementation with both backend and frontend sanitization.
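A minimal sketch of what that client-side check could look like; isSafeHref is a hypothetical helper (not exported by this plugin) that mirrors the protocol allow-list described above.
// isSafeHref.ts: hypothetical client-side helper mirroring the build-time allow-list.
const SAFE_PROTOCOLS = ['http:', 'https:', 'mailto:', 'ftp:'];

export function isSafeHref(href?: string): boolean {
  if (!href) return false;
  try {
    // Resolve relative URLs against a dummy base before checking the protocol.
    const parsed = new URL(href, 'https://example.invalid');
    return SAFE_PROTOCOLS.includes(parsed.protocol);
  } catch {
    return false; // Unparseable URLs are treated as unsafe.
  }
}

// Usage in JSX: {isSafeHref(item.link) ? <a href={item.link}>{item.title}</a> : item.title}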
Security Best Practices
- Always use rel="noopener noreferrer" when rendering external links with target="_blank"
- Validate link existence before rendering anchor tags
- Monitor console logs for warnings about blocked URLs
- Review feed sources regularly to ensure they're trustworthy
Best Practices
- Externalize Feed Configuration: Keep feeds in a separate config file for easier maintenance
- Use Categories: Organize feeds by topic for better user experience
- Adjust Concurrency: Lower concurrency if you have many feeds to avoid rate limiting
- Monitor Stats: Use the stats file to track feed health and performance
- Handle Errors: Check feed status before rendering to handle failed feeds gracefully (see the sketch after this list)
- Schedule Rebuilds: Set up automated rebuilds for fresh content (every 4-6 hours recommended)
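As a sketch of the last two points, here is a small component that surfaces failed feeds using the stats block from the generated rss-data.json (field names follow the RSSData interface above):
import React from 'react';

const rssData = require('@site/.docusaurus/docusaurus-plugin-multi-rss/default/rss-data.json');

function FeedHealthBanner() {
  // stats.failedFeeds and stats.totalFeeds come from the RSSData interface above.
  const { failedFeeds, totalFeeds } = rssData.stats;
  if (failedFeeds === 0) return null;
  return (
    <p role="alert">
      {failedFeeds} of {totalFeeds} feeds failed during the last build; their items will be missing until the next rebuild.
    </p>
  );
}

export default FeedHealthBanner;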
TypeScript
This plugin is written in TypeScript and includes full type definitions. Import types:
import type {
RSSData,
RSSItem,
ProcessedFeed,
FeedConfig,
PluginOptions
} from 'docusaurus-plugin-multi-rss';
Troubleshooting
Feeds Not Updating
- Clear Docusaurus cache: npm run clear
- Rebuild: npm run build
- Check console for feed fetch errors
CORS Issues
This plugin fetches feeds at build time, not in the browser, so CORS is not an issue.
Timeout Errors
Increase the timeout option if feeds are slow to respond:
{
timeout: 30000 // 30 seconds
}
Rate Limiting
Lower the concurrency option to fetch fewer feeds simultaneously:
{
concurrency: 2 // Fetch 2 feeds at a time
}
Contributing
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
License
MIT License - see LICENSE file for details
Demo
- See the plugin in action at gl0bal01.com
Acknowledgments
- Built for the Docusaurus community
- Uses rss-parser for RSS parsing
⭐ Star this repo if you find it helpful.
