@shevky/plugin-robots-txt
v0.0.3
Shevky plugin that generates robots.txt during builds.
Shevky Plugin: Robots.txt
A simple Shevky plugin that generates robots.txt. It runs on the dist:clean hook, reads allow/disallow lists from the config, and appends a Sitemap line derived from the site root URL.
Features
- Automatically generates robots.txt
- Reads Allow and Disallow rules from config
- Writes Sitemap as <site-url>/sitemap.xml
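The steps above can be sketched as a small pure function. This is a hypothetical illustration of the generation logic, not the plugin's actual source; the function name buildRobotsTxt is an assumption, though the config shape matches the usage example below.

```javascript
// Hypothetical sketch: build robots.txt content from a config shaped like
// { identity: { url }, robots: { allow, disallow } }.
// Not the plugin's real API -- for illustration only.
function buildRobotsTxt(config) {
  const lines = ["User-agent: *"];

  // Emit one Allow/Disallow line per configured path.
  for (const path of config.robots?.allow ?? []) {
    lines.push(`Allow: ${path}`);
  }
  for (const path of config.robots?.disallow ?? []) {
    lines.push(`Disallow: ${path}`);
  }

  // The Sitemap line is derived from the site root URL
  // (trailing slash stripped so we don't emit "//sitemap.xml").
  const root = config.identity.url.replace(/\/$/, "");
  lines.push(`Sitemap: ${root}/sitemap.xml`);

  return lines.join("\n") + "\n";
}

const txt = buildRobotsTxt({
  identity: { url: "https://example.com" },
  robots: { allow: ["/", "/blog/"], disallow: ["/admin/", "/private/"] },
});
console.log(txt);
```

Running this with the example config from the Usage section produces the sample output shown further down.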
Installation
npm i shevky-robots-txt
Usage
The example config below uses identity.url, robots.allow, and robots.disallow:
{
  "identity": {
    "url": "https://example.com"
  },
  "robots": {
    "allow": ["/", "/blog/"],
    "disallow": ["/admin/", "/private/"]
  },
  "plugins": [
    "shevky-robots-txt"
  ]
}

Example generated robots.txt output:
User-agent: *
Allow: /
Allow: /blog/
Disallow: /admin/
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
License
MIT
