🤖 @turingnova/robots
Next.js robots.tsx generator that automatically creates and serves robots.txt for Next.js applications.
🚀 Quick Start
1. Install

```bash
npm install @turingnova/robots
```

2. Initialize

```bash
npx robots init
```

3. Deploy

```bash
npm run build
npm run deploy
```

4. Access

```
https://your-domain.com/robots.txt
```

📋 Usage
Basic Setup
```bash
# Install the package
npm install @turingnova/robots

# Create robots.tsx
npx robots init

# Deploy your app
npm run build && npm run deploy
```

Custom Directory

```bash
# Create in a specific directory
npx robots init --dir app
npx robots init --dir pages
```

Custom Configuration

```bash
# Generate with custom settings
npx robots generate --sitemap https://yourdomain.com/sitemap.xml --output app/robots.tsx
```

🏗️ How It Works
- Creates `robots.tsx` in your `app/` or `pages/` directory
- Next.js serves it automatically at `/robots.txt`
- Search engines can access it at `your-domain.com/robots.txt` (example output below)
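With the default rules shown in the configuration example further down, the served route renders to plain text roughly like this (a sketch; the exact output depends on your rules):

```
User-Agent: *
Allow: /
Disallow: /admin/
Disallow: /private/

Sitemap: https://yourdomain.com/sitemap.xml
```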
📁 File Structure
```
your-nextjs-app/
├── app/
│   └── robots.tsx          ← Created by npx robots init
├── pages/
│   └── robots.tsx          ← Alternative location
├── src/
│   ├── app/
│   │   └── robots.tsx      ← Also supported
│   └── pages/
│       └── robots.tsx      ← Also supported
└── package.json
```

🔧 CLI Commands
```bash
# Initialize robots.tsx (auto-detects directory)
npx robots init

# Generate a custom robots.tsx
npx robots generate --format tsx --output app/robots.tsx

# Generate a static robots.txt
npx robots generate --format txt --output public/robots.txt
```

⚙️ Configuration
robots.tsx Example
```tsx
import { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: ["/"],
      disallow: ["/admin/", "/private/"],
    },
    sitemap: "https://yourdomain.com/sitemap.xml",
  };
}
```
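`rules` also accepts an array when different crawlers need different treatment. A minimal sketch (the user agents and paths here are placeholders, not defaults of this package):

```tsx
import { MetadataRoute } from "next";

// Sketch: per-crawler rules via an array. Adjust user agents and paths as needed.
export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      { userAgent: "Googlebot", allow: ["/"] },               // full access for Googlebot
      { userAgent: "*", disallow: ["/admin/", "/private/"] }, // restrict everyone else
    ],
    sitemap: "https://yourdomain.com/sitemap.xml",
  };
}
```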
📦 Installation

```bash
npm install @turingnova/robots
```

🚀 Deployment
1. Install the package
2. Run `npx robots init`
3. Deploy your Next.js app
4. Access robots.txt at `your-domain.com/robots.txt` (a quick check follows)
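To verify the route after deploying, a one-off script like the following works on Node 18+ (run it with a TypeScript runner such as tsx; this is a hypothetical helper, not part of the package, and the domain is a placeholder):

```ts
// check-robots.ts — hypothetical smoke test; replace the placeholder domain.
async function main() {
  const res = await fetch("https://your-domain.com/robots.txt");
  console.log(`status: ${res.status}`); // expect 200
  console.log(await res.text()); // should list your rules and sitemap URL
}

main();
```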
📄 License
MIT License
