# Nayan UI CLI

Command Line Interface for Nayan UI - create projects, generate sitemaps, and manage robots.txt files.
## Features
- 🚀 Project Creation - Create new projects from nayan-ui templates (Expo, Vite, Games)
- 🗺️ Sitemap Generation - Automatically crawl and generate XML sitemaps
- 🤖 Robots.txt Management - Generate and validate robots.txt files
- ✅ Validation Tools - Validate sitemaps and robots.txt files
## Installation

### Global Installation

```bash
npm install -g @nayan-ui/cli
```

### Using npx (No Installation Required)
```bash
npx @nayan-ui/cli [command]
# or
npx @nayan-ui/cli@latest [command]
```

## Available Commands
```bash
nayan --help     # Show all commands
nayan new        # Create a new project (interactive)
nayan create     # Generate sitemaps or robots.txt
nayan validate   # Validate sitemaps or robots.txt
```

## Usage
### Create New Project

#### Interactive Mode (Recommended)
```bash
npx @nayan-ui/cli new
# or
nayan new
```

This will prompt you to:
- Enter your project name
- Select a template (expo, games, vite)
#### Non-Interactive Mode
```bash
npx @nayan-ui/cli new my-app -t expo
# or
nayan new my-app -t vite
```

#### Available Templates
| Template | Description                                   |
| -------- | --------------------------------------------- |
| expo     | React Native Application with Expo & Nayan UI |
| games    | React Native Games example project            |
| vite     | React Application with Vite and Nayan UI      |
### Generate Sitemap
```bash
# Basic usage
npx @nayan-ui/cli create sitemap -w https://example.com

# With options
npx @nayan-ui/cli create sitemap -w https://example.com -d 10 -f daily -o ./sitemap.xml

# If installed globally
nayan create sitemap -w https://example.com
```

#### Parameters
| Name                | Parameter         | Default       | Description                                                             |
| ------------------- | ----------------- | ------------- | ----------------------------------------------------------------------- |
| Website URL         | --website / -w    | ''            | Website base URL to start crawling                                      |
| Replacement Website | --replacer / -r   | ''            | Replacement URL (useful for localhost to production URL replacement)    |
| Crawling depth      | --depth / -d      | 10            | How deep to crawl the website                                           |
| Change frequency    | --changefreq / -f | daily         | Change frequency: always, hourly, daily, weekly, monthly, yearly, never |
| Output              | --output / -o     | ./sitemap.xml | Output path for generated sitemap                                       |
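
The generated file follows the sitemaps.org protocol. A minimal output for the commands above might look roughly like this (the exact entries depend on the pages the crawler finds, and `/about` here is just a hypothetical page; `changefreq` reflects the `-f` value):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```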
### Validate Sitemap
```bash
# Local file
npx @nayan-ui/cli validate sitemap -i ./sitemap.xml

# Remote URL
npx @nayan-ui/cli validate sitemap -i https://example.com/sitemap.xml --isremote

# If installed globally
nayan validate sitemap -i ./sitemap.xml
```

#### Parameters
| Name          | Parameter        | Default       | Description                          |
| ------------- | ---------------- | ------------- | ------------------------------------ |
| Input sitemap | --input / -i     | ./sitemap.xml | Path to sitemap file or URL          |
| Is Remote     | --isremote / -ir | false         | Set to true if input is a remote URL |
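
The CLI's exact checks aren't documented here, but a sitemap validator typically verifies three things: the file is well-formed XML, the root is a `<urlset>` in the sitemaps.org namespace, and every `<loc>` holds an absolute URL. A minimal Python sketch of those checks (illustrative only, not the @nayan-ui/cli implementation):

```python
# Sketch of the checks a sitemap validator typically performs:
# well-formed XML, correct namespace, absolute URLs in each <loc>.
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def check_sitemap(xml_text: str) -> list[str]:
    """Return a list of problems found; an empty list means the sitemap passed."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as err:
        return [f"not well-formed XML: {err}"]
    if root.tag != f"{{{SITEMAP_NS}}}urlset":
        problems.append("root element is not <urlset> in the sitemap namespace")
    for loc in root.iter(f"{{{SITEMAP_NS}}}loc"):
        url = urlparse((loc.text or "").strip())
        if url.scheme not in ("http", "https") or not url.netloc:
            problems.append(f"not an absolute URL: {loc.text!r}")
    return problems

example = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><changefreq>daily</changefreq></url>
</urlset>"""

print(check_sitemap(example))  # []
```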
### Generate Robots.txt
```bash
# Basic usage
npx @nayan-ui/cli create robots -d /admin -s https://example.com/sitemap.xml

# With allowed and disallowed paths
npx @nayan-ui/cli create robots -a /home,/about -d /admin -s https://example.com/sitemap.xml -o ./robots.txt

# If installed globally
nayan create robots -d /admin -s https://example.com/sitemap.xml
```

#### Parameters
| Name             | Parameter         | Default      | Description                                |
| ---------------- | ----------------- | ------------ | ------------------------------------------ |
| Allowed paths    | --allowed / -a    | ''           | Comma-separated paths to allow crawling    |
| Disallowed paths | --disallowed / -d | ''           | Comma-separated paths to disallow crawling |
| Sitemap          | --sitemap / -s    | ''           | Sitemap URL for your website               |
| Output           | --output / -o     | ./robots.txt | Output path for generated robots.txt       |
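
As an illustration, the command above with `-a /home,/about -d /admin -s https://example.com/sitemap.xml` would produce a robots.txt along these lines (the CLI's exact ordering and spacing may differ):

```txt
User-agent: *
Allow: /home
Allow: /about
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
```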
### Validate Robots.txt
```bash
# Local file
npx @nayan-ui/cli validate robots -i ./robots.txt

# Remote URL
npx @nayan-ui/cli validate robots -i https://example.com/robots.txt --isremote

# If installed globally
nayan validate robots -i ./robots.txt
```

#### Parameters
| Name             | Parameter        | Default      | Description                          |
| ---------------- | ---------------- | ------------ | ------------------------------------ |
| Input Robots.txt | --input / -i     | ./robots.txt | Path to robots.txt file or URL       |
| Is Remote        | --isremote / -ir | false        | Set to true if input is a remote URL |
## Requirements
- Node.js >= 18
## License
MIT
## Contributing
Submit issues and pull requests on GitHub.
