Shevky Plugin: Robots.txt

A simple Shevky plugin that generates robots.txt. It runs on the dist:clean hook, uses allow/disallow lists from the config, and adds a Sitemap line based on the site root URL.

Features

  • Automatically generates robots.txt
  • Reads Allow and Disallow rules from config
  • Writes Sitemap as <site-url>/sitemap.xml

Installation

npm i shevky-robots-txt

Usage

The example config below uses identity.url, robots.allow, and robots.disallow:

{
  "identity": {
    "url": "https://example.com"
  },
  "robots": {
    "allow": ["/", "/blog/"],
    "disallow": ["/admin/", "/private/"]
  },
  "plugins": [
    "shevky-robots-txt"
  ]
}

Example generated robots.txt output:

User-agent: *
Allow: /
Allow: /blog/
Disallow: /admin/
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
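The mapping from config to output can be sketched as below. This is a minimal illustration, not the plugin's actual implementation; the function name and config shape are assumptions based on the example above.

```javascript
// Hypothetical sketch: build robots.txt text from a Shevky-style config.
// The real plugin runs on the dist:clean hook; this only shows the mapping.
function buildRobotsTxt(config) {
  const lines = ["User-agent: *"];
  for (const path of (config.robots && config.robots.allow) || []) {
    lines.push(`Allow: ${path}`);
  }
  for (const path of (config.robots && config.robots.disallow) || []) {
    lines.push(`Disallow: ${path}`);
  }
  // Strip a trailing slash from the site root before appending /sitemap.xml.
  const root = config.identity.url.replace(/\/$/, "");
  lines.push("", `Sitemap: ${root}/sitemap.xml`);
  return lines.join("\n") + "\n";
}

const config = {
  identity: { url: "https://example.com" },
  robots: {
    allow: ["/", "/blog/"],
    disallow: ["/admin/", "/private/"],
  },
};

console.log(buildRobotsTxt(config));
```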

License

MIT
