# robotize

Generates a robots.txt. The generated robots.txt conforms to the standards set by Google. Use it to programmatically generate a robots.txt file for your site.
## Installation

```shell
$ npm install robotize
```
## Usage

```js
const robotize = require('robotize');

const opts = {
  useragent: 'googlebot',
  allow: ['index.html', 'about.html'],
  disallow: ['404.html'],
  sitemap: 'https://www.site.com/sitemap.xml'
};

robotize(opts, (err, robots) => {
  if (err) {
    throw err;
  } else {
    console.log(robots);
  }
});
```

This will log:
```
User-agent: googlebot
Allow: index.html
Allow: about.html
Disallow: 404.html
Sitemap: https://www.site.com/sitemap.xml
```
## Options

Robotize accepts an object with options. The options are:

- `useragent`: the user agent - String, default: `*`
- `allow`: an array of the URL(s) to allow - Array of Strings
- `disallow`: an array of the URL(s) to disallow - Array of Strings
- `sitemap`: the sitemap URL - String

Robotize expects at least one of the last three options, so either `allow`, `disallow` or `sitemap` must be passed.
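The underlying generation is essentially string assembly from these options. A minimal sketch of the idea (a hypothetical reimplementation for illustration, not the package's actual source):

```js
// Hypothetical sketch: build a robots.txt string from robotize-style options.
// This is NOT the package's real implementation, just the general idea.
function buildRobotsTxt({ useragent = '*', allow = [], disallow = [], sitemap } = {}) {
  // Mirror the documented requirement: at least one of allow/disallow/sitemap.
  if (!allow.length && !disallow.length && !sitemap) {
    throw new Error('Expected at least one of: allow, disallow, sitemap');
  }
  const lines = [`User-agent: ${useragent}`];
  for (const url of allow) lines.push(`Allow: ${url}`);
  for (const url of disallow) lines.push(`Disallow: ${url}`);
  if (sitemap) lines.push(`Sitemap: ${sitemap}`);
  return lines.join('\n');
}

console.log(buildRobotsTxt({
  useragent: 'googlebot',
  allow: ['index.html', 'about.html'],
  disallow: ['404.html'],
  sitemap: 'https://www.site.com/sitemap.xml'
}));
```

Note that the real module is callback-based (`robotize(opts, cb)`), while this sketch returns the string directly.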
## Credits

Forked from robots-generator.

## License

MIT