robots.txt

When you make a copy of your site for development or testing purposes, you don't need search engines to index it, as that would cause duplicate content.

To disable indexing of a website by search engines, create a file named robots.txt in the site root with the following content.
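These two standard directives tell every crawler (User-agent: *) that the whole site (Disallow: /) is off limits:

    User-agent: *
    Disallow: /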

To allow all robots, use:
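An empty Disallow value matches nothing, so every crawler may fetch every page:

    User-agent: *
    Disallow: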

To specify a sitemap, add a Sitemap directive:
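The URL below is only a placeholder; replace it with the full URL of your own sitemap:

    Sitemap: https://example.com/sitemap.xml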

Crawl delay

Crawl-delay sets the number of seconds between page requests. In the example below, the bot waits 10 seconds before indexing the next page. It is supported by Bing and Yahoo.
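A sketch of the directive as described above, applied to all crawlers:

    User-agent: *
    Crawl-delay: 10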

To allow search engines to index only the home page and deny indexing of all other pages:
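One common way to do this relies on the $ end-of-URL anchor, an extension honored by major engines such as Google and Bing but not guaranteed for every crawler:

    User-agent: *
    Allow: /$
    Disallow: /

Because Allow: /$ matches only the bare root URL and the more specific rule wins, the home page stays indexable while Disallow: / blocks everything else.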

