Crawl-Delay in robots.txt
· Use the Crawl-Delay directive in your robots.txt file to specify how long our robots should wait between successive page requests on your website.
· A Crawl-Delay greater than 60 seconds is ignored, and the default crawl frequency of one request every 60 seconds is used instead.
· The Crawl-Delay value must be given in seconds.
The following examples set a crawl-delay of 30 seconds.
User-Agent: *
Crawl-Delay: 30
Or, applied only to startShoppingBot:
User-Agent: startShoppingBot
Crawl-Delay: 30
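To illustrate how a directive like the ones above is interpreted, here is a minimal sketch using Python's standard-library urllib.robotparser. The robots.txt content is inlined for demonstration; a real crawler would fetch it from the site. This shows the general parsing behavior, not the exact implementation of our robots.

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content, inlined for illustration only.
robots_txt = """\
User-Agent: *
Crawl-Delay: 30
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# crawl_delay() returns the delay in seconds that applies to the
# given user agent, or None if no matching directive was found.
# The "User-Agent: *" group applies to any bot, including this one.
delay = rp.crawl_delay("startShoppingBot")
print(delay)  # 30
```

A polite crawler would then sleep for `delay` seconds between requests to the same host.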