What does crawl-delay: 10 mean in robots.txt?

Last updated: August 6, 2021

The crawl-delay directive is an unofficial directive that asks crawlers to slow down their crawling so they don't overload the web server.
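
For example, a robots.txt file asking all crawlers to wait 10 seconds between requests could look like this:

```
User-agent: *
Crawl-delay: 10
```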

Not all search engines support the crawl-delay directive, and those that do interpret it differently.

How does Google interpret crawl-delay: 10?

Google doesn't support the crawl-delay directive, so its crawlers simply ignore it.

If you want to ask Google to crawl slower, you need to set the Crawl rate in Google Search Console:

  1. Log in to the old Google Search Console.
  2. Choose the website you want to define the crawl rate for.
  3. Adjust the only setting available, Crawl rate, using the slider to set your preferred crawl rate. By default it's set to "Let Google optimize for my site (recommended)".

This is what that looks like in Google Search Console:

Define Crawl Rate in Google Search Console

Bing, Yahoo, and crawl-delay: 10

Bing and Yahoo support the crawl-delay directive. With crawl-delay: 10, they divide a day into 10-second windows and crawl at most one page within each window.
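
That windowing effectively caps how many pages Bing and Yahoo can crawl per day. A minimal Python sketch of the arithmetic:

```python
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

crawl_delay = 10  # the value from crawl-delay: 10

# at most one page per 10-second window
max_pages_per_day = SECONDS_PER_DAY // crawl_delay
print(max_pages_per_day)  # 8,640 pages per day, at most
```

In other words, crawl-delay: 10 limits Bing and Yahoo to at most 8,640 pages per day on your site.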

Read more about Bing's interpretation of the crawl-delay directive.

Yandex and crawl-delay: 10

Yandex supports the crawl-delay directive. With crawl-delay: 10, Yandex waits at least 10 seconds before requesting another URL.
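
To illustrate this behavior, here's a minimal Python sketch of a polite crawler that reads the crawl-delay using the standard library's urllib.robotparser and waits between requests (the example-bot user agent and example.com URLs are placeholders):

```python
import time
from urllib.robotparser import RobotFileParser

USER_AGENT = "example-bot"  # placeholder user agent

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")  # placeholder domain
robots.read()

# crawl_delay() returns the crawl-delay value for this user agent,
# or None when robots.txt doesn't set one
delay = robots.crawl_delay(USER_AGENT) or 0

for url in ["https://example.com/", "https://example.com/about"]:
    if robots.can_fetch(USER_AGENT, url):
        print(f"fetching {url}")  # a real crawler would request the page here
        time.sleep(delay)  # wait at least crawl-delay seconds between requests
```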

While Yandex supports this directive, they recommend using Yandex.Webmaster, their equivalent of Google Search Console, to define the crawl speed settings.

Read more about Yandex's interpretation of the crawl-delay directive.

Baidu and crawl-delay: 10

Baidu does not support the crawl-delay directive, so, like Google, it ignores the directive. You can define your preferred crawl frequency through Baidu Webmaster Tools.

Read the full Academy article to learn everything about robots.txt.