"Indexed, though blocked by robots.txt": what does it mean and how to fix?
"Indexed, though blocked by robots.txt" indicates that Google has indexed URLs even though they were blocked by your robots.txt file.
Google marks these URLs as "Valid with warning" because it's unsure whether you want them indexed. In this article you'll learn how to fix this issue.
Here's what this looks like in Google Search Console's Index Coverage report, with the number of URL impressions shown:

Double-check on URL level
You can double-check this by going to Coverage > Indexed, though blocked by robots.txt and inspecting one of the URLs listed. Then, under Crawl, it'll say "No: blocked by robots.txt" for the Crawl allowed field and "Failed: Blocked by robots.txt" for the Page fetch field.
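If you want to verify a directive yourself before editing anything, you can reproduce Google's crawl-allowed check locally with Python's standard `urllib.robotparser`. The robots.txt contents, paths, and domain below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents; replace with your own site's file.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot matches the wildcard (*) group above, so /private/ URLs
# are blocked from crawling, while other paths remain fetchable.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/public/page"))   # True
```

If `can_fetch` returns False for a URL you want indexed, that is the same condition Google Search Console reports as "No: blocked by robots.txt".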
So what happened?
Normally, Google wouldn't have indexed these URLs, but apparently it found links to them and deemed them important enough to index.
It's likely that the snippets shown are suboptimal, for instance:

How to fix "Indexed, though blocked by robots.txt"
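As a quick illustration, here's a minimal, hypothetical robots.txt excerpt (the `/private/` path is an example, not from your site). If a blocked URL *should* be indexed, remove or narrow the `Disallow` rule that matches it. If it should *not* be indexed, allow crawling and use a `noindex` robots meta tag on the page instead, since Google has to be able to crawl a page to see that tag:

```
# Hypothetical example: this rule blocks crawling of /private/,
# but Google may still index /private/ URLs it finds links to.
User-agent: *
Disallow: /private/
```

Note that a robots.txt disallow prevents crawling, not indexing, which is exactly why this warning exists.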
Indexed, though blocked by robots.txt fix for WordPress
The process for fixing this issue on WordPress sites is the same as described in the steps above, but here are some pointers to quickly find your robots.txt file in WordPress:
WordPress + Yoast SEO
If you're using the Yoast SEO plugin, follow the steps below to adjust your robots.txt file:
WordPress + Rank Math
If you're using the Rank Math SEO plugin, follow the steps below to adjust your robots.txt file:
WordPress + All in One SEO
If you're using the All in One SEO plugin, follow the steps below to adjust your robots.txt file:
Indexed, though blocked by robots.txt fix for Shopify
Shopify doesn't allow you to manage your robots.txt from its system, so you're working with a default one that's applied to all sites.
Perhaps you've seen the "Indexed, though blocked by robots.txt" message in Google Search Console or received a "New index coverage issue detected" email from Google about it. We recommend always checking which URLs this concerns, because you don't want to leave anything to chance in SEO.
Review the URLs and see if any important URLs are blocked. If that's the case, you've got two options which require some work but do allow you to change your robots.txt file on Shopify:
Whether these options are worth it to you depends on the potential reward. If it's sizable, look into implementing one of them.
You can take the same approach on the Squarespace platform.
FAQs
Why is Google showing this error for my pages?
Google found links to pages that aren't accessible to it due to robots.txt disallow directives. When Google deems these pages important enough, it'll index them anyway.
How do you fix this error?
The short answer: make sure pages that you want Google to index are accessible to Google's crawlers, and don't link internally to pages that you don't want indexed. The long answer is described in the section "How to fix 'Indexed, though blocked by robots.txt'" of this article.
Can I edit my robots.txt file on WordPress?
Yes. Popular SEO plugins such as Yoast, Rank Math, and All in One SEO allow you to edit your robots.txt file directly from the wp-admin panel.