The message “Submitted URL seems to be a Soft 404” indicates that you submitted a URL through your XML sitemap, and Google considers it a so-called “soft 404”.
What’s a soft 404 error?
A soft 404 error is not a status code your web server returns to Google; it’s a label that Google applies to a page.
This is what Google Search Console’s Index Coverage report on soft 404 errors looks like:
What causes soft 404 errors, and how do you fix them?
There are many reasons why Google may consider a page to be a soft 404. In this section, we’ll cover the most common ones.
404 page incorrectly returns HTTP status 200
A 404 page is shown, but the server still returns the HTTP status code 200 OK.
Fix: make sure to return a 404 Not Found status code instead of a 200 OK. By doing so, you send Google the right signal: the page will be removed from their index, and the issue in Google Search Console will automatically close. Make sure these URLs are also removed from the XML sitemap.
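As a minimal sketch of the right behaviour (plain WSGI, no framework; the page content here is made up): unknown paths get a real 404 Not Found status, even though a friendly error page is still shown.

```python
# Hypothetical site content, keyed by URL path.
PAGES = {"/": "Home", "/shop": "Shop"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in PAGES:
        start_response("200 OK", [("Content-Type", "text/html")])
        return [PAGES[path].encode()]
    # The error template is still shown to visitors, but with the correct
    # status code, so Google treats it as a real 404 and drops the URL.
    start_response("404 Not Found", [("Content-Type", "text/html")])
    return [b"<h1>Page not found</h1>"]
```

The key point is that the status line and the error template travel together: rendering “Page not found” while sending 200 OK is exactly what produces a soft 404.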
Redirect target is not relevant enough
You’re redirecting a URL to another URL that Google doesn’t consider relevant enough. This often happens on eCommerce sites when discontinued products are redirected to a product category or the home page.
Fix: update the redirect so it points to the most relevant alternative.
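One way to keep redirects relevant is to map each discontinued product to its closest in-stock alternative, rather than falling back to a category or the home page. A sketch (the paths and mapping are hypothetical):

```python
# Hypothetical redirect map: each discontinued product points to the
# most relevant replacement, not to a generic category page.
REDIRECTS = {
    "/products/blue-widget-v1": "/products/blue-widget-v2",
    "/products/red-widget": "/products/red-widget-pro",
}

def redirect_target(path):
    """Return (status, location) for a discontinued product, else None."""
    target = REDIRECTS.get(path)
    if target:
        return 301, target  # permanent redirect to the relevant alternative
    return None
```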
Page has little to no content
Google expects pages with little to no content to return an HTTP status code 404 Not Found. Take for example:
- Empty search result pages
- Empty product category pages or product detail pages
- Empty blog category pages

Fix: either add meaningful content to these pages, or return a 404 Not Found status code for them.
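For empty category or search result pages like these, returning a real 404 can be sketched with a small helper (hypothetical, plain Python) that decides the status code from whether the page actually has anything to show:

```python
def category_status(products):
    """Pick the HTTP status for a category page based on its contents."""
    if not products:
        # An empty category has nothing worth indexing; a real 404
        # tells Google so, instead of serving a thin page with a 200.
        return 404, "Not Found"
    return 200, "OK"
```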
Page content contains 404-like phrases
The page’s content contains phrases you’d typically expect on 404 pages, such as “no longer available”, “item not available”, “not in stock” and “does not exist”.
Google can surprisingly easily mistake such a page for a 404 page.
Fix: adjust the page’s content so it isn’t mistaken for a 404 page.
Accidentally blocked Google
Fix: investigate how Google is being blocked, and prevent it from happening. For instance, an incorrect directive in your robots.txt file can easily prevent Google from accessing your assets (such as CSS and JavaScript), in which case pages may render as broken or empty and get flagged as soft 404s.
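You can check a robots.txt file against specific asset URLs with Python’s standard-library `urllib.robotparser`. A sketch, using a made-up robots.txt that accidentally blocks an assets directory:

```python
from urllib import robotparser

# Hypothetical robots.txt: the Disallow line blocks all crawlers,
# including Googlebot, from the site's CSS and JavaScript.
ROBOTS_TXT = """\
User-agent: *
Disallow: /assets/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "/assets/app.css"))  # blocked
print(rp.can_fetch("Googlebot", "/index.html"))      # allowed
```

Running this kind of check against the paths of your critical assets makes it easy to spot an overly broad Disallow rule before Google does.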