Industry News

What We Can Learn From Real World SEO Fails From 2020

  • June 23, 2020

SEO can be rather tricky. One bad click or a rogue character can ruin everything.

Learning from mistakes does wonders. That’s why we’ve covered quite a few SEO disasters in the past.

We thought it was time for another edition, so we’ve compiled the top five SEO fails that took place in 2020.

Sad robot illustration for SEO fail

Let’s see what major companies did wrong and how you can avoid such missteps.

1. LinkedIn gets removed from Google’s Index

On May 3, one of the largest social media networks out there experienced a real-world episode of Monty Python’s “How not to be seen.”

What happened?

A huge portion of LinkedIn’s content vanished from Google Search. It was Barry Schwartz who pointed out at the start of May that there was something wrong with the company’s search visibility on Google. He noticed that the site:www.linkedin.com query returned zero search results, meaning that Google had deindexed the pages on the www subdomain.

No results for www LinkedIn query on Google

The incident lasted a few hours, and according to Barry, it affected hundreds of millions of web pages.

What did this result in?

Here’s a screenshot from Ahrefs detailing the drop in organic traffic around May 6th:

LinkedIn organic traffic drop on Ahrefs

What could have caused this?

While the whole incident remains veiled in uncertainty, and neither LinkedIn nor Google has issued an official statement, the most likely cause of the problem is LinkedIn itself.

Once the issue got noticed, experts started speculating on what could have resulted in the disappearance.

For instance, some users thought that the website had a robots.txt directive that blocked Google’s crawlers.

We thought this was highly unlikely: disallowing Google’s crawlers only stops them from crawling the site, it doesn’t remove pages that are already indexed, so it couldn’t have triggered such a sudden and complete visibility drop. On top of that, LinkedIn is still using the same directives in their robots.txt to this day, and the pages that vanished returned quickly.

We dug in a little deeper. Here’s the section of LinkedIn’s robots.txt that SEOs were speculating about:

LinkedIn robots.txt file

The reason this couldn’t have been the cause of the drop is that Googlebot won’t obey the “Disallow: /” rule at all: in robots.txt, a crawler follows only the most specific group of rules that matches its user agent. LinkedIn’s file addresses Googlebot directly at the top (User-agent: Googlebot) without any such blanket restriction, so Googlebot sticks to that group and ignores the “Disallow: /” aimed at all other crawlers under User-agent: *.
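
To see this precedence rule in action, here’s a minimal sketch using Python’s standard urllib.robotparser with a drastically simplified stand-in for LinkedIn’s file (the paths are made up for illustration):

```python
# A drastically simplified stand-in for LinkedIn's robots.txt, parsed with
# Python's standard library to demonstrate group precedence. The paths and
# URLs here are made up for illustration.
from urllib import robotparser

RULES = """\
User-agent: Googlebot
Disallow: /made-up-private-path/

User-agent: *
Disallow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(RULES)

# Googlebot follows only the group addressed to it, so the blanket
# "Disallow: /" under "User-agent: *" never applies to it.
print(parser.can_fetch("Googlebot", "https://www.linkedin.com/feed/"))     # True
print(parser.can_fetch("SomeOtherBot", "https://www.linkedin.com/feed/"))  # False
```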

The second suggestion, pointing to the most likely cause, came from Google’s John Mueller. His tweet suggested that the deindexing might have been caused by removing the http:// version of the LinkedIn website through the Google Search Console Removals tool, which would have removed all of the other variants too: HTTPS, www, and non-www.

This would mean that LinkedIn may have inadvertently removed itself from Google Search while attempting to canonicalize the HTTP version to the HTTPS version, by submitting a removal request for the HTTP version in GSC.

Here’s what that looks like in GSC:

Removing HTTP version in Google Search Console Removals Tool

Be very careful when using the URL removals tool and pay close attention to warning messages that are shown. In this case, it clearly says “All URL variations (www/non-www and http/https) will be affected”.

The solution to this issue

As John Mueller pointed out, using Google Search Console’s URL removals tool for removing the HTTP version of your website is not exactly the brightest idea.

The Removals tool was designed to help webmasters temporarily block search results for outdated or inappropriate content on their website; it comes in handy for site owners who need to hide URLs from Google Search quickly. It is not a good way to remove your content for good. And as Mueller said, removing the whole HTTP version of your website will also remove all other variants.

If they did use the removals tool here, LinkedIn must have quickly reverted the change through the very same tool.

To cancel a request to temporarily hide part of their website from search results, they must have taken these steps:

  1. Open the Removals tool.
  2. Find your request in the history table.
  3. Click the inline menu button next to the request and use Cancel Request.

How can you avoid this issue?

First of all, you need to use the right technique for the job. Instead of the Removals tool, LinkedIn should have used 301 redirects to canonicalize their domain. Here’s what they should have done (a quick way to verify such a setup is sketched right after the list):

  • If http://linkedin.com is requested, 301 redirect to https://www.linkedin.com
  • If http://www.linkedin.com is requested, 301 redirect to https://www.linkedin.com
  • If https://linkedin.com is requested, 301 redirect to https://www.linkedin.com
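
If you want to verify a setup like this, a quick script can request each variant without following redirects and inspect the first response. Here’s a minimal sketch, assuming the third-party requests package; the URLs mirror the list above:

```python
# A minimal redirect check: request each non-canonical variant without
# following redirects and confirm a single 301 to the canonical HTTPS www
# version. Assumes the third-party "requests" package.
import requests

CANONICAL = "https://www.linkedin.com/"
VARIANTS = [
    "http://linkedin.com/",
    "http://www.linkedin.com/",
    "https://linkedin.com/",
]

for url in VARIANTS:
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "")
    ok = response.status_code == 301 and location.rstrip("/") == CANONICAL.rstrip("/")
    print(f"{url} -> {response.status_code} {location} [{'OK' if ok else 'CHECK THIS'}]")
```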

Remember: every change you make to your website should be monitored. That way you can rule out potential issues if you experience an enormous drop in organic traffic.

Are your domain redirects working properly?

Run a quick check with ContentKing and find mistakes in your settings.

2. Ryanair disallowing the whole website in robots.txt

Ryanair’s website almost stopped flying in organic search due to one simple change.

What happened?

While disallowing crawlers in the robots.txt file most likely wasn’t the true cause of LinkedIn’s disappearance from Google Search, it definitely was the main source of trouble for the low-cost Irish carrier Ryanair.

On May 7, someone from this airline updated their robots.txt file with one simple directive. The change was spotted by Francesco Baldini, who turned to Twitter to spark a bit of discussion.

This directive (which in Ryanair’s case remained on their website for 12 days, from May 7th until May 19th) tells Google and other search engines to stop crawling the entire website. Unlike what happened with LinkedIn, this doesn’t mean that Ryanair would have disappeared from the search results immediately; rather, its organic visibility would have gradually decreased over time.
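
This kind of change is easy to catch before it ships. Here’s a minimal sketch of a pre-deployment check using Python’s standard urllib.robotparser; the file name and URL list are made-up placeholders, and a blanket Disallow: / like Ryanair’s would make it abort immediately:

```python
# A minimal pre-deployment sanity check for a new robots.txt: make sure key
# URLs stay crawlable for the bots you care about before the file goes live.
# The file name and URL list are made-up placeholders.
from urllib import robotparser

with open("robots.txt.new") as f:
    parser = robotparser.RobotFileParser()
    parser.parse(f.read().splitlines())

MUST_STAY_CRAWLABLE = [
    "https://www.example.com/",
    "https://www.example.com/en/cheap-flights/",
]

for bot in ("Googlebot", "Bingbot"):
    for url in MUST_STAY_CRAWLABLE:
        if not parser.can_fetch(bot, url):
            raise SystemExit(f"New robots.txt blocks {bot} from {url}; aborting deploy")

print("New robots.txt looks safe to deploy")
```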

What did this result in?

Luckily for Ryanair, the mistake didn’t cause any noticeable harm to their SEO visibility. According to Ahrefs, they didn’t lose any organic traffic around the time the change occurred. But this could have gone very differently had they not noticed or been notified about this issue.

Ryanair’s organic traffic

Still, had this directive remained in place, it would have caused a gradual decrease of organic traffic over time.

As the Ryanair case shows, even the smallest element of a website can cause large-scale damage. The robots.txt file is one of the oldest and most sensitive components of a website, and changes to it can have highly disruptive effects. Limited access for the dev and SEO teams, plus a rigid change procedure and QA process, should be in place to make sure these types of problems don’t happen (again).

What could have caused this?

Using the Wayback Machine, we can see what the robots.txt file looked like before the change. It’s quite apparent that the change was massive.

The person who changed the file must have erased basically all of its content and rewritten the directives so as to disallow crawling completely. There is only a small chance that the update to the file was accidental.

Ryanair’s robots.txt back to normal

It’s not clear how this happened or whether this was actually done on purpose. Because Ryanair didn’t release any statements regarding the change, speculation about its possible cause immediately emerged among SEOs on Twitter.

Proposed explanations included that Ryanair may simply have wished to decrease their organic visibility to get fewer bookings, or even to make things harder for customers seeking refunds for flights canceled during the difficult times COVID-19 has brought upon the travel sector.

Another theory that surfaced was that a disgruntled ex-employee may have done this as revenge for being laid off.

Lots of speculation, but we’ll probably never hear the full story. What we can zoom in on, though, is how this all could have been prevented.

The solution to the issue

To solve this kind of issue, do what Ryanair did. After almost two whole weeks of having their website closed off to crawlers, Ryanair updated the robots.txt file again, restoring its former state.

How to prevent this issue

There is no doubt that you always need to be very careful when changing your robots.txt file. One small change can either help or ruin a website’s SEO performance. That’s why you need to handle any changes to this file with care and always be 100% sure of what a change will result in.

In large companies such as Ryanair, it should always be clear who has the access and authority to change the robots.txt file, and all changes should go through a QA process. Even in crazy times like these.

You need to be monitoring your robots.txt. When it changes, you want to be alerted immediately. We know the pain it causes when damaging changes aren’t noticed, and the joy it brings when they’re caught before search engines even have a chance to process them.
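
What such monitoring boils down to is simple. Here’s a minimal sketch, assuming the third-party requests package; the URL, polling interval, and alerting hook are placeholders you’d replace with your own:

```python
# A minimal robots.txt watcher: poll the file and raise an alert the moment
# its content changes. Assumes the third-party "requests" package; the URL,
# polling interval, and alerting hook are placeholders.
import hashlib
import time

import requests

ROBOTS_URL = "https://www.example.com/robots.txt"
CHECK_EVERY_SECONDS = 300

last_hash = None
while True:
    body = requests.get(ROBOTS_URL, timeout=10).text
    current = hashlib.sha256(body.encode("utf-8")).hexdigest()
    if last_hash is not None and current != last_hash:
        # Wire up your real alerting (email, Slack, pager) here.
        print(f"ALERT: {ROBOTS_URL} has changed!")
    last_hash = current
    time.sleep(CHECK_EVERY_SECONDS)
```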

Want alerts for robots.txt changes?

Stop always being a few steps behind. Proactively monitor your robots.txt.

3. Harley Quinn gets lost in ambiguity

It was supposed to be yet another great DC blockbuster. Instead, Warner Bros lost over $16 million just because they ignored SEO. Lesson learned—the hard way.

What happened?

The Californian motion picture giant Warner Bros. came up with a new movie based around the popular DC character Harley Quinn—girlfriend to the infamous Joker. But the company failed to place the character’s name at the beginning of the movie title; they named it “Birds of Prey: The Fantabulous Emancipation of One Harley Quinn.”

New Harley Quinn: Birds of Prey poster

What did this result in?

Even though they eventually realized that Harley Quinn was an important keyword and changed the movie’s title to “Harley Quinn: Birds of Prey,” it was too late, and the damage was already done. Without a mention of Harley Quinn at the beginning of the title, the movie wasn’t ranking well, so its audience had trouble finding it. While the movie was well received by critics, it underperformed significantly, generating a domestic box office figure of only $33 million, as opposed to expectations of $50 million.

What could have caused this?

This all suggests that the reason for the initial disaster was a total neglect of SEO and other marketing activities, as outlined in an article on The Verge. The core cause of the movie being lost in searches was the absence of proper keyword research and competitor research. The studio failed to analyze what content was already ranking for the queries they were intending to use, and to gauge their chances of outranking competitors.

The solution to the issue

…is what Warner Bros admitted they did. They eventually changed the name to boost SEO and make the movie easier for fans to find online, as part of a “search expansion for ticket sites,” as The Verge reported.

How can you prevent this issue?

Hopefully, the lesson Warner Bros learned here is that—even for movies—SEO is essential. Just like other businesses launching new services or product lines, Warner Bros should have involved SEO specialists in the process. Doing so would have saved them from their own SEO fail.

4. SEMrush launching a marketplace for guest posts

Links play a huge role in a website’s SEO success.

Because of this, people will often do whatever it takes to get powerful links, including paying for them. And that’s OK, really. Google agrees; according to their guidelines, it’s fine to pay for a link, as long as that link carries the nofollow link attribute or the rel="sponsored" attribute. Doing so tells search engines not to let the link pass “value”, while the linked site still benefits from the traffic that the link sends.
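
If you host paid placements yourself, it’s worth auditing your outbound links for these attributes. Here’s a minimal sketch, assuming the third-party beautifulsoup4 package; the HTML is a made-up example:

```python
# A minimal audit that flags outbound links missing a nofollow or sponsored
# attribute, e.g. on pages carrying paid placements. Assumes the third-party
# "beautifulsoup4" package; the HTML is a made-up example.
from bs4 import BeautifulSoup

html = """
<p>Check out <a href="https://partner.example.com" rel="sponsored">this deal</a>
and <a href="https://other-partner.example.com">this one</a>.</p>
"""

soup = BeautifulSoup(html, "html.parser")
for link in soup.find_all("a", href=True):
    # BeautifulSoup exposes rel as a list of tokens, since it's a
    # multi-valued HTML attribute.
    rel = set(link.get("rel") or [])
    if not rel & {"nofollow", "sponsored"}:
        print(f"Paid link without nofollow/sponsored? {link['href']}")
```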

Many people, however, will buy links without the nofollow or sponsored link attributes and try to get away with it. Often they do, but sometimes they don’t and—when caught—may get slapped with a penalty.

What happened?

Despite search engines’ open fight against unnatural links, many websites have been advertising links and articles as a part of their business model.

Here’s a simple example of someone who’s not exactly flying under the radar:

Selling sponsored links online

And recently, one of the biggest SEO tool suites, SEMrush, appeared to join in on this practice by launching a marketplace for guest posts that include links.

The links sold as a part of these guest posts would be deemed “unnatural links” according to Google.

In its guidelines for large-scale article campaigns, Google mentions that it “does not discourage these types of articles in the cases when they inform users, educate another site’s audience or bring awareness to your cause or company. However, what does violate Google's guidelines on link schemes is when the main intent is to build links in a large-scale way back to the author’s site.”

What did this result in?

It was obvious that SEMrush had spent a lot of resources on building their marketplace, but its launch caused a lot of backlash on social media in the SEO scene, with even Google’s John Mueller weighing in. In an intense PR response, SEMrush’s CEO said that the new marketplace feature was not about selling links but about traffic-generating guest posts, and that they had communicated this poorly. Nevertheless, after the comments on social media, SEMrush took the guest post offering down from their marketplace.

The solution to the issue and how to prevent it in the future

When coming up with a new feature like this, a company should first investigate its possible consequences. SEMrush should have checked whether the new product would go against Google’s official guidelines.

5. Marine Le Pen losing to electronic appliances

In a recent case study, Antoine Eripret describes an affiliate website that took advantage of a major French politician’s expired domain. Such domains often retain value from their past, which is why many SEOs are always on the lookout for websites that aren’t maintained and domains that may not be renewed or have already expired.

What happened?

For her 2017 bid to become the president of France, the far-right politician Marine Le Pen used the domain marine2017.fr. After she lost the election to Emmanuel Macron, the domain was left unused, as it no longer served a purpose. Eventually, it expired.

However, the expired domain was actually still carrying a lot of value. Its biggest asset was the tons of authority-passing links it had generated from various media outlets during the presidential race. Marine Le Pen was covered by almost every publisher in France, as well as by foreign media.

An expired domain of Marine Le Pen gets registered as an affiliate website

This didn’t go unnoticed, and someone registered the domain in 2018 to use it as an affiliate comparison website offering electronic appliances such as fridges and meat grinders. Even today in 2020, the domain still ranks very well thanks to all the links it gathered in 2017.

According to Ahrefs, the domain has 81.3k monthly traffic, 365k backlinks, and 1.59k referring domains. Its overall domain rating is 79 out of 100. What can we say except: “Good catch, madame ou monsieur!”

An expired domain of Marine Le Pen still has real value

The solution to the issue and how to prevent it in the future

The fact that the domain has a huge number of backlinks suggests that Marine Le Pen’s marketers didn’t think things through and missed a huge opportunity.

The domain could have been reused to support Marine Le Pen’s future political endeavors, either as a redirect or to hold up-to-date content highlighting her current activities. The lesson of this situation is that you should never get rid of powerful old domains, as they’re a highly valuable asset.

It also shows how profitable a domain hunting business can turn out to be—which explains why so many people are active in this sector.

A domain is a valuable asset; don’t let yours expire, and keep an eye out for good expired domains that you come across. There are countless examples of a business being built on top of an expired domain. It’s a common practice, especially in the affiliate game.
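
Keeping an eye on your own portfolio’s expiry dates is easy to automate. Here’s a minimal sketch, assuming the third-party python-whois package (imported as whois); the domains and 60-day threshold are placeholders:

```python
# A minimal expiry watch for your own domains, assuming the third-party
# "python-whois" package (imported as "whois"). Domains and the 60-day
# threshold are placeholders; some registries return lists of dates or
# timezone-aware datetimes, so adjust as needed.
from datetime import datetime, timedelta

import whois

DOMAINS = ["example.com", "example.org"]
WARN_WITHIN = timedelta(days=60)

for domain in DOMAINS:
    expires = whois.whois(domain).expiration_date
    if isinstance(expires, list):  # some registries return several dates
        expires = min(expires)
    if expires and expires - datetime.now() < WARN_WITHIN:
        print(f"{domain} expires soon ({expires:%Y-%m-%d}), renew it!")
```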

Closing thoughts

As you can see from the five examples above, no one is safe from the risk of SEO disasters, not even the biggest companies in the world. But looking at these mistakes and learning how they could have been prevented is a great way to cut down on your own future failures.

Keep in mind that even a seemingly minor change to your website can have harsh consequences. If you’re struggling with any SEO-related issues, you can always seek answers in our Academy section, where we strive to cover every aspect of this super-tricky field.

Steven is ContentKing's VP of Community. This means he's involved in everything community and content marketing related. Right where he wants to be. He gets a huge kick out of making websites rank and loves to talk SEO, content marketing and growth.