News Roundup: Bing Launches Enhanced Robots.txt Testing Tool
Bing Kicks Off Enhanced Robots.txt Testing Tool
Making sure your website’s robots.txt file doesn’t keep search engines from accessing your content is vital to your SEO performance. An improperly configured robots.txt file can be a deal-breaker for your site’s visibility, and therefore for its organic traffic.
To help you set up this crucial file correctly, Bing Webmaster Tools has now made it easier to quickly check whether URLs are accessible to Bingbot—and other robots.
According to Bing’s official statement (opens in a new tab), they have enhanced the Robots.txt Testing Tool, which lets you check your file’s validity and highlights any issues that could keep your website from being crawled optimally by Bing.
In addition, it lets you check the accessibility of individual URLs. To do this, submit the URL in question to the new Test URL box, which operates as Bingbot and BingAdsBot would: it checks the URL against the robots.txt file and reports whether it is allowed or blocked.
If there are any crawling issues with the checked URL, you can test possible solutions in the editor directly. The system will then test accessibility against the new conditions. Once the file is fixed and the URL is working as intended, you can download the changed robots.txt file directly from Bing’s Robots.txt tester and upload it to your website.
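The allow/block check at the heart of such a tester can be reproduced with Python’s standard library. This is a minimal sketch using an inline example ruleset (the rules and URLs below are illustrative, not Bing’s actual implementation):

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
# parse() accepts the robots.txt file as an iterable of lines;
# here we use a small example ruleset instead of fetching a live file.
parser.parse([
    "User-agent: bingbot",
    "Disallow: /private/",
    "",
    "User-agent: *",
    "Disallow: /tmp/",
])

# can_fetch(useragent, url) mirrors what a robots.txt tester reports
# for a given crawler and URL.
print(parser.can_fetch("bingbot", "https://example.com/private/page"))  # False
print(parser.can_fetch("bingbot", "https://example.com/blog/post"))     # True
```

Note that `urllib.robotparser` implements the classic robots.txt rules; Bing’s tester additionally validates syntax and understands Bing-specific user agents such as BingAdsBot.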
Monitor your robots.txt file for changes
As hinted above, a malfunctioning robots.txt file can seriously hurt your SEO performance. Even giants can stumble: Ryanair once learned a hard lesson about setting up robots.txt incorrectly.
Sometimes the file is updated without your knowledge: by a developer, by a site owner whose website you manage as an agency, or even by a WordPress plugin update. That’s why you need to monitor your robots.txt file continuously to keep your website from losing SEO ground.
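The core idea behind such monitoring can be sketched in a few lines: fingerprint the file’s content and alert when the fingerprint changes. This is only an illustration of the principle; a monitoring service handles the fetching, scheduling, and alerting for you:

```python
import hashlib

def robots_fingerprint(content: bytes) -> str:
    """Return a hash that changes whenever the robots.txt content changes."""
    return hashlib.sha256(content).hexdigest()

# Yesterday's file allowed everything; today's suddenly blocks the whole site.
old = robots_fingerprint(b"User-agent: *\nDisallow:\n")
new = robots_fingerprint(b"User-agent: *\nDisallow: /\n")

if old != new:
    print("robots.txt changed - review the new rules before crawlers revisit")
```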
Don’t let your website disappear from Google and Bing because of just one mistake. Let ContentKing watch for any sudden changes to your robots.txt file.
Googlebot is now crawling under the Chrome 85 banner
In August of this year, Google released a new version of its browser—Chrome 85. The updated Chrome brought Windows, Mac, Linux, Android, and iOS users notable PDF improvements, an improved tab manager, roughly 10% faster page loading, and a heap of developer features.
Now, as Valentin Pletzer pointed out on Twitter on September 4, it appears Google has begun using Chrome 85 for crawling:
Since then, many users have found the Chrome 85 user-agent in their logs, all suggesting that Google has been using the new version for crawling since the beginning of September 2020, or perhaps even the end of August.
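You can spot the rendering-engine version in your own access logs with a simple pattern match. The sample log line below is made up, but the user-agent string follows Googlebot’s documented evergreen format:

```python
import re

# A fabricated access-log line; the user-agent follows the evergreen
# Googlebot format, where the Chrome version reflects the rendering engine.
sample_log_line = (
    '66.249.66.1 - - [04/Sep/2020:10:00:00 +0000] "GET / HTTP/1.1" 200 1234 '
    '"Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; '
    'Googlebot/2.1; +http://www.google.com/bot.html) Chrome/85.0.4183.83 '
    'Safari/537.36"'
)

# Capture the major Chrome version from a Googlebot user-agent string.
match = re.search(r"Googlebot/[\d.]+;.*?Chrome/(\d+)", sample_log_line)
if match:
    print(f"Googlebot is crawling with Chrome {match.group(1)}")
```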
This observation goes hand in hand with Google’s announcement from May 2019 that Googlebot is now evergreen (opens in a new tab).
“Googlebot now runs the latest Chromium rendering engine…Moving forward, Googlebot will regularly update its rendering engine,” Google said back then.
While some users noticed their websites being crawled with the Chrome 85 user-agent, others had an even bigger surprise. For instance, Serge Bezborodov saw that his website was being crawled with Chrome 87.
This suggests that Google has been testing version 87, which was not yet a stable release at the time. Reportedly, v87 was slated for a stable release later in the year.
Keep up to date!
Want to be kept up-to-date on all things SEO?
Sign up for our newsletter and follow us on social media: