Robots.txt File Issues and How to Fix Them

January 11, 2016 · SEO

Unfortunately, in the world of search engine optimization, a number of problems can arise without you noticing until it is too late. This is especially true of robots.txt files: a bad directive can slowly eat away at your website traffic and rankings until your major web pages no longer show up in Google search results.

Because the decline is gradual, webmasters usually don't notice it until the damage is done. If you see a drop in traffic or Google rankings, it is worth double-checking your robots.txt file using the suggestions below.

Robots.txt File Issues

First, go to Google Search Console and use the robots.txt tester to see whether any URLs are being blocked. You can also compile a list of URLs that Google previously indexed and crawl them to catch any additional URLs that are being wrongly blocked, as in the sketch below.

[Image: robots-site-command]
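If you'd rather check a batch of URLs programmatically, Python's standard urllib.robotparser module can evaluate each one against your live robots.txt. Here is a minimal sketch; the domain and URL list are placeholders, so substitute your own previously indexed pages.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical domain and previously indexed URLs -- replace with your own.
ROBOTS_URL = "https://www.example.com/robots.txt"
PREVIOUSLY_INDEXED = [
    "https://www.example.com/",
    "https://www.example.com/blog/",
    "https://www.example.com/products/widget",
]

parser = RobotFileParser(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

for url in PREVIOUSLY_INDEXED:
    if not parser.can_fetch("Googlebot", url):
        print(f"BLOCKED for Googlebot: {url}")
```

Any URL this prints is one your own robots.txt is telling Googlebot not to crawl, which gives you a concrete list to investigate.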

Once you have identified the blocked URLs, it is time to diagnose and fix the problem. In most cases it comes down to one of two directive issues. First, check the case of your directives: the paths they match are case-sensitive, so a rule written in the wrong case will block (or fail to block) different URLs than you intended. Second, make sure your CMS provider is not adding directives without your knowledge.

[Image: robots-tester-blocked]

Again, even seemingly small alterations to your directives can cause big setbacks. Either of these problems can stop Google from crawling parts of your site, resulting in a drop in rankings and traffic. My best advice for site owners is to learn to spot these issues before they do serious damage to your website.

[Image: robots-case-sensitive]
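To see the case-sensitivity problem in action, here is a small sketch using Python's built-in urllib.robotparser. The /Blog/ rule and sample paths are made up for illustration; the point is that a Disallow rule only matches paths with the same case.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical rule set where the path is capitalized.
rules = "User-agent: *\nDisallow: /Blog/"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths are matched case-sensitively, so only the exact-case
# variant is actually blocked.
print(parser.can_fetch("*", "/Blog/post"))  # False -- blocked
print(parser.can_fetch("*", "/blog/post"))  # True  -- still crawlable
```

If your actual URLs are lowercase but your directive is capitalized (or vice versa), the rule silently does something different from what you intended.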

The most important thing you can do to prevent robots.txt issues is to crawl and audit your site frequently, which lets you spot blocked URLs early. It is equally important to review Google Search Console regularly, so you find problems before they start hurting your website.
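One lightweight way to audit on a schedule is to keep a known-good copy of your robots.txt and diff the live file against it, which catches directives a CMS or plugin quietly adds. Below is a minimal sketch; the URL and baseline filename are placeholders, and you would run it periodically (for example, from cron).

```python
import difflib
import urllib.request
from pathlib import Path

# Hypothetical locations -- point these at your own site and a
# saved known-good copy of the file.
ROBOTS_URL = "https://www.example.com/robots.txt"
BASELINE = Path("robots_baseline.txt")

live = urllib.request.urlopen(ROBOTS_URL).read().decode("utf-8")

if not BASELINE.exists():
    # First run: store the current file as the trusted baseline.
    BASELINE.write_text(live)
    print("Baseline saved; rerun later to check for changes.")
else:
    baseline = BASELINE.read_text()
    if live != baseline:
        # Show exactly which directives were added or removed,
        # e.g. by a CMS or plugin update.
        for line in difflib.unified_diff(
            baseline.splitlines(), live.splitlines(),
            fromfile="baseline", tofile="live", lineterm="",
        ):
            print(line)
    else:
        print("robots.txt unchanged.")
```

Any unexpected line in the diff is a directive you didn't add yourself and should investigate before it starts blocking pages.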