Use robots.txt to Increase Search Engine Rankings

According to a Link Building Blog post by Neil Patel, he uses a robots.txt file to keep junk pages and duplicate content out of the search engine indexes. As a result, his website traffic went up 11.3%!

Here are the things he did with his website:

  • Removed comment feeds from the search results so that duplicate comment text is not indexed.
  • Removed trackback URLs from the index because they were causing blank pages to be indexed.
  • Denied search bots access to his blog installation folder (MovableType).

Here is Neil Patel’s robots.txt file content:

User-agent: *
Disallow: /mt
Disallow: /*.xml$
Disallow: /*.cgi$

A simple file with only four lines of code! The /mt rule keeps bots out of the MovableType installation folder, while the .xml and .cgi patterns take care of the comment feeds and trackback URLs. Note that this file is for a MovableType blog only. For WordPress and phpBB users, askApache has prepared ready-made robots.txt files. Great!
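To give you an idea, here is a minimal sketch of a WordPress-style robots.txt. This is not the askApache file itself, just an illustration assuming the default WordPress folder names (wp-admin, wp-includes, wp-content):

User-agent: *
# Keep bots out of the admin and core folders
Disallow: /wp-admin/
Disallow: /wp-includes/
# Plugin and theme files make poor search results
Disallow: /wp-content/plugins/
Disallow: /wp-content/themes/
# Avoid duplicate content from trackbacks and feeds
Disallow: /trackback/
Disallow: /feed/

If you want to use something like this, grab the real file from askApache instead; the paths above are only the common defaults.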

My two cents

A robots.txt file is a plain text file that controls search bot access to your website. I wonder why SEO gurus seldom talk about it. I will try this tip and see the result. :)
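If you have never written one, the syntax is just pairs of User-agent and Disallow lines. A quick sketch (the /private/ folder is only a made-up example):

# Let every crawler index everything (an empty Disallow allows all)
User-agent: *
Disallow:

# Keep Googlebot out of one hypothetical folder
User-agent: Googlebot
Disallow: /private/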

But please be careful when you edit your robots.txt file. A single wrong line can block ALL search bots from indexing your website!
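For example, this innocent-looking file, a sketch of what NOT to upload, tells every bot to stay away from your whole site:

# DANGER: a bare slash disallows the entire site
User-agent: *
Disallow: /

Double-check every Disallow line before you upload the file. An empty Disallow: (no slash) means the opposite: allow everything.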
