Use robots.txt to Increase Search Engine Rankings
According to a Link Building Blog post by Neil Patel, he uses a robots.txt file to keep junk pages and duplicate content out of the search engines' indexes. As a result, his website traffic went up 11.3%!
Here are the things he did with his website:
- Removed comment feeds from the search results so that no duplicate comment text gets indexed.
- Stopped trackback URLs from being indexed, because they were causing blank pages to end up in the index.
- Denied search bots access to his blog installation folder (Movable Type).
Here is Neil Patel’s robots.txt file content:
User-agent: *
Disallow: /mt
Disallow: /*.xml$
Disallow: /*.cgi$
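For the record, here is how I read those lines against the list above (my own annotation, not Neil's):

User-agent: *
# /mt is the Movable Type installation folder
Disallow: /mt
# the .xml pattern catches the comment feed files
Disallow: /*.xml$
# the .cgi pattern catches the trackback scripts
Disallow: /*.cgi$

Note that the wildcard (*) and end-of-URL ($) patterns are extensions understood by the major bots like Googlebot, not part of the original robots.txt standard.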
A simple file with just four lines of code! However, it is for Movable Type blogs only. For WordPress and phpBB users, askApache has prepared ready-made robots.txt files. Great!
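Just as a rough sketch (my own illustration, not askApache's actual file), a WordPress robots.txt chasing the same goals of keeping feeds, trackbacks, and the core folders out of the index might look something like this:

User-agent: *
# WordPress core folders that offer nothing worth indexing
Disallow: /wp-admin/
Disallow: /wp-includes/
# duplicate-content URLs created by feeds and trackbacks
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/feed/

Check each pattern against your own permalink structure before using it; blog setups differ.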
My two cents
A robots.txt file is a plain text file that controls search bots' access to your website. I wonder why SEO gurus seldom talk about it. I will try this tip and see the result. :)
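If you have never written one: the file sits at the root of your site (e.g. yoursite.com/robots.txt) and is built from simple User-agent and Disallow lines. A minimal sketch, with placeholder folder names:

# rules for every crawler
User-agent: *
Disallow: /drafts/

# a separate group for one specific bot
# (a bot obeys the most specific group that matches it, not both)
User-agent: Googlebot
Disallow: /test/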
But please be careful when you apply robots.txt: one wrong line can block ALL search bots from indexing your website!
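For example, a single character after Disallow is all it takes to tell every bot to stay away from your entire site:

User-agent: *
# the lone slash blocks the WHOLE site from all crawlers
Disallow: /

Compare it with an empty "Disallow:" line, which allows everything; one slash is the difference between full indexing and none.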
Interesting! I will try this also. I would be glad if you could post your results after some time, ok?
How are the results?
My blog feed URL ranks lower now. It was on the 1st page before; now it is on the 2nd page.
Would you mind posting the robots.txt that you use?
Does it bring more traffic to your website?
There is a new and updated robots.txt example at http://www.askapache.com/seo/wordpress-robotstxt-optimized-for-seo.html