Monday, December 30, 2013

Create and Add custom robots.txt file in Blogger

In this tutorial I am going to show you how to enable and add a custom robots.txt file to your Blogger blog for better search engine optimization. Robots.txt is a simple plain text file that sits at the root of every blog and tells search engine robots which pages they may crawl. Search crawlers check the robots.txt file before crawling your web pages, so customizing it helps you control how your blog is indexed, which is important for SEO. Under Search Preferences you will see two options, Custom robots.txt and Custom robots header tags; this tutorial covers Custom robots.txt.


How to set up and enable a custom robots.txt file
Step 1 : Go to the Blogger dashboard >> Settings >> Search Preferences
Step 2 : Click on 'Custom robots.txt'
Step 3 : Click Edit and select Yes.
Step 4 : Copy and paste the following code into the box provided.
Step 5 : Replace "yourblogname.com" in the Sitemap line with your blog's homepage URL.
Step 6 : Click on Save Changes.
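
The code referred to in step 4 is the custom robots.txt template commonly used for Blogger blogs, and it is the one the explanations below walk through. The sitemap line points to the sitemap.xml file that Blogger generates automatically; "yourblogname.com" is only a placeholder for your own domain.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblogname.com/sitemap.xml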

Mediapartners-Google – Mediapartners-Google is the user agent for Google AdSense; it crawls your pages so that AdSense can serve ads relevant to your content. If you disallow this user agent, AdSense will not be able to show ads on the blocked pages.
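
For example, a section like the one below would block the AdSense crawler completely and ads would stop appearing on those blocked pages, which is why the template above leaves its Disallow line empty:

User-agent: Mediapartners-Google
Disallow: /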

User-agent: * – The User-agent line addresses a robot by name, and the asterisk (*) means the rules that follow apply to all search engine robots, such as Google, Yahoo and Bing.
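
If you ever need rules for one particular robot instead of all of them, name it in the User-agent line. For example, the section below would apply only to Google's web search crawler, Googlebot, and to no other robot:

User-agent: Googlebot
Disallow: /search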

Disallow: /search – This line tells search engine crawlers not to crawl your search and label pages, that is, any URL on your blog that starts with /search.
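
With Disallow: /search in place, search result and label pages like the ones below stay out of the index, while your normal posts and pages are still crawled (the label name SEO here is just an example):

http://yourblogname.com/search?q=seo
http://yourblogname.com/search/label/SEO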

Allow: / – This line explicitly allows search engines to crawl your homepage and every other page that is not blocked by a Disallow rule.
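
If you also want to keep one specific page out of search results while still allowing the rest of the blog, you can add an extra Disallow line to the same User-agent: * section; the page path below is only a placeholder for a real page on your own blog:

User-agent: *
Disallow: /search
Disallow: /p/private-page.html
Allow: /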
