
Setup a Custom Robots.txt File for Blogger/Blogspot

Learn how to create a custom robots.txt file for your Blogger/Blogspot blog. You can easily configure robots.txt with settings that help your Blogger/Blogspot blog get better SEO results.

If you have read my earlier post, I hope you are already aware of the importance of robots.txt for search rankings and SEO.

Add Robots.txt to Blogger or Blogspot

Today, I am back with a very useful blogging term that every blogger must be aware of: robots.txt.

What is Robots.txt?

Robots.txt is a text file containing a few lines of simple code.

It is saved on your website or blog’s server, and it instructs web crawlers on how to crawl and index your blog in search results.

That means you can restrict any web page on your blog from web crawlers so that it does not get indexed by search engines, for example your blog’s label pages, a demo page, or any other page that is not important enough to be indexed.

Always remember that search crawlers scan the robots.txt file before crawling any web page.

Each blog hosted on Blogger has a default robots.txt file, which looks something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED

Explanation 

This code is divided into three sections. Let’s first study each of them, and then we will learn how to add a custom robots.txt file to a Blogspot blog.

User-agent: Mediapartners-Google 

This code is for the Google AdSense robots and helps them serve better ads on your blog. Whether or not you use Google AdSense on your blog, simply leave it as it is.

User-agent: * 

This is for all robots, marked with an asterisk (*). In the default settings, our blog’s label links are restricted from being indexed by search crawlers, which means the web crawlers will not index our label page links because of the code below.

Disallow: /search 

That means links that have the keyword search right after the domain name will be ignored. See the example below, which is the link of a label page named SEO.
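
For instance, on a Blogger blog a label page for a label named SEO typically sits under the /search/label/ path (the blog address here is just a placeholder):

http://example.blogspot.com/search/label/SEO

Because the path starts with /search, the Disallow: /search rule above keeps it out of the index.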

And if we remove Disallow: /search from the code above, then crawlers will access our entire blog and index and crawl all of its content and web pages.
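
As a minimal sketch, the section for all robots would then simply read as below, allowing crawlers to reach everything under the root:

User-agent: *
Allow: /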

Here Allow: / refers to the homepage, which means web crawlers can crawl and index our blog’s homepage.

Disallow Particular Post

Now suppose we want to exclude a particular post from indexing; then we can add the lines below to the code.

Disallow: /yyyy/mm/post-url.html

Here yyyy and mm refer to the publishing year and month of the post respectively. For example, if we published a post in March 2013, then we have to use the format below.

Disallow: /2013/03/post-url.html

To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.
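
For example, if the full post URL were the hypothetical address http://example.blogspot.com/2013/03/my-seo-tips.html, stripping the blog name leaves just the path for the rule:

Disallow: /2013/03/my-seo-tips.html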

Disallow Particular Page

If we need to disallow a particular page, we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:

Disallow: /p/page-url.html
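
As a concrete sketch with a made-up page name, a static Blogger page living at the hypothetical URL http://example.blogspot.com/p/contact.html would be blocked with:

Disallow: /p/contact.html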

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED 

This code refers to the sitemap of our blog. By adding the sitemap link here, we are simply optimizing our blog’s crawling rate.

This means that whenever the web crawlers scan our robots.txt file, they will find a path to our sitemap, where all the links of our published posts are present.

Web crawlers will then find it easy to crawl all of our posts, so there is a better chance that they crawl every blog post without ignoring a single one.

Note: This sitemap will only tell web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, then replace the default sitemap with the one below. It will work for the 500 most recent posts.

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
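
Put together, a file using this larger sitemap would look roughly like this (example.blogspot.com is just a placeholder for your own blog address):

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500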

If you have more than 500 published posts on your blog, then you can use two sitemaps, like below:

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500

Why Is the Robots.txt File Important?

Well, the success of any professional blog usually depends on how the Google search engine ranks your blog. We store a number of posts, pages, files, and directories in our website structure, and often we don’t want Google to index all of these components. For example, you may have a file meant for internal use that is of no use to search engines, and you don’t want it to appear in search results. Therefore, it is prudent to hide such files from search engines.

The robots.txt file contains directives that all the top search engines honor. Using these directives, you can instruct web spiders to ignore certain portions of your website/blog.

Custom Robots.txt for Blogger/Blogspot

Because Blogger/Blogspot is a free blogging service, the robots.txt of your blog used to be outside your direct control. But now Blogger has made it possible to make changes and create a custom robots.txt for each blog. The robots.txt for a Blogger/Blogspot blog typically looks like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED

Add Custom Robots.txt File on Blogger/Blogspot

  • Go to your Blogger dashboard
  • Open Settings > Search Preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes
  • Here you can make changes in the robots.txt file (see the example below)
  • After making changes, click the Save Changes button
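
As a reference, here is a sketch of a complete custom robots.txt you could paste into that box, assuming the placeholder address example.blogspot.com and a hypothetical /p/contact.html page you want to keep out of the index:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/contact.html
Allow: /

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

Replace the blog address with your own, and drop the Disallow: /p/contact.html line if there is no page you want to hide.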

View the Existing Custom Robots.txt File

In order to view the existing custom robots.txt for your blog, go to the following URL:

http://yourblog.blogspot.com/robots.txt

Example: https://premiumids.blogspot.com/robots.txt

Needless to say, please replace yourblog with the name of your blog.

Sitemap

The sitemap is a very important file for your website/blog. It contains the structure of your website and helps web crawlers find their way through your blog. The Sitemap: directive tells the crawler the location of your sitemap file. In the case of Blogger/Blogspot, you can leave this line as it is.

This is it! Should you have any questions regarding the custom robots.txt file for Blogger/Blogspot, do let me know in the comments section. I will try my best to assist you. Thank you for connecting.
