A few posts back I shared a tutorial on how to set the search preferences in Blogger, and at the end
of that post I promised that in an upcoming post I would cover the
Blogger robots.txt. So today I bring you a complete tutorial on the
custom robots.txt file. It is a simple text file (not
HTML) that tells search engine bots which pages to
index and which to skip. Search engine bots and other crawlers usually obey these
instructions.
This is a very important factor in SEO. With the help of this text file, you can block any irrelevant URLs on your blog, such as label pages, demo pages, or any other page that is not important to index in search engines.
Default Set Up Robots.txt File for Blogger
In Blogger we use a default robots.txt like the one below. First, I want to explain the terms of this file, because it is very important to know what you are doing in your blog. If you make a mistake when adding this file, search engines can ignore your blog or reduce its ranking. Below I have explained all the terms of the robots.txt file.
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
User-agent: Mediapartners-Google
Mediapartners-Google is the name of the Google AdSense crawler, and this line is
only useful for blogs that display Google AdSense ads. It
helps AdSense show ads matched to your niche on your blog. If you are not using Google
AdSense on your blog, you don't need to keep this line.
User-agent: *
When you mark User-agent with an asterisk (*), it means the rules that follow apply to all crawling bots, such as Googlebot, Bingbot, and the bots of other search engines and directories.
Disallow: /search
This rule blocks every URL that has the keyword search right after the domain name. In Blogger, all label and search-page URLs have this structure, so this line blocks all label and search pages from being crawled.
Allow: /
This rule is specifically for the homepage: it invites the search engine
spiders to crawl and index your blog's home page.
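To see these rules in action, here is a small sketch using Python's standard urllib.robotparser module. It parses the catch-all (User-agent: *) section of the default file and checks which URLs a crawler may fetch; the blog address is just the placeholder domain from the example above:

```python
from urllib import robotparser

# The catch-all section of Blogger's default robots.txt
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The homepage is allowed; label/search pages are blocked
print(rp.can_fetch("*", "http://example.blogspot.com/"))                  # True
print(rp.can_fetch("*", "http://example.blogspot.com/search/label/SEO"))  # False
```

Note that this only simulates how a well-behaved crawler interprets the rules; it does not contact your blog.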
Sitemap:
The sitemap helps present your content to search engines: it
invites the search bots to crawl your blog posts quickly, and it also notifies
the search engines when you make any change in the blog.
How To Disallow a Post Or Page Using the Robots.txt File?
If you want to prevent search engines from crawling any page or post, you can do it with the custom robots.txt file. Simply add a Disallow: line after User-agent: *, followed by "/" and the rest of your post or page's URL without the main domain name, like below.

For post
Disallow: /2015/04/your-post-url.html
For page
Disallow: /p/your-page-url.html
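You can sanity-check such a rule the same way, again with Python's standard urllib.robotparser; the post URLs below are placeholders in the same style as the examples above, not real addresses:

```python
from urllib import robotparser

# A catch-all section that also blocks one specific post
rules = """\
User-agent: *
Disallow: /search
Disallow: /2015/04/your-post-url.html
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallowed post is refused; other posts stay crawlable
print(rp.can_fetch("*", "http://example.blogspot.com/2015/04/your-post-url.html"))  # False
print(rp.can_fetch("*", "http://example.blogspot.com/2015/05/another-post.html"))   # True
```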
How To Generate a Robots.txt File?
After reading this tutorial, I hope you will be able to create a robots.txt file. It is very easy, and you don't need to learn any special language or techniques, so you can write the file manually or use an online robots.txt generator tool.

Adding Custom Robots.txt File In Blogger
Once you have created the robots.txt file, you can easily add it in Blogger.
- Go to your Blogger dashboard.
- Select Settings > Search preferences > Crawlers and indexing > Custom robots.txt > Edit > Yes.
- Paste your robots.txt code in it.
- Click the Save Changes button.
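For reference, a complete custom robots.txt that you could paste into that box might look like the following. The blocked page URL is the placeholder from the earlier example, and you would replace example.blogspot.com with your own blog's address:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/your-page-url.html
Allow: /

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```

After saving, you can confirm the result by opening yourblog.blogspot.com/robots.txt in a browser.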