How to Set Up a Custom SEO-Friendly Robots.txt in Blogger

How to set up robots.txt on Blogspot - I used to hear the term robots.txt often without really understanding it. What is robots.txt? What does it do? Do we even need it? Starting from those questions, I tried to learn about robots.txt, and now I understand how important it is to set it up properly. Were you in the same position as me?


After yesterday's post on how to set header tags for Blogger, this time I am back with a post on how to set up a custom robots.txt for Blogger. Maybe you are used to the default Blogger robots.txt, or maybe you have never changed it to a more SEO-friendly version. By replacing the default Blogger robots.txt, you can control which parts of your blog search engines are allowed to index and which parts they are not.

In Blogger, robots.txt goes by the name custom robots.txt, and on this occasion I will discuss what robots.txt is used for and what its benefits are. For the record, I am not an expert in this field; I am still learning it myself, and if you want to know about the overall usefulness of robots.txt, you can read about it on Moz.com.

The one thing you must keep in mind about robots.txt is Blogger's own warning: "Use it with care. Improper use of this feature may result in your blog being ignored by search engines." If that makes you nervous or you are not sure what to do, you can follow the tutorial below on how to set up a custom Blogger robots.txt. Please read on.

What is Robots.txt?
Robots.txt is a file that blog owners use to tell search engines which pages they are not allowed to index. It also works as a control over pages that should not be indexed by search engines and social media sites such as Facebook, Twitter, and others. If you want more detail, you can read about it on Wikipedia (the Indonesian-language article).
How to Set Up Robots.txt on Blogger
Every Blogger blog already has a robots.txt file built in. By default, the robots.txt on Blogspot looks like this:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://compyku.blogspot.com/feeds/posts/default?orderby=UPDATED

Let's go through the code above line by line.

1. User-agent: Mediapartners-Google: This user agent belongs to Google and identifies the Google AdSense robot (Google's advertising partner crawler). Leaving its Disallow empty lets the AdSense robot crawl your pages so it can show ads that are relevant to your blog's niche.

2. Disallow: An empty Disallow means nothing is blocked for that user agent; the robot above may crawl everything.

3. User-agent: *: The asterisk stands for all search engine robots; the rules that follow apply to every crawler.

4. Disallow: /search: Robots are not allowed to crawl anything under the /search path, such as label and search-result pages. That means any link that has /search right after the domain name will be ignored; for example, a URL like /search/label/seo will not be indexed (see the example URLs after this list).

5. Allow: /: All pages may be crawled except those blocked above. The "/" refers to the root of the blog, which means the robot may crawl our blog's homepage.

6. Sitemap: http://compyku.blogspot.com/feeds/posts/default?orderby=UPDATED: This is the sitemap, i.e. the blog's feed address. It points robots to the list of posts on the blog, which makes it easier for them to crawl and index every article that is published.
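To make points 4 and 5 clearer, here is a small illustration of how the default rules apply. The blog address and post name below are only examples, not real pages:

http://yourblog.blogspot.com/ - allowed (Allow: /)
http://yourblog.blogspot.com/2017/05/my-first-post.html - allowed (Allow: /)
http://yourblog.blogspot.com/search/label/seo - blocked (Disallow: /search)
http://yourblog.blogspot.com/search?q=blogger - blocked (Disallow: /search)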

Note: The sitemap above only tells web crawlers about the 25 most recently published articles. If you want it to cover more than that, you can change it to the following sitemap:

Sitemap: http://compyku.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

The sitemap above tells crawlers about your articles up to the 500 mark. If you have more than 500 articles, you can use the sitemap below as well:

Sitemap: http://compyku.blogspot.com/atom.xml?redirect=false&start-index=500&max-results=1000
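As far as I know, robots.txt also allows more than one Sitemap line, so a blog with more than 500 articles can simply list both lines together and crawlers will read them one after the other:

Sitemap: http://compyku.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://compyku.blogspot.com/atom.xml?redirect=false&start-index=500&max-results=1000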

That is the explanation of the robots.txt code. If you want to modify your robots.txt, let me stress it again: be careful. The robots.txt I currently use on this site is as follows; if you want to try the same setup I use, please use this code.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /p/about.html
Allow: /

Sitemap: https://www.buatblog.net/feeds/posts/default?orderby=updated

I use the robots.txt code above because it has never caused errors on my site, so I chose the simple version. As for the sitemap, I use the default feed because this site still has fewer than 40 articles, so I do not need the 500-article version yet.

The code below is meant to prevent duplicate content on your blog, which can happen because of access from mobile phones (Blogger appends ?m=1 or ?m=0 to mobile URLs). You can prevent it by setting your Blogger robots.txt with the following code (an illustration of the URLs involved follows after the code):

User-agent: Mediapartners-Google
Disallow:

User-agent: Googlebot
Disallow: /search
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0

User-agent: *
Disallow: /search

Sitemap: http://compyku.blogspot.com/feeds/posts/default?orderby=UPDATED
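As an illustration (the post address below is made up, not a real article), these are the mobile URLs the rules above keep Googlebot away from, so only the main URL gets indexed:

http://compyku.blogspot.com/2017/05/example-post.html - crawled and indexed
http://compyku.blogspot.com/2017/05/example-post.html?m=1 - blocked by Disallow: /*?m=1
http://compyku.blogspot.com/2017/05/example-post.html?m=0 - blocked by Disallow: /*?m=0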

Adding a Custom Robots.txt to Your Blogspot / Blogger
Now for the main part of this tutorial: how to add a custom robots.txt in Blogger. Here are the steps for setting up robots.txt on your Blogspot.
1. Log in to your Blogger blog.
2. Navigate to Settings >> Search Preferences >> Crawlers and Indexing >> Custom robots.txt >> Edit >> Yes.
3. Paste your robots.txt code into the box. You can use one of the robots.txt examples above.
4. Click the Save Changes button.
5. And you are finished!
You can check your robots.txt settings by going to Google Webmaster Tools and clicking Crawl >> robots.txt Tester. See the picture below:

robots.txt tester
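Another quick way to check is to open the robots.txt file directly in your browser; every Blogger blog serves it at the /robots.txt address of the blog. For the blog used in the examples above, that would be:

http://compyku.blogspot.com/robots.txt

If what you see there matches the code you pasted, your custom robots.txt is already active.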

Closing words: I have tried to explain how to set up robots.txt in Blogger as carefully as I can so that no fatal error happens to any reader. If anyone understands robots.txt better and spots a mistake of mine, please tell me in the comments below. Thanks.
