How to Set a Custom Robots.txt the Right Way

Hi, buddy blogger! A robots.txt file contains rules that tell search engine robots how they should crawl and index your website or blog. Blogger exposes a few very basic, very simple SEO settings in your blog's dashboard, and one of them lets you set a custom robots.txt for your blog as needed.

When a search engine crawler visits a website or blog, the first thing it looks at is the robots.txt file. As a Blogger platform user, you now have the option to control search engine crawler bots and decide which pages of your website or blog they crawl and index.

Every blog on the Blogger platform comes with a default robots.txt, but with recent changes in Blogger you can modify it as needed. So in this post you will learn what the default robots.txt looks like and how to add or edit a custom robots.txt for your blog.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

The default robots.txt above is the same for every blog. If this setting already meets your blog's SEO needs and your blog already gets plenty of visitors, you do not need to replace it, because changing robots.txt can significantly affect your blog. Even so, it is worth knowing what each line does, because there may be rules you need that this default does not include.

To check your blog's robots.txt, simply append /robots.txt to your blog's URL in the browser and press Enter.
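As a quick sketch, the same robots.txt address can also be built programmatically; the blog URL below is only a placeholder, so substitute your own:

```python
from urllib.parse import urljoin

# Placeholder blog address: replace with your own blog's URL.
blog_url = "https://example.blogspot.com/"
robots_url = urljoin(blog_url, "/robots.txt")
print(robots_url)  # https://example.blogspot.com/robots.txt
```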

How to Add or Edit a Custom Robots.txt in Blogger

1. Go to Dashboard > Settings > Search Preferences > Crawlers and Indexing.
2. Next to 'Custom robots.txt', click 'Edit' > click 'Yes'.
3. Paste the custom robots.txt you have prepared. (See the example picture below.)
4. Click 'Save changes' to save your custom robots.txt.

That is how to insert or edit a custom robots.txt from the dashboard. Now let's look at how to write a good custom robots.txt for a Blogger blog; remember to tailor it to your needs and apply only the rules you actually require.

A custom robots.txt tailored to a blog's needs:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/*
Disallow: /view/*
Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0
Allow: /

Sitemap: http://www.mycurcol/sitemap.xml
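If you want to sanity-check rules like these before publishing, Python's standard-library urllib.robotparser can parse a rules string and answer allow/deny questions locally. Note two assumptions in this sketch: the blog URL is a placeholder, and the stdlib parser follows the original robots.txt spec without Google-style "*" wildcard support, so only the plain-prefix rules are exercised here:

```python
import urllib.robotparser

# A minimal subset of the rules above, using only plain-prefix paths
# (the stdlib parser does not understand wildcard lines like /p/*).
rules = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Label/search pages are blocked for generic crawlers...
print(rp.can_fetch("*", "https://example.blogspot.com/search/label/SEO"))   # False
# ...while ordinary post URLs stay crawlable.
print(rp.can_fetch("*", "https://example.blogspot.com/2020/01/post.html"))  # True
```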

The example above is a custom robots.txt that decides which pages should be indexed and which should be blocked. Now I will explain what each line of the code above does and when you might apply it in your own custom file.

User-agent: Mediapartners-Google
Disallow: These lines tell Google's AdSense crawler that it may crawl every page of your site. If you run AdSense ads, this matters: letting the AdSense bot visit all pages of your site is in line with the AdSense guidelines, so it is best to leave these lines in.

User-agent: * This line applies the rules below it to all crawler bots for every page of your blog; the lines that follow it determine which pages are indexed and which are blocked.

Disallow: /search This line tells crawler bots not to crawl or index any URL containing /search, which covers the Label and Archive pages. This is important for your blog's SEO: those pages are not unique URLs worth indexing, and blocking them helps you avoid duplicate content on your blog.

Disallow: /p/* This line blocks robots from crawling your blog's static pages. If you want those pages to remain indexed by crawlers, delete this line.

Disallow: /view/* This line stops robots from crawling Blogger pages served through Dynamic Views links. If you use Dynamic Views on your blog you can delete this line; if you do not use them, leave it as it is.

Disallow: /?m=1
Disallow: /?m=0
Disallow: /*?m=1
Disallow: /*?m=0 These lines stop robots from crawling the mobile-redirect versions of your pages. Blogger serves mobile views through URLs ending in ?m=1 or ?m=0, which are duplicates of the normal pages, so these lines help you avoid duplicate-page problems. If you do not have this problem, or do not need these rules, you can delete them.
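To see why the ?m=1 and ?m=0 variants count as duplicates, note that stripping the m parameter collapses both back to the same canonical URL. Here is a small illustrative sketch; the helper name and the blog URL are hypothetical:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def strip_mobile_param(url):
    """Drop Blogger's m=0/m=1 mobile flag so both variants map to one canonical URL."""
    parts = urlparse(url)
    # Keep every query parameter except the mobile flag "m".
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "m"]
    return urlunparse(parts._replace(query=urlencode(query)))

# Both mobile variants collapse to the same canonical post URL.
print(strip_mobile_param("https://example.blogspot.com/2020/01/post.html?m=1"))
print(strip_mobile_param("https://example.blogspot.com/2020/01/post.html?m=0"))
```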

Read: The Impact of Using Robots.txt Disallow m=1 & m=0

Sitemap: http://www.mycurcol/sitemap.xml This line points crawlers to your blog's sitemap.
Please change http://www.mycurcol/ to your own blog's URL.

Check and Analyze Your Robots.txt

There are many tools available on the web for checking your robots.txt. But to check how Google's robots (i.e. AdsBot-Google, Mediapartners-Google, Googlebot) will crawl your blog, you should use Google Webmaster Tools: go to the Webmaster Tools dashboard > Crawl > Blocked URLs. There you can see how these robots treat your blog according to the robots.txt you specified.

Note: when using a custom robots.txt, apply only the rules you actually need. Applying rules without knowing what each one does can be fatal; in the worst case, crawler bots will stop indexing your blog entirely. So learn what each line does before you use it.
If you have other ideas you think are better, or anything is unclear, please discuss it in the comment field. Okay, buddy blogger, I think that is enough for this tutorial on how to set a custom robots.txt the right way, in my opinion :). Apologies for any shortcomings, and good luck!