How to Use Custom Robots.txt with Your Blog? – Blogspot/Blogger SEO

Robots.txt is a text file on your server that you can customize for search engine bots. It tells bots which directories, web pages, or links should or should not be crawled and indexed in search results, which means you can restrict bots from crawling certain parts of your website or blog. Custom robots.txt is now available for Blogspot. In Blogger, the search option is tied to labels; if you are not using labels wisely, you should disallow crawling of the search result pages. By default, Blogger's search link is disallowed from crawling. In robots.txt you can also declare the location of your sitemap file. A sitemap is a file on the server that contains the permalinks of all the posts on your website or blog, usually in XML format, i.e., sitemap.xml.

Custom Robots.txt Blogger Tutorial

Blogger has now finished its work on sitemap.xml and reads sitemap entries through the blog feed. With this method, the most recent 25 posts are submitted to search engines. If you want search engine bots to work only on the most recent 25 posts, use robots.txt Type 1 given below. With this setup, the Google AdSense bot can still crawl the entire blog for the best AdSense performance.

Robots.txt Type 1

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: http://www.yourblog.com/feeds/posts/default?orderby=UPDATED
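If you want to sanity-check rules like these before publishing them, Python's standard urllib.robotparser module can evaluate a rule set locally. The snippet below is a sketch using rules in the Type 1 layout; the empty Disallow: under Mediapartners-Google and the sample paths are illustrative assumptions:

```python
from urllib import robotparser

# Rules in the Type 1 layout; the empty "Disallow:" under
# Mediapartners-Google grants the AdSense bot full access.
rules = [
    "User-agent: Mediapartners-Google",
    "Disallow:",
    "",
    "User-agent: *",
    "Disallow: /search",
    "Disallow: /b",
    "Allow: /",
]

rp = robotparser.RobotFileParser()
rp.modified()  # mark the rules as freshly loaded so can_fetch() trusts them
rp.parse(rules)

# The AdSense bot may crawl search pages; ordinary bots may not.
print(rp.can_fetch("Mediapartners-Google", "/search/label/seo"))  # True
print(rp.can_fetch("*", "/search/label/seo"))                     # False
print(rp.can_fetch("*", "/2020/01/some-post.html"))               # True
```

This checks the rules with the same prefix-matching logic crawlers use, without touching your live blog.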


Note: You may alter Blogspot's default robots.txt as shown above.

Robots.txt Type 2

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: http://www.yourblog.com/atom.xml?redirect=false&start-index=1&max-results=500


Note: Don't forget to replace the placeholder domain in the Sitemap line with your blog address or custom domain. Robots.txt Type 2 above makes search engine bots crawl the most recent 500 posts. If you already have more than 500 posts on your blog, add one more Sitemap line, as shown in Type 3 below.

Robots.txt Type 3

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: http://www.yourblog.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://www.yourblog.com/atom.xml?redirect=false&start-index=501&max-results=500


Note: Don't forget to replace the placeholder domain in the Sitemap lines with your blog address or custom domain.

Mathematical expression for Blogger robots.txt sitemap entries:

Sitemap: http://www.yourblog.com/atom.xml?redirect=false&start-index=m(n-1)+1&max-results=m

Where m = 500 and n = 1, 2, 3, 4, …. If you have organized your post labels in a good format and have good experience with search engine optimization (SEO), then you can remove the following line –
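The sitemap entries can also be generated programmatically. This is a minimal sketch assuming Blogger's paged atom.xml feed convention (start-index / max-results query parameters) and a placeholder domain:

```python
def sitemap_entries(total_posts, domain="www.yourblog.com", m=500):
    """Build one Sitemap line per chunk of m posts.

    The nth chunk starts at index m*(n-1)+1, matching the
    expression for Blogger sitemap entries.
    """
    lines = []
    n = 1
    while (n - 1) * m < total_posts:
        start = m * (n - 1) + 1
        lines.append(
            f"Sitemap: http://{domain}/atom.xml"
            f"?redirect=false&start-index={start}&max-results={m}"
        )
        n += 1
    return lines

# A blog with 1200 posts needs three Sitemap lines (indices 1, 501, 1001).
for line in sitemap_entries(1200):
    print(line)
```

For 500 or fewer posts this yields a single line, so Type 2 is just the n = 1 case.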

Disallow: /search

Most important: if you don't want a particular Blogger post or page submitted to search engines, you can disallow it individually. For a post, add a line like this –

Disallow: /yyyy/mm/post-name.html

For a page, add a line like this –

Disallow: /p/page-name.html
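To confirm that such a line blocks only the one permalink, you can again test locally with Python's urllib.robotparser (the post and page names below are hypothetical placeholders):

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.modified()  # mark rules as loaded so can_fetch() trusts them
rp.parse([
    "User-agent: *",
    "Disallow: /2020/05/secret-post.html",  # hypothetical post permalink
    "Disallow: /p/private-page.html",       # hypothetical page permalink
    "Allow: /",
])

print(rp.can_fetch("*", "/2020/05/secret-post.html"))  # False
print(rp.can_fetch("*", "/2020/05/other-post.html"))   # True
print(rp.can_fetch("*", "/p/private-page.html"))       # False
```

Only the listed permalinks are blocked; every other post and page stays crawlable.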

Manage Blogger custom robots.txt

To manage it, please follow these steps carefully: Dashboard ›› Blog's Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes. For a better understanding, you can refer to the image given below:

Manage Blogger Custom robots.txt

I hope you’ll benefit from this post and get a better search engine presence and ranking.


  1. How long does Google usually take to consider the newly updated robots.txt file?

    Mohit Blogger @ The Geek Solutions

    • All Blogger blogs have sitemap.xml, but when you use a custom domain on Blogger, it vanishes. Blogger will fix this bug soon; I have reported the problem to them. If there is anything else, please write to me.

  2. I have a site,
    but some post URLs are not showing in search engines,

    and I saw a report in Webmaster Tools that the robots.txt file blocked 37 URLs.

    I want robots.txt to work properly and never block any URL,
    but I don't have any knowledge about robots.txt.

    Please help me control the robots.txt file;

    it's too difficult to understand.

    If possible, please reply in Hindi/Hinglish.

  3. Hi, what if you have more than 50 posts? When I test the sitemap with www_kome_cafe/feeds/posts/default?orderby=UPDATED, only 26 show a submitted status.
    What should I add, please?

    How do I remove "blocked by robots.txt" in Blogger?

    Could you please check and help my blog www_kome_cafe?

  4. Good post, but I have one doubt.
    I submitted my sitemap before I had 500 posts, and now I have crossed 500 posts. My doubt is whether I have to submit a second sitemap or not.

    • Well, Blogspot now has its own sitemap; this is an old post. If you need to, you can submit more than one sitemap. If you have more than 500 posts, you can submit another one.

  5. Can I ask: when I check my site at Ubersuggest, it says that 1 page has a blocked meta robots tag or X-Robots-Tag HTTP header.

  6. Can you tell me which code I should use when my blog is available on Google but is not showing the description below it? When I clicked on "tell me why", it said that robots.txt is disallowing Google from reading the post, or something like that. Can you suggest which code I should copy?

  7. I read your blog; this is so awesome, but I have a doubt: if we disallow "/blog/Xyz", does it disallow the whole /blog folder? I expect you to reply as soon as possible.

    • No, it doesn't! But if you disallow "/blog", it will surely disallow "/blog/Xyz".
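    That prefix behavior is easy to confirm with Python's urllib.robotparser (the paths are the illustrative ones from the question):

```python
from urllib import robotparser

# Disallowing "/blog/Xyz" blocks only paths under that prefix...
rp = robotparser.RobotFileParser()
rp.modified()  # mark rules as loaded so can_fetch() trusts them
rp.parse(["User-agent: *", "Disallow: /blog/Xyz"])
print(rp.can_fetch("*", "/blog/Xyz"))    # False
print(rp.can_fetch("*", "/blog/other"))  # True

# ...while disallowing "/blog" blocks everything beneath it.
rp2 = robotparser.RobotFileParser()
rp2.modified()
rp2.parse(["User-agent: *", "Disallow: /blog"])
print(rp2.can_fetch("*", "/blog/Xyz"))   # False
```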


    • Browsers, robots, spiders, and any other programs that retrieve information from a site can be considered user-agents.

  8. Very well described; thank you for this entry. I didn't know how to set it up on my Blogspot to work as it should. You helped me; I set it up as you suggested, and now it should be good.

  9. Hello,

    Sir, I am Ajit, and I am a blogger.

    My site was blocked 😭 by robots.txt.

    I uploaded the Type 1 robots.txt file from your post, and then

    my website was indexed in Search Console.

    Thank you, sir.

  10. Sir, my new posts have not been indexed for more than a week. When I request indexing in Search Console and then do a live test, it shows that the URL is on Google, but when I check again, it is not on Google. I have made backlinks and pinged, but it failed. Why does this problem occur?

    • A few of Blogger's dashboard pages come under "/b", so you may hide them from search engines.

  11. I have fewer than 500 posts, but I used the robots.txt Type 3 method that you described. Is it OK, or should I change it?

  12. What type of robots.txt should I use if my blog is new and has no posts? I'm also getting an error message in the sitemap section of my Google Search Console. Please help!

  13. Bro, I wrote 30 posts on the free Blogspot domain, and now I have taken a new domain name for the same blog. Will I have to change the custom robots.txt?

  14. Is this custom robots.txt correct? Let me know:

    User-agent: *
    Disallow: /search
    Allow: /

    Please help and reply.

  15. Thanks, Vinay bro. It was a very helpful blog, especially for new bloggers. I appreciate your hard work; well done, bro.
    You have made this tutorial very easy for your readers to understand. As a blogger, I am impressed by your writing skills and sound knowledge. Keep going, and best of luck with your future posts.

  16. Very vital information. Thank you for sharing such valuable content. Could you please make a write-up on header tags?

  17. I have a doubt about "Disallow: /p/page-name.html." Should the page name of the site be written in place of "page-name", or should it be placed as it is?

    • "/p/page-name.html" is the permalink of a blog page. Please double-check what you are doing.

  18. It is an excellent, informative, and knowledgeable post that you have shared with your readers. You are doing a great job of providing helpful information on your blog, and I appreciate your efforts.

