A robots.txt file plays a crucial role in how search engines crawl and index a website. It tells crawlers which pages they may access and which they should skip. Blogger ships with a default robots.txt file, and it also allows you to replace it with a custom one. Using a custom robots.txt file is essential if you are serious about optimizing your site for search engines. In this tutorial, I will show you how to add a custom robots.txt file in Blogger.
What is a Robots.txt File?
A robots.txt file is a simple text file located in the root directory of a site. It contains instructions for web crawlers, guiding search engine bots on which parts of a website they may crawl. By using a custom robots.txt file on your Blogger site, you can restrict bots from accessing specific pages.
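For example, a minimal robots.txt file looks like this (the /private/ path is just a hypothetical placeholder):
User-agent: *
Disallow: /private/
The first line applies the rule to every crawler, and the second blocks them from any URL that starts with /private/.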
Key Components of a Robots.txt File
Understanding the following parts of a robots.txt file will help you create a custom robots.txt file for your site.
User-agent: Mediapartners-Google
This line targets Mediapartners-Google, the AdSense crawler. It is essential if you are running Google AdSense ads on your blog, because it ensures that Google can analyze your pages and deliver relevant ads. It is best to leave this line intact even if you are not using AdSense.
User-agent: *
The asterisk (*) is a wildcard, so the rules under this line apply to all robots. It serves as a catch-all for general instructions.
Disallow: /search
The above “Disallow” directive prevents search engines from crawling URLs that begin with /search, which on Blogger covers search result and label pages. This is beneficial if you don’t want these pages to appear in search results.
Allow: /
The above “Allow” directive explicitly permits search engines to crawl the rest of your blog, starting from the homepage.
Sitemap: https://example.com/sitemap.xml
Including your sitemap in the robots.txt file helps search crawlers easily locate and index your blog content. Blogger generates a sitemap for you automatically at /sitemap.xml.
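Putting these components together, here is a complete custom robots.txt file for a typical Blogger blog. It mirrors Blogger’s default rules, so treat it as a starting point and replace example.blogspot.com with your own address:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://example.blogspot.com/sitemap.xml
Note that “Disallow:” with an empty path means nothing is blocked, so the AdSense crawler can access the whole blog.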
How to Add a Custom Robots.txt File in Blogger
Follow the steps below to add a custom robots.txt file to your Blogger site:
1. Log in to your Blogger dashboard.
2. In the left sidebar, click the “Settings” tab.
3. Scroll down to the “Crawlers and indexing” section.
4. Turn on the “Enable custom robots.txt” option.
5. Click “Custom robots.txt” and paste your robots.txt rules into the text box that appears.
6. Click “Save” to apply your changes.
That’s it! You have successfully added a custom robots.txt file to your Blogger blog.
How to Check the Robots.txt File in Blogger
Follow the steps below to check the robots.txt file on your Blogger site:
1. Open a browser such as Chrome or Firefox.
2. Enter your blog’s URL in the address bar and append “/robots.txt” to it.
3. Press Enter to access your robots.txt file.
For example, if your blog’s URL is “https://example.blogspot.com”, type “https://example.blogspot.com/robots.txt” in the address bar and hit Enter to view the robots.txt file.
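If you are comfortable with the command line, you can also fetch the file with curl (assuming your blog lives at example.blogspot.com):
curl https://example.blogspot.com/robots.txt
This prints the contents of your robots.txt file directly in the terminal.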
How to Disallow Specific Content
If you want to prevent certain posts or pages from being indexed by search engines, follow the steps below:
Note: Be careful with Disallow directives; a single misconfiguration can prevent search engines from crawling your entire site.
To disallow a specific post:
Disallow: /yyyy/mm/your-post-url.html
Replace “yyyy/mm/your-post-url.html” with the URL slug of the post you want to block. Blogger post URLs always include the year and month of publication.
To disallow a particular page:
Disallow: /p/page-url.html
Replace “p/page-url.html” with the URL slug of the page you want to keep out of search results. Blogger serves static pages under the /p/ path.
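As an illustration, here is how these directives fit into a complete custom robots.txt file; the post and page paths below are made-up placeholders:
User-agent: *
Disallow: /search
Disallow: /2024/01/my-first-post.html
Disallow: /p/contact-us.html
Allow: /
Sitemap: https://example.blogspot.com/sitemap.xml
The two extra Disallow lines block one specific post and one specific page, while the rest of the blog stays open to crawlers.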
Final Thoughts
Customizing the default Blogger robots.txt file is easy, but always remember that robots.txt is a sensitive file. A properly configured robots.txt file can improve your blog’s visibility in search engines, while a misconfiguration can block search engine crawlers from your entire website.
WikiPoka offers a robots.txt generator for Blogger sites that simplifies creating a custom robots.txt file, so don’t forget to give this handy tool a try.
If you found this article helpful, please share it with your fellow bloggers. And if you have any questions about this tutorial, ask me in the comments section.