How to Create and Add Robots.txt to Your Blogger Website
By Dr Saddam Kenya | June 2025
When managing a blog on Blogger, it's important to control what content gets indexed by search engines like Google. This is where a robots.txt file comes in handy. It tells search engine crawlers which pages or folders they should or shouldn't access. In this guide, I'll show you how to create your own robots.txt file and add it to your Blogger blog easily.
What Is a Robots.txt File?
The robots.txt file is a simple text file that provides instructions to web crawlers. It can help prevent search engines from crawling unnecessary parts of your site, such as /search or internal label pages, while still allowing important pages to be indexed.
Here’s a simple example:
User-agent: *
Disallow: /search
Allow: /
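In plain terms, these three lines tell every crawler (User-agent: *) to skip anything under /search but to crawl the rest of the site. If you want to see how a crawler actually interprets them, here is a minimal Python sketch using the standard-library urllib.robotparser module. The example.com URLs are only placeholders for illustration:

from urllib.robotparser import RobotFileParser

# The simple robots.txt rules from above, parsed from a string.
rules = """User-agent: *
Disallow: /search
Allow: /"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/search?q=seo"))          # False: /search pages are blocked
print(rp.can_fetch("*", "https://example.com/2025/06/my-post.html"))  # True: ordinary posts are allowed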
✏️ Step 1: Create a Custom Robots.txt File for Blogger
Use the sample code below and modify it based on your site structure. I’ve customized this example for drsaddamkenya.site:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
Sitemap: https://www.drsaddamkenya.site/sitemap.xml
Sitemap: https://www.drsaddamkenya.site/sitemap-pages.xml
Tip: The Disallow rules help hide dynamic and duplicate content. The Sitemap lines help Google find your content more efficiently.
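Before you save this file, it can be worth confirming that the Sitemap URLs actually resolve. The short Python sketch below is only an optional check using the standard library, not part of Blogger's setup; swap in your own sitemap addresses:

import urllib.request

# Sitemap URLs listed in the custom robots.txt above.
sitemaps = [
    "https://www.drsaddamkenya.site/sitemap.xml",
    "https://www.drsaddamkenya.site/sitemap-pages.xml",
]

for url in sitemaps:
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(url, resp.status)  # 200 means the sitemap is reachable
    except Exception as exc:
        print(url, "failed:", exc)   # a 404 or network error will land here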
Step 2: Add Robots.txt to Blogger
- Log in to your Blogger Dashboard.
- Click on Settings from the left menu.
- Scroll down to Crawlers and Indexing.
- Turn on Enable custom robots.txt.
- Click on the link Custom robots.txt that appears.
- Paste the code you copied above into the box.
- Click Save.
That’s it! You’ve successfully added a robots.txt file to your Blogger blog.
Step 3: Confirm It's Working
To check if your robots.txt is live, go to your browser and type:
https://www.drsaddamkenya.site/robots.txt
You should see the file you created displayed on that page.
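If you prefer to check from a script instead of the browser, the small Python sketch below does the same thing: it fetches the live file and prints what Blogger is serving. Replace the domain with your own blog's address:

import urllib.request

# Same check as opening the URL in a browser: fetch the live robots.txt and print it.
url = "https://www.drsaddamkenya.site/robots.txt"  # replace with your own blog's domain
with urllib.request.urlopen(url, timeout=10) as resp:
    print("HTTP status:", resp.status)   # 200 means Blogger is serving the file
    print(resp.read().decode("utf-8"))   # should match the rules you pasted in Step 2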
Why Robots.txt Is Important
- Helps improve your blog’s SEO by steering crawlers toward your real content.
- Blocks irrelevant or low-quality pages, such as search result pages, from being crawled.
- Makes it more likely that your best content, rather than duplicate pages, appears in Google search results.
- Saves crawl budget on larger sites.
Final Advice
Use robots.txt wisely. Don’t block pages unless you’re sure they shouldn't be indexed. If you make a mistake, important content might disappear from search engines.
For best results, pair this with Google Search Console’s tools to monitor how Google is crawling your Blogger site.
Conclusion
Now you know how to create and add a custom robots.txt file to your Blogger blog. This small change can help improve how search engines treat your blog and boost your visibility online.
Stay tuned for more tutorials on SEO, Blogger, and online success strategies right here on Dr Saddam Kenya Blog.