How to Create and Add Robots.txt to Your Blogger Website
By Dr Saddam Kenya | Last Updated: June 2025
If you run a blog on Blogger (Blogspot), controlling how search engines crawl and index your content is important. A robots.txt file helps you do exactly that: it is a small text file that tells search engine bots which parts of your site they are allowed or not allowed to access.
📌 What is Robots.txt?
The robots.txt file is a plain text file placed at the root of your site that tells search engine bots like Googlebot what to crawl and what to ignore.
For example:
User-agent: *
Disallow: /search
Allow: /
Sitemap: https://www.example.com/sitemap.xml
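
If you're curious how a crawler actually reads these directives, here is a minimal sketch using Python's standard urllib.robotparser. It parses the example rules above directly (no network needed); the /p/about.html URL is only an illustrative page, not one taken from this article.

# Minimal sketch: how a well-behaved crawler interprets the example rules above.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

parser = RobotFileParser()
parser.parse(rules)  # parse() accepts the robots.txt content as a list of lines

# Search result pages are blocked, ordinary pages are allowed.
print(parser.can_fetch("*", "https://www.example.com/search?q=seo"))   # False
print(parser.can_fetch("*", "https://www.example.com/p/about.html"))   # True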
🔧 Step 1: Create a Robots.txt File for Blogger
You can copy and customize the following template for your Blogger blog:
User-agent: Mediapartners-Google
Disallow:
User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
Sitemap: https://yourblogname.blogspot.com/sitemap.xml
Sitemap: https://yourblogname.blogspot.com/sitemap-pages.xml
👉 Replace yourblogname.blogspot.com with your blog's actual address, whether that is a blogspot.com subdomain or a custom domain (e.g., drsaddamkenya.site).
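
If you prefer to prepare the file on your computer before pasting it into Blogger, the short Python sketch below substitutes your address into the template and saves it as robots.txt so you can review it first. The domain shown is only a placeholder.

# Minimal sketch: fill your blog address into the template and write robots.txt locally.
DOMAIN = "drsaddamkenya.blogspot.com"  # placeholder: use your own blog address

TEMPLATE = """User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://{domain}/sitemap.xml
Sitemap: https://{domain}/sitemap-pages.xml
"""

content = TEMPLATE.format(domain=DOMAIN)
with open("robots.txt", "w", encoding="utf-8") as f:
    f.write(content)

print(content)  # review the result, then paste it into Blogger's Custom robots.txt box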
⚙️ Step 2: Add Robots.txt to Your Blogger Blog
Now that you have created your custom robots.txt, follow these steps to add it to Blogger:
- Go to Blogger Dashboard.
- Click on Settings from the left menu.
- Scroll down to the section labeled Crawlers and indexing.
- Turn on the Enable custom robots.txt toggle.
- Click on Custom robots.txt and paste the code you created earlier.
- Click Save.
Done! Your blog now serves your custom robots.txt file.
🔍 How to Check if Robots.txt is Working
To verify, open your browser and visit the following URL (replace yourdomain.com with your blog's address):
https://yourdomain.com/robots.txt
If it loads correctly, your robots.txt file is now live.
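
You can also check it from a small script instead of the browser. Here is a quick sketch using Python's standard library; yourdomain.com is a placeholder for your real blog address.

# Minimal sketch: fetch the live robots.txt and print it.
from urllib.request import urlopen

url = "https://yourdomain.com/robots.txt"  # placeholder: use your own blog address
with urlopen(url) as response:
    print(response.read().decode("utf-8"))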
🤔 Why Use Robots.txt on Blogger?
- Prevent Google from indexing duplicate content like search pages.
- Improve SEO by focusing crawl budget on important pages.
- Ensure that only clean and relevant URLs appear in Google results.
✅ Pro Tip
Always test your robots.txt file with the robots.txt report in Google Search Console (it replaced the older Robots.txt Tester) to make sure you're not blocking important parts of your site.
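
As a quick supplement to Search Console, you can run a local check with Python's urllib.robotparser: it reads your live robots.txt and confirms that the pages you care about are still crawlable. The URLs listed are placeholders; substitute pages that matter on your own blog.

# Minimal sketch: confirm important URLs are not blocked by the live robots.txt.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://yourdomain.com/robots.txt")  # placeholder address
parser.read()  # fetch and parse the live file

important_urls = [
    "https://yourdomain.com/",
    "https://yourdomain.com/2025/06/my-latest-post.html",  # example post URL
    "https://yourdomain.com/p/contact.html",               # example static page
]

for url in important_urls:
    status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)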
💡 Conclusion
Adding a proper robots.txt file to your Blogger website is simple but powerful. It helps shape how search engines view and crawl your content. By following the steps above, your blog will be more optimized for search engines and offer better control over your indexing strategy.
If you found this guide useful, share it and bookmark Dr Saddam Kenya Blog for more Blogger tips and SEO tricks.