How to Create and Add Robots.txt to Your Blogger Website

By Dr Saddam Kenya | June 2025

When managing a blog on Blogger, it’s important to control which parts of your site search engine crawlers like Googlebot can access. This is where a robots.txt file comes in handy: it tells search engine crawlers which pages or folders they may or may not visit. In this guide, I’ll show you how to create your own robots.txt file and add it to your Blogger blog easily.

📘 What is a Robots.txt File?

The robots.txt file is a simple text file that provides instructions to web crawlers. It can help prevent search engines from crawling unnecessary parts of your site such as /search or internal labels, while still allowing important pages to be indexed.

Here’s a simple example:

User-agent: *
Disallow: /search
Allow: /

Here, User-agent: * applies the rules to all crawlers, Disallow: /search keeps them out of Blogger’s search and label result pages, and Allow: / leaves the rest of the site open.

✏️ Step 1: Create a Custom Robots.txt File for Blogger

Use the sample code below and modify it based on your site structure. I’ve customized this example for drsaddamkenya.site:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://www.drsaddamkenya.site/sitemap.xml
Sitemap: https://www.drsaddamkenya.site/sitemap-pages.xml

Tip: The empty Disallow under Mediapartners-Google lets Google’s AdSense crawler reach every page so ads can be matched to your content. The Disallow rules under User-agent: * keep other crawlers away from dynamic and duplicate pages such as search and label results, while the Sitemap lines help Google discover your posts and pages more efficiently.
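If you want to sanity-check these rules before saving them, here is a minimal sketch using Python’s built-in urllib.robotparser module. It parses the same directives locally and tests a few URLs; the post URL /2025/06/some-post.html is just a made-up example, so swap in real paths from your own blog.

from urllib.robotparser import RobotFileParser

# The same rules as above, pasted into a string for local testing.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ordinary crawlers such as Googlebot fall under the * group:
# search/label pages are blocked, normal posts are allowed.
print(parser.can_fetch("Googlebot", "https://www.drsaddamkenya.site/search/label/SEO"))        # False
print(parser.can_fetch("Googlebot", "https://www.drsaddamkenya.site/2025/06/some-post.html"))  # True

# Mediapartners-Google (the AdSense crawler) has its own group with an
# empty Disallow, so it may fetch everything.
print(parser.can_fetch("Mediapartners-Google", "https://www.drsaddamkenya.site/search"))       # True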

🛠️ Step 2: Add Robots.txt to Blogger

  1. Log in to your Blogger Dashboard.
  2. Click on Settings from the left menu.
  3. Scroll down to Crawlers and Indexing.
  4. Turn on Enable custom robots.txt.
  5. Click on the link Custom robots.txt that appears.
  6. Paste the robots.txt code from Step 1 into the box.
  7. Click Save.

That’s it! You’ve successfully added a robots.txt file to your Blogger blog.

🔎 Step 3: Confirm It's Working

To check if your robots.txt is live, go to your browser and type:

https://www.drsaddamkenya.site/robots.txt

You should see the file you created displayed on that page.
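You can also confirm it from a script. This is a minimal sketch, again using Python’s built-in urllib.robotparser, that downloads the live file and checks that a blocked path and an allowed path behave as expected; it assumes your blog answers at www.drsaddamkenya.site.

from urllib.robotparser import RobotFileParser

# Point the parser at the live file and let it download and parse it.
parser = RobotFileParser("https://www.drsaddamkenya.site/robots.txt")
parser.read()

# /search should be blocked for regular crawlers, the homepage should be allowed.
print(parser.can_fetch("Googlebot", "https://www.drsaddamkenya.site/search"))  # expect False
print(parser.can_fetch("Googlebot", "https://www.drsaddamkenya.site/"))        # expect True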

💡 Why Robots.txt Is Important

  • Supports your blog’s SEO by guiding crawlers to the right pages.
  • Keeps crawlers away from irrelevant or low-quality pages such as search and label results.
  • Helps your best content stand out in Google search results.
  • Saves crawl budget, which matters more as your site grows.

📌 Final Advice

Use robots.txt wisely. Don’t block pages unless you’re sure crawlers shouldn’t reach them; if you make a mistake, important content might disappear from search results.

For best results, pair this with Google Search Console’s tools to monitor how Google is crawling your Blogger site.

🔚 Conclusion

Now you know how to create and add a custom robots.txt file to your Blogger blog. This small change can help improve how search engines treat your blog and boost your visibility online.

Stay tuned for more tutorials on SEO, Blogger, and online success strategies right here on Dr Saddam Kenya Blog.
