Common Robots.txt Errors on Blogger and How to Fix Them
By Dr Saddam Kenya | Updated June 2025
Many Blogger (Blogspot) users struggle with search engine indexing issues. One of the main culprits? A poorly configured robots.txt file. If you've seen messages like "Discovered – currently not indexed" or "Submitted URL blocked by robots.txt" in Google Search Console, you're not alone.
In this post, I’ll cover the most common robots.txt mistakes people make on Blogger, the errors they trigger in Search Console, and how to fix each one.
🚫 1. Blocking Important Pages
Many Blogger users unknowingly block useful URLs by using:
```
Disallow: /
```
This blocks search engines from crawling your entire site. It's usually a mistake: avoid it unless you're intentionally hiding the whole site (not recommended).
✅ Instead, allow the homepage and pages like this:
```
User-agent: *
Disallow: /search
Allow: /
```
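If you want to confirm how these rules behave without waiting on Search Console, here's a minimal sketch using Python's built-in robots.txt parser. The post URL is a placeholder; substitute your own domain and a real post:

```python
# Check which URLs your live robots.txt allows or blocks.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.drsaddamkenya.site/robots.txt")
rp.read()  # fetch and parse the live file

# Real posts should print "allowed"; /search pages should print "blocked".
for url in [
    "https://www.drsaddamkenya.site/",
    "https://www.drsaddamkenya.site/2025/06/sample-post.html",  # placeholder
    "https://www.drsaddamkenya.site/search/label/Health",
]:
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")
```

If the homepage or a real post prints "blocked", your Disallow rules are too broad.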
2. “Discovered – Currently Not Indexed” in Search Console
This issue is **not directly caused by robots.txt**, but robots.txt can contribute if it blocks internal navigation pages like /search.
Fix:
- Ensure your main posts and pages are crawlable
- Add working sitemap.xml entries (the sketch after this list shows what your sitemap actually contains)
- Don't block pages that have real content or backlinks
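To see exactly which URLs your sitemap advertises, here's a small sketch using only the Python standard library. It assumes the standard Blogger sitemap location; swap in your own domain:

```python
# List the entries a Blogger sitemap advertises to search engines.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://www.drsaddamkenya.site/sitemap.xml"

with urllib.request.urlopen(SITEMAP) as resp:
    tree = ET.parse(resp)

# Blogger's sitemap.xml may be a sitemap index that points at paged
# sub-sitemaps, so collect every <loc> element regardless of nesting.
locs = [el.text.strip() for el in tree.iter() if el.tag.endswith("loc")]
print(f"{len(locs)} entries found")
for loc in locs[:10]:  # preview the first ten
    print(" ", loc)
```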
3. “Submitted URL Blocked by Robots.txt”
This happens when your sitemap includes URLs that your robots.txt disallows.
Example of conflicting rule:
```
User-agent: *
Disallow: /search
```
If your sitemap includes search label links like:
https://drsaddamkenya.site/search/label/Health
Google will get confused: your sitemap says “index this,” but robots.txt says “don’t crawl it.”
✅ Fix: Either remove those URLs from your sitemap, or update your robots.txt if you want them indexed.
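To catch these conflicts before Search Console does, you can combine the two ideas above: parse the sitemap, then test every entry against robots.txt. A minimal sketch, again assuming you substitute your own domain:

```python
# Flag sitemap URLs that robots.txt disallows: the exact mismatch behind
# "Submitted URL blocked by robots.txt".
import urllib.request
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

SITE = "https://www.drsaddamkenya.site"

rp = RobotFileParser(f"{SITE}/robots.txt")
rp.read()

with urllib.request.urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ET.parse(resp)

conflicts = [el.text.strip() for el in tree.iter()
             if el.tag.endswith("loc") and not rp.can_fetch("*", el.text.strip())]

for url in conflicts:
    print("CONFLICT:", url)
print(f"{len(conflicts)} conflicting URL(s) found.")
```

Every URL this prints will eventually show up in Search Console as "Submitted URL blocked by robots.txt".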
4. Missing or Wrong Sitemap URL
A missing or incorrect Sitemap entry in robots.txt can slow how quickly Google discovers your new posts.
Bad example (a placeholder domain, and Blogger doesn't generate a sitemap.txt):
```
Sitemap: https://example.com/sitemap.txt
```
Correct example for Blogger:
```
Sitemap: https://www.drsaddamkenya.site/sitemap.xml
Sitemap: https://www.drsaddamkenya.site/sitemap-pages.xml
```
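Before relying on those entries, it's worth confirming each one actually resolves. A quick sketch (substitute your own sitemap URLs):

```python
# Confirm each Sitemap: entry in robots.txt returns a real response.
import urllib.error
import urllib.request

for url in [
    "https://www.drsaddamkenya.site/sitemap.xml",
    "https://www.drsaddamkenya.site/sitemap-pages.xml",
]:
    try:
        with urllib.request.urlopen(url) as resp:
            print(url, "->", resp.status, resp.headers.get("Content-Type"))
    except urllib.error.HTTPError as e:
        print(url, "-> HTTP error", e.code)
```

A 404 here means your robots.txt is pointing Google at a sitemap that doesn't exist.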
⚠️ 5. Indexing Blocked by “noindex” Meta Tags
Even with a clean robots.txt, your content won’t appear in search if:
- You set custom robots header tags to “noindex” or “nofollow” under Blogger settings.
✅ Fix:
- Go to Settings → Crawlers and Indexing
- Enable Custom robots header tags
- Set homepage, archive, and post pages to: all, noodp
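To double-check the result on a live page, here's a sketch that looks for "noindex" in both the X-Robots-Tag response header and the robots meta tag. The post URL is a placeholder; use one of your real posts:

```python
# Check a page for noindex directives in the HTTP header and the meta tag.
import urllib.request
from html.parser import HTMLParser

URL = "https://www.drsaddamkenya.site/2025/06/sample-post.html"  # placeholder

class RobotsMetaFinder(HTMLParser):
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            print("meta robots tag:", a.get("content"))

with urllib.request.urlopen(URL) as resp:
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
    RobotsMetaFinder().feed(resp.read().decode("utf-8", errors="replace"))
```

If either check reports "noindex", the page stays out of search results no matter what robots.txt says.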
How to Test Robots.txt in Search Console
To confirm your robots.txt is working:
- Go to Google Search Console
- Choose your property
- Open the Pages (page indexing) report and look for "Blocked by robots.txt" under "Why pages aren't indexed"
The report lists the pages blocked by robots.txt, with example URLs you can inspect.
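You can also fetch the file directly and read exactly what Googlebot receives; a one-liner sketch:

```python
# Print the live robots.txt exactly as crawlers see it.
import urllib.request

with urllib.request.urlopen("https://www.drsaddamkenya.site/robots.txt") as resp:
    print(resp.read().decode("utf-8"))
```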
✅ Final Recommended Robots.txt for Blogger
```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /share-widget
Allow: /

Sitemap: https://www.drsaddamkenya.site/sitemap.xml
Sitemap: https://www.drsaddamkenya.site/sitemap-pages.xml
```
(The empty Disallow under Mediapartners-Google gives the AdSense crawler full access; the /search and /share-widget rules keep thin label and widget pages out of the crawl.)
Conclusion
Robots.txt is a powerful tool, but misused it can hide your site from Google and cost you traffic. Always double-check your settings both inside Blogger and in Google Search Console.
Need help? Reach out to me via the Dr Saddam Kenya blog for custom advice on robots.txt, SEO, or Blogger growth.