I ran into a major issue after uploading a robots.txt file to my website.
My last post about robots.txt was fairly popular and brought in good traffic. But if you add a robots.txt file to your site, you may get an email from Google Search Console: "Coverage issues detected on domain.com".
Related content: Create a perfect robots.txt file for your website
If you receive an email like this, open your Google Search Console page and you will see a warning like this:
It means your blog category or tag pages are blocked by robots.txt.
How to fix this issue:
There are several ways to solve this problem.
If you want to fix this issue in your robots.txt, open your robots.txt editor and add a rule that blocks your category pages, and do the same for your tag pages.
This tells the search engine crawler not to crawl or index any of your categories and tags.
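As a sketch, for a typical WordPress-style permalink structure (the /category/ and /tag/ paths are assumptions; adjust them to match your own site), the rules might look like this. Note that the standard, widely supported directive is Disallow; Google no longer honors a noindex rule inside robots.txt.

```text
User-agent: *
Disallow: /category/
Disallow: /tag/
```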
You also need to modify or regenerate your XML sitemap file.
When you block your category and tag pages in robots.txt, the crawler still finds those same URLs inside your sitemap, because you previously added the category and tag pages there. That conflict is what triggers the warning.
Open your XML sitemap in Notepad and delete all the category and tag entries from it.
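If your sitemap is large, deleting the entries by hand is tedious. Here is a minimal sketch of how you could strip them out with a script instead, assuming your category and tag URLs contain /category/ and /tag/ (adjust the prefixes to your own permalink structure):

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def filter_sitemap(xml_text, blocked_prefixes=("/category/", "/tag/")):
    """Return sitemap XML with every <url> entry removed whose <loc>
    contains one of the blocked path prefixes."""
    # Keep the default sitemap namespace un-prefixed in the output.
    ET.register_namespace("", SITEMAP_NS)
    root = ET.fromstring(xml_text)
    # Copy to a list so we can remove children while iterating.
    for url in list(root):
        loc = url.find(f"{{{SITEMAP_NS}}}loc")
        if loc is not None and any(p in loc.text for p in blocked_prefixes):
            root.remove(url)
    return ET.tostring(root, encoding="unicode")
```

You would read sitemap.xml, pass its contents through this function, and write the result back before resubmitting the sitemap.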
How to fix it inside Search Console:
After removing those entries, click "Fix this issue", or resubmit your XML sitemap URL to Google, in the form https://www.domain.com/sitemap.xml
If you are comfortable with Search Console and with fixing issues on your website, this will be an easy task and you can manage it on your own. If you still cannot fix the issue, leave a comment here and I'll try to help.
You will receive a success email from Google, or the issue will be resolved within about 30 days.