Hey there, it’s Martin! If you’ve recently received a Google Search Console alert saying “Page indexing issues detected in submitted URLs,” you’re not alone. I know how alarming that red exclamation mark can be. But breathe easy, because today we’re going to break down exactly what to do. And don’t worry; we’ll keep it simple!
Verify the issue
First Step: Go to your Google Search Console dashboard and open the “Pages” report under Indexing (called “Coverage” in older versions). Look for URLs listed under “Not indexed” (previously “Excluded”). Are they important pages? If yes, keep reading!
Is your page blocked by robots.txt?
First Step: Open your site’s robots.txt file. You’ll typically find it at https://yourwebsite.com/robots.txt. Check whether any Disallow rules match the blocked URLs. If they do, remove those lines and save the changes.
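If you’d rather not eyeball the file, Python’s standard library can check a URL against robots.txt rules for you. Here’s a quick sketch; the rules and URLs below are made-up examples, so swap in your own:

```python
from urllib.robotparser import RobotFileParser

# Example rules; paste in your real robots.txt contents instead.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from /private/ but allowed everywhere else.
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/private/page"))  # False
print(parser.can_fetch("Googlebot", "https://yourwebsite.com/blog/post"))     # True
```

If `can_fetch` returns False for a page you want indexed, you’ve found your culprit.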
Check your noindex tags
First Step: Open the HTML source code of the affected pages and look for a noindex tag in the <head> section. If you find <meta name="robots" content="noindex">, remove it. You don’t want that there if you want Google to index the page!
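If you have lots of pages to scan, a small parser built on Python’s standard library can spot that tag for you. This sketch checks a hard-coded HTML snippet; in practice you’d feed it each page’s source:

```python
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    """Flags a page whose meta robots tag carries a noindex directive."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if (tag == "meta"
                and (attrs.get("name") or "").lower() == "robots"
                and "noindex" in (attrs.get("content") or "").lower()):
            self.noindex = True

# Stand-in for a real page's source.
html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
finder = NoindexFinder()
finder.feed(html)
print(finder.noindex)  # True
```

Any page where the flag comes back True is telling Google to stay away.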
Review sitemap errors
First Step: Head back to Google Search Console and check the “Sitemaps” section for any errors. If you spot any, correct the sitemap file and resubmit it.
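A common sitemap error is malformed XML, so a quick local sanity check helps before resubmitting. Python’s built-in XML parser can confirm the file parses and list its URLs; the sitemap below is a made-up example standing in for your real sitemap.xml:

```python
import xml.etree.ElementTree as ET

# Stand-in sitemap; in practice, read the contents of your real sitemap.xml.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://yourwebsite.com/</loc></url>
  <url><loc>https://yourwebsite.com/blog/post</loc></url>
</urlset>"""

# fromstring raises ParseError on malformed XML, which is the first thing to catch.
root = ET.fromstring(sitemap_xml)
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]
print(urls)
```

If it parses cleanly, eyeball the listed URLs for anything that shouldn’t be there (redirected, deleted, or noindexed pages).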
Look for redirect or 404 issues
First Step: Run a site audit using a tool like Screaming Frog to identify 404 errors or redirects. These could be causing your indexing issues. Fix them right up!
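Whatever crawler you use, its export boils down to URL–status pairs, and sorting those into buckets is the easy part. The rows below are invented purely to show the bucketing:

```python
# Hypothetical crawl export: (url, HTTP status) pairs from your audit tool.
crawl = [
    ("https://yourwebsite.com/", 200),
    ("https://yourwebsite.com/old-page", 301),
    ("https://yourwebsite.com/missing", 404),
]

REDIRECT_CODES = {301, 302, 307, 308}
redirects = [url for url, status in crawl if status in REDIRECT_CODES]
broken = [url for url, status in crawl if status == 404]

print("Redirects to review:", redirects)
print("404s to fix:", broken)
```

Redirected URLs in your sitemap should be swapped for their final destinations; 404s need either a fix or a redirect.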
Examine content quality
First Step: Sometimes the issue lies in thin or duplicate content. Perform a quick review of the URLs in question. If the content is sparse or looks too similar to other pages, beef it up or make it unique.
Submit for reindexing
Final Step: Once you’ve cleared these potential roadblocks, head back to Google Search Console. Use the “URL Inspection” tool to request reindexing for each corrected URL.
And there you have it! Simple steps to sort out those pesky page indexing issues. Got more questions? Feel free to reach out. Let’s make the web a friendlier place, one indexed page at a time!