The Most Common Google Search Console Coverage Issues and How to Fix Them (A Common Job Interview Question)

We already know the common Google Search Console coverage issues, but let's recap!

Coverage Issues:

  • Submitted URL not found (404)
  • Submitted URL seems to be a Soft 404
  • Submitted URL blocked by robots
  • Indexed, though blocked by robots
  • Redirect error
  • Server error (5xx)
  • Text too small to read
  • Clickable elements too close together

1. Submitted URL not found (404)

A 404 error is an HTTP status code that means the page a user is trying to reach could not be found on the server. The page will not load for the user because it simply no longer exists—it was either removed from the website completely or moved to another location without properly redirecting to a new resource.

Fix This "Submitted URL not found (404)" Error:

To fix this error, update the sitemap so it contains only live URLs on the website. If you previously included misspelled URLs, correct them rather than deleting them. After updating the sitemap, you can request validation in Google Search Console.
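Before resubmitting, it helps to list every URL your sitemap actually contains. A minimal sketch using Python's standard library (the sample sitemap and URLs are hypothetical; in practice you would send a HEAD request to each URL and drop any that return 404):

```python
import xml.etree.ElementTree as ET

# Sitemap files use this XML namespace per the sitemaps.org protocol.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Extract every <loc> URL from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/old-page</loc></url>
</urlset>"""

for url in sitemap_urls(sample):
    # Check each URL's live status before resubmitting the sitemap.
    print(url)
```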

2. Submitted URL seems to be a Soft 404

“A soft 404 means that a URL on your site returns a page telling the user that the page does not exist and also a 200-level (success) code to the browser.” Basically, you've got a page on your site telling visitors that it no longer exists, but at the same time, it's telling search engines that it does exist.

Fix This "Submitted URL seems to be a Soft 404" Error:

  • Check whether the page is indeed a soft 404 or a false alarm.
  • Configure your server to return the proper not-found status code (404 or 410).
  • Improve the page and request indexing.
  • Redirect the page using a 301 redirection.
  • Keep the page on your site but de-index it from search engines.
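To check whether a page is a genuine soft 404 across many URLs, you can apply a simple heuristic: a 200 status combined with "not found" wording in the body. A minimal sketch (the phrase list is an assumption for illustration, not Google's actual detection logic):

```python
# Phrases that commonly appear on error pages (illustrative, not exhaustive).
NOT_FOUND_PHRASES = ("page not found", "no longer exists", "nothing here")

def is_soft_404(status_code, body):
    """True when the server answers 200 OK but the page says 'not found'."""
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    text = body.lower()
    return any(phrase in text for phrase in NOT_FOUND_PHRASES)

print(is_soft_404(200, "<h1>Page Not Found</h1>"))  # flags a soft 404
print(is_soft_404(404, "<h1>Page Not Found</h1>"))  # correct hard 404
```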

3. Submitted URL blocked by robots

Blocked sitemap URLs are typically caused by web developers improperly configuring their robots.txt file. Whenever you disallow anything, make sure you know what you're doing; otherwise, this warning will appear and web crawlers may no longer be able to crawl your site.

Fix This "Submitted URL blocked by robots" Error:

If you don't want Google to index your page, you should remove the URL from your sitemap. Google will notice the changes when it visits your site again. If you don't want to wait until Google's next visit, you can also resubmit the edited sitemap in the Sitemaps report of Google Search Console.
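You can verify whether a given rule actually blocks Googlebot before it causes a warning, using Python's built-in robots.txt parser. A minimal sketch with a hypothetical robots.txt and example URLs:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one directory for all crawlers.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A sitemap should not contain URLs that the rules below disallow.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # blocked
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # allowed
```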

4. Indexed, though blocked by robots

“Indexed, though blocked by robots.txt” indicates that Google indexed URLs even though they were blocked by your robots.txt file. Google has marked these URLs as “Valid with warning” because it is unsure whether you want these URLs indexed.

Fix This "Indexed, though blocked by robots" Error:

  • In Google Search Console, export the list of URLs.
  • Go through the URLs and determine whether you want these URLs indexed or not.
  • Then, it's time to edit your robots.txt file.
  • Update and submit your robots.txt file.
  • Go back to Google Search Console and click Validate fix.
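Note that robots.txt controls crawling, not indexing: Google can only see a noindex directive if it is allowed to crawl the page. So for URLs you want removed from the index, unblock them in robots.txt and serve a noindex instead. A minimal sketch of the tag, placed in the page's `<head>`:

```html
<!-- Google must be able to crawl this page to see the directive;
     a robots.txt block would hide it. -->
<meta name="robots" content="noindex">
```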

5. Redirect error

The "too many redirects" error indicates that your browser is stuck in an infinite redirection loop. That means your browser is trying to visit one URL which points to another URL, which points back to the first URL, so it's stuck.

Fix This "Redirect Error":

  • Click on the INSPECT URL.
  • Get more details about the errors.
  • Click the TEST LIVE URL.
  • Fix the error and REQUEST INDEXING.
  • Go back and click VALIDATE FIX.
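One way to see where a loop occurs is to trace the redirect chain yourself. A minimal sketch, using a hypothetical map of source URL to redirect target (in practice you would build this map by following `Location` headers):

```python
def find_redirect_loop(redirects, start, max_hops=10):
    """Follow a chain of redirects (url -> target) and report any loop."""
    seen = [start]
    url = start
    while url in redirects:
        url = redirects[url]
        if url in seen:
            return seen + [url]  # the chain revisits an earlier URL: a loop
        if len(seen) >= max_hops:
            return None          # give up after too many hops (arbitrary cap)
        seen.append(url)
    return None                  # the chain terminates normally

# Hypothetical loop: /a -> /b -> /c -> /a
chain = {"/a": "/b", "/b": "/c", "/c": "/a"}
print(find_redirect_loop(chain, "/a"))  # ['/a', '/b', '/c', '/a']
```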

6. Server error (5xx)

A 5xx code occurs when a server does not support the functionality required to process a visitor's request. Simply put, it means that there's an error caused by the server. In many cases, a chain of servers is handling an HTTP request, so keep in mind that it may not be your server that's causing the issue.

Fix This "Server error (5xx)":

  • Dynamic page requests can cause excessive load times; check yours and reduce excessive page loading if needed.
  • Make sure your site's hosting server is not down, overloaded, or misconfigured.
  • Check that your site is not inadvertently blocking Google.
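When checking your own pages, intermittent 5xx responses (an overloaded server) behave differently from persistent ones (a misconfigured server). Retrying with exponential backoff helps tell them apart. A sketch, where `fetch` is a hypothetical stand-in for your real request function:

```python
import time

def fetch_with_retry(fetch, retries=3, base_delay=1.0):
    """Retry a request while the server answers with a 5xx status.

    `fetch` is any callable returning an HTTP status code (a hypothetical
    stand-in for an actual request to your page)."""
    status = 500
    for attempt in range(retries):
        status = fetch()
        if status < 500:
            return status  # success or client error: no point retrying
        # Back off exponentially so an overloaded server can recover.
        time.sleep(base_delay * 2 ** attempt)
    return status          # still failing: likely a persistent server issue
```

If the final status is still 5xx after several attempts, the problem is probably in your server configuration rather than a momentary load spike.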

7. Text too small to read

Text too small to read is a mobile usability error. It is displayed under the Enhancements tab in Google Search Console. This error is found by the Google smartphone crawler. The “text too small to read” error suggests that the font size of the page is too small for mobile browsing.

Fix This "Text too small to read" Error:

  • Identify The Pages Causing The Error.
  • Add The Viewport Meta Tag.
  • Add The Max-Width Attribute in Every Image.
  • Set the Font Size to 22px.
  • Check Your Website Layout.
  • Validate Fix.
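The viewport and sizing steps above can be sketched as follows (the selectors and values are illustrative; the 22px base size is the article's suggestion, adjust for your design):

```html
<!-- Tell mobile browsers to use the device width instead of a
     zoomed-out desktop viewport. Goes in the page's <head>. -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* A legible base font size for mobile (22px, per the steps above). */
  body { font-size: 22px; }
  /* Images scale down with the viewport instead of overflowing it. */
  img  { max-width: 100%; height: auto; }
</style>
```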

8. Clickable elements too close together

In short, the "Clickable elements too close together" error means the URL displaying it has touch targets like links and buttons which are too close to each other. Every time a user tries to use such a link or button, the neighboring elements get tapped as well.

Fix This "Clickable elements too close together" Error:

  • Identify the Example URL.
  • Run a Mobile Friendly Test.
  • Optimize The Touch Target Size.
  • Set The Mobile Viewport Tag.
  • Validate Fix.
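A minimal CSS sketch of the touch-target step (the selectors are hypothetical; the roughly 48x48 CSS-pixel target is the commonly cited mobile guideline):

```css
/* Give tap targets a comfortable minimum size and breathing room
   so neighboring links and buttons are not tapped by accident. */
nav a,
button {
  display: inline-block;
  min-width: 48px;
  min-height: 48px;
  padding: 12px;
  margin: 8px;
}
```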
