7 SEO Crawling Tool Warnings & Errors You Can Safely Ignore

In many cases, what an SEO crawler marks as a fatal error needs immediate attention – but sometimes, it's not an error at all.

This can happen even with the most popular SEO crawling tools such as Semrush Site Audit, Ahrefs Site Audit, Sitebulb, and Screaming Frog.

How can you tell the difference, so you avoid prioritizing a fix that doesn't need to be made?

Here are a few real-life examples of such warnings and errors, together with explanations as to why they may or may not be a problem for your website.

1. Indexability Issues (Noindex Pages on the Site)

Any SEO crawler will highlight and warn you about non-indexable pages on the site. Depending on the type of crawler, noindex pages may be marked as warnings, errors, or insights.

Here's how this issue is flagged in Ahrefs Site Audit:

Screenshot from Ahrefs Site Audit, September 2021

The Google Search Console Coverage report may also mark non-indexable pages as Errors (if the site has non-indexable pages in the submitted sitemap) or Excluded, even though they are not actual issues.

Again, this is only the information that these URLs cannot be indexed.

Here is what it looks like in GSC:

Screenshot from Google Search Console, September 2021

The fact that a URL has a "noindex" tag on it doesn't necessarily mean that it is an error.
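If you want to spot-check what the crawler reports, you can look for the robots meta tag in a page's HTML yourself. Below is a minimal sketch using only the Python standard library; the sample HTML and the function name are illustrative, not part of any crawler's API:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags on a page."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    """Return True if the page declares a noindex robots directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

# A login page that is deliberately kept out of the index -- not an error.
sample = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(sample))  # True
```

Keep in mind that a noindex can also be delivered via the X-Robots-Tag HTTP header, so a meta-tag check alone is not exhaustive.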
It only means that the page cannot be indexed by Google and other search engines.

The "noindex" tag is one of two possible directives for crawlers, the other one being to index the page.

Practically every website contains URLs that should not be indexed by Google.

These may include, for example, tag pages (and sometimes category pages as well), login pages, password reset pages, or a thank you page.

Your task, as an SEO professional, is to review the noindex pages on the site and decide whether they indeed should be blocked from indexing or whether the "noindex" tag may have been added by accident.

2. Meta Description Too Short or Empty

SEO crawlers also check the meta elements of the site, including meta description elements. If the site doesn't have meta descriptions, or if they are too short (usually below 110 characters), the crawler will flag that as an issue.

Here's what that looks like in Ahrefs:

Screenshot from Ahrefs Site Audit, September 2021

Here is how Screaming Frog displays it:

Screenshot from Screaming Frog, September 2021

Depending on the size of the site, it is not always possible and/or feasible to create unique meta descriptions for all of its webpages. You may not need them, either.

A good example of a site where it may not make sense is a huge ecommerce website with millions of URLs.

In fact, the bigger the site is, the less important this element gets.

The content of the meta description element, in contrast to the content of the title tag, is not taken into account by Google and does not influence rankings.

Search snippets sometimes use the meta description but are often rewritten by Google.

Here is what Google has to say about it in their Advanced SEO documentation:

"Snippets are automatically created from page content.
Snippets are designed to emphasize and preview the page content that best relates to a user's specific search: this means that a page might show different snippets for different searches."

What you as an SEO need to do is keep in mind that every site is different. Use your common SEO sense when deciding whether meta descriptions are indeed an issue for that specific website, or whether you can safely ignore the warning.

3. Meta Keywords Missing

Meta keywords were used 20+ years ago as a way to indicate to search engines such as AltaVista what keywords a given URL wanted to rank for.

This was, however, heavily abused. Meta keywords became a kind of "spam magnet," so the majority of search engines dropped support for this element.

By default, Screaming Frog always checks whether there are meta keywords on the site.

Since this is an obsolete SEO element, 99% of websites don't use meta keywords anymore.

Here's what it looks like in Screaming Frog:

Screenshot from Screaming Frog, September 2021

New SEO pros or clients may get confused, thinking that if a crawler marks something as missing, then this element should actually be added to the site. But that's not the case here!

If meta keywords are missing on the site you are auditing, it is a waste of time to recommend adding them.

4. Images Over 100 KB

It's important to optimize and compress the images used on the site so that a gigantic PNG logo that weighs 10 MB doesn't need to be loaded on every webpage.

However, it is not always possible to compress all images to below 100 KB.

Screaming Frog will always highlight and warn you about images that are over 100 KB.
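You can reproduce this check outside of a crawler by scanning your site's local image assets directly. A minimal sketch (the directory path and function name are placeholders, and the 100 KB threshold simply mirrors Screaming Frog's default):

```python
from pathlib import Path

THRESHOLD = 100 * 1024  # 100 KB, matching the crawler's default flag

def oversized_images(root: str, threshold: int = THRESHOLD):
    """Return (path, size) pairs for image files above the threshold, heaviest first."""
    exts = {".png", ".jpg", ".jpeg", ".gif", ".webp", ".svg"}
    hits = [
        (p, p.stat().st_size)
        for p in Path(root).rglob("*")
        if p.suffix.lower() in exts and p.stat().st_size > threshold
    ]
    return sorted(hits, key=lambda item: item[1], reverse=True)

# Hypothetical usage:
# for path, size in oversized_images("static/images"):
#     print(f"{path}: {size / 1024:.0f} KB")
```

Sorting heaviest-first mirrors the tip below: it surfaces the handful of genuinely huge files worth fixing, rather than every image that merely crosses the threshold.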
This is what it looks like in the tool:

Screenshot from Screaming Frog, September 2021

The fact that the site has images that are over 100 KB does not necessarily mean that the site has image optimization issues or is very slow.

When you see this error, make sure to check the site's overall speed and performance in Google PageSpeed Insights and the Google Search Console Core Web Vitals report.

If the site is doing okay and passes the Core Web Vitals assessment, then usually there is no need to compress the images further.

Tip: What you may do with this Screaming Frog report is sort the images by size from the heaviest to the lightest, to check whether there are some really huge images on specific webpages.

5. Low Content or Low Word Count Pages

Depending on the settings of the SEO crawler, most SEO auditing tools will highlight pages that are below 50-100 words as low content pages.

Here is what this issue looks like in Ahrefs:

Screenshot from Ahrefs Site Audit, September 2021

Screaming Frog, on the other hand, considers pages below 200 words to be low content pages by default (you can change that setting when configuring the crawl).

Here is how Screaming Frog reports on that:

Screenshot from Screaming Frog, September 2021

Just because a webpage has few words doesn't mean that this is an issue or an error.

There are many types of pages that are supposed to have a low word count, including some login pages, password reset pages, tag pages, or a contact page.

The crawler will mark these pages as low content, but this is not an issue that will prevent the site from ranking well in Google.

What the tool is trying to tell you is that if you want a given webpage to rank highly in Google and bring a lot of organic traffic, then this webpage may need to be quite detailed and in-depth.

This often includes, among other things, a high word count. But there are different types of search intent, and content depth is not always what users are looking for to satisfy their needs.

When reviewing low word count pages flagged by the crawler, always consider whether these pages are really meant to have a lot of content. In many cases, they are not.

6. Low HTML-Text Ratio

Semrush Site Audit will also warn you about pages that have a low text-HTML ratio.

This is how Semrush reports on that:

Screenshot from Semrush Site Audit, September 2021

This alert is meant to show you:

- Pages that may have a low word count.
- Pages that are potentially built in a complex way and have a huge HTML code file.

This warning often confuses less experienced or new SEO professionals, and you may need an experienced technical SEO pro to determine whether it's something to worry about.

There are many variables that can affect the HTML-text ratio, and it is not always an issue if the site has a low/high HTML-text ratio. There is no such thing as an optimal HTML-text ratio.

What you as an SEO pro may focus on instead is ensuring that the site's speed and performance are optimal.
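To demystify where such a number comes from, here is one rough way to approximate a text-to-HTML ratio yourself. Note that this is a sketch under simplifying assumptions – crawlers may count bytes, handle comments, or normalize whitespace differently:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.parts.append(data)

def text_html_ratio(html: str) -> float:
    """Visible text length divided by total HTML length (0.0 to 1.0)."""
    extractor = TextExtractor()
    extractor.feed(html)
    text = "".join(extractor.parts).strip()
    return len(text) / len(html) if html else 0.0

page = "<html><body><script>var x=1;</script><p>Hello world</p></body></html>"
print(f"{text_html_ratio(page):.0%}")
```

As noted above, there is no optimal target for this number; treat it only as a rough proxy for markup bloat, not as something to fix for its own sake.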
7. XML Sitemap Not Indicated in robots.txt

Robots.txt, in addition to being the file with crawler directives, is also the place where you can specify the URL of the XML sitemap so that Google can crawl it and index the content easily.

SEO crawlers such as Semrush Site Audit will notify you if the XML sitemap is not indicated in robots.txt.

This is how Semrush reports on that:

Screenshot from Semrush Site Audit, September 2021

At a glance, this looks like a serious issue, even though in most cases it isn't, because:

- Google usually doesn't have problems crawling and indexing smaller sites (below 10,000 pages).
- Google won't have problems crawling and indexing huge sites if they have a good internal linking structure.
- An XML sitemap doesn't need to be indicated in robots.txt if it has been correctly submitted in Google Search Console.
- An XML sitemap doesn't need to be indicated in robots.txt if it is in the standard location – i.e., /sitemap.xml (usually).

Before you mark this as a high-priority issue in your SEO audit, make sure none of the above is true for the site you are auditing.

Bonus: The Tool Reports a Critical Error That Relates to a Few Unimportant URLs

Even if the tool is showing a real issue, such as a 404 page on the site, it may not be a serious issue if only one out of millions of webpages on the site returns status 404, or if there are no links pointing to that 404 page.

That's why, when assessing the issues detected by the crawler, you must always check how many webpages they relate to, and which ones.

You need to give the error context.

Sitebulb, for example, will show you the percentage of URLs that a given error relates to.

Here is an example of an internal URL redirecting to a broken URL returning 4XX or 5XX, as reported by Sitebulb:

Screenshot from Sitebulb Website Crawler, September 2021

It looks like a fairly serious issue, but it only relates to one unimportant webpage, so it is definitely not a high-priority issue.

Final Thoughts & Tips

SEO crawlers are indispensable tools for technical SEO professionals. However, what they reveal must always be interpreted within the context of the website and your goals for the business.

It takes time and experience to be able to tell the difference between a pseudo-issue and a real one. Fortunately, most crawlers offer extensive explanations of the errors and warnings they display.

That's why it's always a good idea – especially for beginner SEO professionals – to read these explanations and the crawler documentation. Make sure you really understand what a given issue means and whether it is indeed worth escalating to a fix.

Featured image: Pro Symbols/Shutterstock

