Google on Effect of Low Quality Pages on Sitewide Rankings


In a Google Webmaster Hangout, someone asked whether the poor quality pages of a site could drag down the rankings of the entire site. The answer from Google’s John Mueller gave insight into how Google judges and ranks web pages and sites.

Do a Few Pages Drag Down the Entire Site?

The question asked if a section of a site could drag down the rest of the site.

The question:

“I’m curious if content is judged on a page level per the keyword or the site as a whole. Only a sub-section of the site is buying guides and they’re all under their specific URL structure.

Would Google penalize everything under that URL holistically? Do a few bad apples drag down the average?”

Difference Between Not Ranking and Penalization

John Mueller started off by correcting a misperception about penalties that was inherent in the question. Web publishers sometimes complain about being penalized when in fact they are not. What’s actually happening is that their pages are not ranking.

There is a difference between Google penalizing your page and Google looking at your page and deciding not to rank it.

When a page fails to rank, it’s generally because the content is not good enough (a quality issue) or because the content is not relevant to what the user searched for (a relevance issue). That’s a failure to rank, not a penalization.

A common example is the so-called Duplicate Content Penalty. There is no such penalty. It’s an inability to rank caused by a content quality issue.

Another example is the Content Cannibalization Penalty, which is another so-called penalty that is not a penalty.

Both relate to an inability to rank because of specific content issues, but they are not penalties. The solution to both involves identifying the cause and fixing it, just like any other failure to rank.

A penalty is something entirely different in that it is the result of a blatant violation of Google’s guidelines.

John Mueller Defines a Penalty

Google’s Mueller began his answer by defining what a penalty is:

“Usually the word penalty is associated with manual actions. And if there were a manual action, like if someone manually looked at your website and said this is not a good website then you would have a notification in Search Console.

So I suspect that’s not the case…”

How Google Defines Page-Level Quality

Google’s John Mueller appeared to say that, when it comes to ranking, Google tries to focus on page-level quality rather than overall site quality. But he also said this isn’t possible for every website.

Here is what John said:

“In general when it comes to quality of a website we try to be as fine grained as possible to figure out which specific pages or parts of the website are seen as being really good and which parts are kind of maybe not so good.

And depending on the website, sometimes that’s possible. Sometimes that’s not possible. We just have to look at everything overall.”

Why Do Some Sites Get Away with Low Quality Pages?

John’s answer is interesting, but it also leads to another question: why do some sites get away with low quality sections while others cannot?

I suspect, and this is just a guess, that it may be a matter of the density of low quality content within the site.

For example, a site might consist of high quality web pages but feature a section that contains thin content. In that case, because the thin content is confined to a single section, it might not interfere with the ability of the rest of the site’s pages to rank.

In a different scenario, if a site mostly contains low quality web pages, the good quality pages may have a hard time gaining traction through internal linking and the flow of PageRank through the site. The low quality pages could theoretically hinder a high quality page’s ability to acquire the signals necessary for Google to understand the page. The toy sketch below illustrates the internal linking side of that idea.
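To make the internal linking point concrete, here is a minimal sketch of the classic PageRank power iteration run on a made-up five-page site. The link graph, the page names, the damping factor, and the iteration count are all illustrative assumptions for this sketch, not anything Google has confirmed about its systems.

```python
# Illustrative only: a toy power-iteration PageRank over a hypothetical
# five-page site. Nothing here reflects Google's actual ranking systems.

DAMPING = 0.85      # damping factor from the original PageRank paper
ITERATIONS = 50     # enough for this tiny graph to converge

# Hypothetical site: the thin pages link mostly among themselves,
# so little link equity flows to the one high quality page.
links = {
    "home":          ["thin-1", "thin-2", "thin-3", "quality-guide"],
    "thin-1":        ["thin-2"],
    "thin-2":        ["thin-3"],
    "thin-3":        ["thin-1"],
    "quality-guide": ["home"],
}

def pagerank(links, damping=DAMPING, iterations=ITERATIONS):
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}       # start with uniform scores
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)     # split score across outlinks
            for target in outlinks:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{page:15s} {score:.3f}")
```

In this toy graph the thin pages keep circulating score among themselves, so the quality guide ends up with a lower score than it would on a site whose internal links pointed mostly at strong content. That is one mechanical reading of how a mostly low quality site could make it harder for a good page to gain traction.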

Here is where John described a site that may be unable to rank a high quality page because Google couldn’t get past all the low quality signals:

“So it might be that we found a part of your website where we say we’re not so sure about the quality of this part of the website because there’s some really good stuff here. But there’s also some really shady or iffy stuff here as well… and we don’t know like how we should treat things over all. That might be the case.”

Effect of Low Quality Signals Sitewide

John Mueller offered an interesting insight into how low quality on-page signals could interfere with the ability of high quality pages to rank. Of equal interest, he also suggested that in some cases the negative signals might not have that effect.

So if I were to take one idea from this exchange and put it in a bag to carry away with me, I’d select the idea that a site with mostly low quality content is going to have a harder time ranking a high quality page.

And similarly, a site with mostly high quality content is going to be able to rise above some low quality content that is separated into its own little section. It is, of course, a good idea to minimize low quality signals as much as you can.

Watch the Webmaster Hangout here.
