SMX Advanced Overtime: Your questions answered about webspam and penalties

Frédéric Dubut and Fili Wiese speaking at SMX Advanced in Seattle in June. This session was so popular they will be teaming up again to talk about the latest news with Bing and Google penalties and algorithms at SMX East in New York on Nov. 13.

Frédéric Dubut (lead of the spam team at Bing) and I spoke together in a first-ever Bing and ex-Google joint presentation at SMX Advanced about how Google and Bing handle webspam, penalties and algorithms. We did not have time to address every question from attendees during the Q&A, so we wanted to follow up here. Below are the questions submitted during our session about Google and Bing penalties, along with our responses.

Q: Did the disavow tool work for algo penalties or was it mostly for manual action?

A: The disavow tools from Bing and Google most definitely help with manual spam actions. In fact, the disavow tool is crucial for resolving link-related manual spam actions/penalties. At the same time, if your website has a history of active link building, the disavow tools are also a great way to get rid of those low-quality links that you can’t remove and that are now in violation of the Google or Bing Webmaster Guidelines. While there is no such thing as an algorithmic penalty from Google’s side, disavow link data will be used by both Bing and Google as a potential data point for testing the various algorithms that power rankings.
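To make this concrete, here is a minimal sketch of building a disavow file in the plain-text format Google’s disavow tool accepts (one URL or "domain:" entry per line, with "#" for comments). The link list and the domain-level threshold are illustrative assumptions, not a recommendation from either search engine:

```python
# Minimal sketch: generate a disavow.txt in the format Google's disavow tool
# accepts. The bad_links list and the threshold for escalating to a whole
# domain are hypothetical examples.

from urllib.parse import urlparse

# Hypothetical low-quality backlinks you asked to have removed and could not.
bad_links = [
    "http://spammy-directory.example/links.html",
    "http://paid-links.example/page1",
    "http://paid-links.example/page2",
]

def build_disavow(links, disavow_whole_domain_after=2):
    """Disavow individual URLs, or the whole domain once it appears repeatedly."""
    by_domain = {}
    for url in links:
        by_domain.setdefault(urlparse(url).netloc, []).append(url)

    lines = ["# Links we asked to have removed and could not"]
    for domain, urls in sorted(by_domain.items()):
        if len(urls) >= disavow_whole_domain_after:
            lines.append(f"domain:{domain}")
        else:
            lines.extend(sorted(urls))
    return "\n".join(lines) + "\n"

with open("disavow.txt", "w") as f:
    f.write(build_disavow(bad_links))
```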

Q: Thoughts or tips on combating spammy user posts in the UGC sections of a site (reviews, forums, etc.)?

A: Vigilance is key when combating user-generated spam and monitoring communities for brand-protection purposes. There are some quick and easy ways of mass reviewing or limiting abuse. For example, use CSRF tokens, or batch-review user submissions by loading the last 100 posts onto one page, skimming over them to find the abusive ones, then moving on to the next 100, and so on. You can also decide to always review any post containing a link before publishing it, or you can use commercial tools like Akismet or reCAPTCHA to limit spammer activity. If you don’t think you can commit any resources at all to moderating your UGC sections, you may also consider not allowing the posting of any links. It is important to remember that no tool will stop human ingenuity, which is why committing resources, including training employees to handle outreach, is a must if the risk associated with user-generated spam is to be reduced.
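As a rough sketch of the "review any post with a link before publishing" and batch-review ideas, the snippet below holds link-containing submissions in a moderation queue and pages through them 100 at a time. Function names, fields and the sample posts are illustrative only; plug in your own storage and workflow:

```python
# Minimal sketch: hold posts containing URLs for manual review before they go
# live, and skim the held queue in batches of 100. All names are illustrative.

import re

URL_PATTERN = re.compile(r"https?://|www\.", re.IGNORECASE)

def needs_review(post_text: str) -> bool:
    """Hold any post that contains a link for manual review before publishing."""
    return bool(URL_PATTERN.search(post_text))

def batch(queue, size=100):
    """Yield the moderation queue in pages of `size` for quick skimming."""
    for start in range(0, len(queue), size):
        yield queue[start:start + size]

# Example usage with hypothetical submissions.
posts = ["Great article!", "Buy cheap pills at http://spam.example", "Thanks for this"]
held = [p for p in posts if needs_review(p)]
published = [p for p in posts if not needs_review(p)]
for page in batch(held):
    print("Review these before publishing:", page)
```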

Q: How can you tell if someone buys links?

A: It is all about intent and trends. In general, it doesn’t take a thorough manual review of every single link to detect something suspicious. Most often, one quick look at the backlink data is enough to raise suspicions, and reviewing the backlink profile in detail then delivers the smoking gun.
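For illustration, here is one way that "quick look" could be automated: aggregate anchor text from a backlink export and flag anchors that dominate the profile, a common symptom of bought or manipulative links. The CSV layout (an "anchor_text" column) and the 20% threshold are assumptions for this sketch, not criteria used by either search engine:

```python
# Minimal sketch: flag anchor texts that account for an outsized share of a
# site's backlinks, based on a hypothetical backlink export CSV.

import csv
from collections import Counter

def suspicious_anchors(path, share_threshold=0.20):
    anchors = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchors[row["anchor_text"].strip().lower()] += 1

    total = sum(anchors.values())
    return [
        (anchor, count / total)
        for anchor, count in anchors.most_common()
        if total and count / total >= share_threshold
    ]

# Example usage: prints any anchor text making up 20% or more of all backlinks.
# for anchor, share in suspicious_anchors("backlinks_export.csv"):
#     print(f"{anchor}: {share:.0%} of all anchors")
```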

Q: With known issues regarding JavaScript indexing, how are you dealing with cloaking, since the fundamentals of most SSR and dynamic rendering solutions seem to mirror cloaking? Is it hard to tell malicious cases from others?

A: Actually, if we focus on the intent, that is, why a certain solution is put in place, it is rather easy. In a nutshell, if something is being done to deceive search engines by displaying substantially different content to bots versus users, that is cloaking, which is a serious violation of both the Google and Bing Webmaster Guidelines. However, if you want to avoid the risk of being misunderstood by search engine algorithms and at the same time provide a better user experience with your JavaScript-rich website, make sure that your website follows the principles of progressive enhancement.
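A simple self-check you can run on your own pages, assuming nothing about either engine’s internal tooling, is to fetch the same URL with a regular browser user agent and with a crawler user agent and compare the responses. Substantially different content for the bot is exactly the kind of discrepancy described above. The URL and the browser user-agent string below are placeholders:

```python
# Minimal self-check sketch: request the same page as a browser and as a
# crawler and compare the raw responses. This is an illustration, not an
# official tool from Google or Bing.

import urllib.request

URL = "https://www.example.com/"  # hypothetical page to check

USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

def fetch(url, user_agent):
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8", errors="replace")

responses = {name: fetch(URL, ua) for name, ua in USER_AGENTS.items()}
if responses["browser"] != responses["crawler"]:
    print("Responses differ by user agent; review whether the difference is substantial.")
else:
    print("Identical responses for both user agents.")
```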

Q: Can a site be verified in GSC or BWT while a manual penalty is applied?

A: Definitely. In the case of Bing Webmaster Tools, if you want to file a reconsideration request and don’t have an account yet, we highly recommend creating one in order to facilitate the reconsideration process. In the case of Google Search Console, you can log in with your Google account, verify your site as a domain property and see if any manual actions are applied anywhere on your domain.

Q: Is there a way that I can “help” Google find a link spammer? We have received thousands of toxic backlinks with the anchor text “The Globe.” If you visit the site to look for contact info they ask for $200K to remove the backlinks so we spend a lot of time disavowing.

A: Yes, absolutely. Google Webmaster Guidelines violations, including link spamming, can be reported to Google through a dedicated channel: the webspam report. On top of that, there are the Google Webmaster Help forums, which are also monitored by Google Search employees and where bringing such issues to their attention stands an additional chance of triggering an investigation.

To report any concern to Bing, including violations to Bing Webmaster Guidelines, you can use this form.

Q: Does opening a link in a new tab (using target=_blank) cause any issues / penalties / poor quality signals? Is it safe to use this attribute from an SEO perspective or should all links open in the current tab?

A: Opening a link in a new tab has zero impact on SEO. However, think about the experience you want to give your users when you make such decisions, as links opening in new tabs can be perceived as annoying at times.

Q: Should we be proactively disavowing scraper sites and other spammy-looking links that we find (not part of a black-hat link building campaign)? Does the disavow tool do anything beyond submitting leads to the spam team? Or are those links immediately discredited from your backlink profile once that file is updated?

A: Definitely, if this is a significant part of your website’s backlink profile. Spam links need to be dealt with in order to mitigate the risk of a manual penalty, of algorithms being triggered or even of undesirable attention from the Google or Bing Search teams. The disavow tool primarily serves as a backlink risk management tool for you, enabling you to distance your website from shady backlinks. However, a submitted disavow file is merely a suggestion for both Google and Bing and not a very reliable lead for active spam fighting. Whether search engines abide by the submitted disavow file, use it in part or ignore it entirely is up to each search engine.

Q: How is a cloaking penalty treated? At the page level or sitewide? Can it be treated algorithmically, or is it purely manual?

A: Cloaking is a major offense to both Google and Bing, given its utterly unambiguous intent: deceiving both the search engine and the user. Both engines target cloaking in several complementary ways: algorithmically, with manual penalties and through other means of action. The consequence of deceptive user-agent cloaking is typically complete removal from the index. Google and Bing try to be granular in their approach; however, if a website’s root is cloaking or the deception is too egregious, the action will be taken at the domain level.

Q: If you receive a manual penalty on pages on a subdomain, is it possible that it would affect the overall domain? If so, what impact could be expected?

A: It is possible indeed. The exact impact depends on the penalty applied and how it impairs a website’s overall SEO signals once it has manifested itself. This is something that needs to be investigated on an individual site level. If you end up in a situation where you have a penalty applied to your website, your rankings will be impaired and your site’s growth limited. The best course of action is to apply for reconsideration with the search engine in question.

Q: Do Bing and Google penalize based on out-of-stock inventory pages? For example, I have thousands of soft 404s on pages like these. What do you suggest as the best way to deal with products that go out of stock on large e-commerce sites?

A: No, neither Google nor Bing penalizes sites with large volumes of 404 Not Found pages. Ultimately, when you have any doubt about the legitimacy of a specific technique, just ask yourself if you’d be comfortable sharing it with a Google or Bing employee. If the answer is no, then it is probably something to steer clear of.

The problem here is that with a lot of soft 404s, search engines may trust your server and/or content signals significantly less. As a result, this can have a major impact on your search visibility. One of the best ways to deal with out-of-stock items is to use smart 404s, which offer users a way to still find suitable available alternatives to the currently unavailable item while serving a 404 HTTP status code or noindex to users and bots alike. Talk to an SEO professional to discuss the best strategy for your website, because there are a number of additional factors (e.g., the size of the website, the available products and the duration of unavailability) that can have a big impact on picking the right SEO strategy.
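As a rough illustration of such a smart 404, the sketch below keeps the out-of-stock page useful for users by listing available alternatives, while returning a 404 status code so the URL is not treated as a thin or soft-404 page. Flask, the route and the catalog lookups are assumptions for the example, not a prescribed implementation:

```python
# Minimal "smart 404" sketch for an out-of-stock product page.

from flask import Flask, abort

app = Flask(__name__)

# Hypothetical catalog: product id -> (name, in_stock, related ids)
CATALOG = {
    "red-shoes-42": ("Red shoes, size 42", False, ["red-shoes-41", "blue-shoes-42"]),
    "red-shoes-41": ("Red shoes, size 41", True, []),
    "blue-shoes-42": ("Blue shoes, size 42", True, []),
}

@app.route("/product/<product_id>")
def product(product_id):
    entry = CATALOG.get(product_id)
    if entry is None:
        abort(404)

    name, in_stock, related = entry
    if in_stock:
        return f"<h1>{name}</h1><p>In stock.</p>"

    # Out of stock: keep the page useful for users, but return a 404 status
    # so search engines do not index it as a thin/soft-404 page.
    links = "".join(f'<li><a href="/product/{r}">{CATALOG[r][0]}</a></li>' for r in related)
    body = f"<h1>{name}</h1><p>No longer available. You might like:</p><ul>{links}</ul>"
    return body, 404
```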

Have more questions?

Do you have more questions for us? You are in luck because at SMX East this year we will present the latest about Bing and Google penalties and algorithms. Be sure to join us at SMX East!


Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.


About The Author

Fili is a renowned technical SEO expert, ex-Google engineer and was a senior technical lead in the Google Search Quality team. At SearchBrothers he offers SEO consulting services with SEO audits, SEO workshops and successfully recovers websites from Google penalties. Fili is also a frequent speaker at SMX and other online marketing events.




