The Google Disavow Tool: The Ultimate Guide


Since its introduction in 2012, the Google Disavow tool has become an indispensable instrument for website operators and online marketers. It is the only way to proactively mitigate backlink risks and keep off-page signals in check. Used correctly, the Disavow tool is a one-way communication channel with Google. The wider application of the Disavow tool and the vast amounts of data it generates for Google remain a source of speculation. Its main purpose, at least from the website operator's point of view, is to dissociate a site from PageRank-passing backlinks that could otherwise reduce its organic search visibility. However, when used carelessly, the Disavow tool can spell doom for Google rankings.

This guide, informed by the author's experience working for Google Search, both penalizing offending sites and lifting Google penalties, is an attempt to clarify common misunderstandings surrounding the Google Disavow tool and its application. At the same time, it answers commonly asked questions and provides a plan for anyone considering using the Disavow tool to improve their website's signals.

To disavow or not to disavow?

Obviously, there is a central question: does every website need a disavow file? The unequivocal answer is a resounding no! Not every website needs a disavow file. In fact, in the grand scheme of things, relatively few websites need to actively monitor and manage their backlinks. Sites such as personal blogs, government platforms, charities, non-governmental organizations, even small niche or local stores have no acute need to disavow spam links. The reason is that these websites rarely engage in link building. Many thrive on direct traffic and have neither the means nor the desire to improve their Google rankings. Often, their target audience is already aware of and familiar with their presence. Therefore, type-in traffic represents almost all of the traffic they enjoy. As a result, they rarely actively pursue PageRank-passing backlinks that Google, in turn, could frown upon. With relatively few backlinks in total, disavowing is practically a non-issue.

It is a completely different situation with commercial sites, such as online stores, price comparison platforms, marketplaces, media portals or big brands. Their inherent business intent makes them prone to optimization, which may or may not always be in line with the Google Webmaster Guidelines. Google remains firmly opposed to unearned, PageRank-passing backlinks. Therefore, backlink risk management is an essential part of conducting online business for these websites. They must use the Disavow tool as a shield to protect their organic rankings.

There is also a situation in which every affected site must use the Disavow tool in its defense: when there is a manual Google spam action, such as a manual penalty for link building. As a general rule, any Google penalty should be addressed as soon as possible. A penalty for backlinks, however, should be addressed immediately, as it progressively affects the website's position in Google search results.

Any website hit by a manual penalty due to spam links must use the Disavow tool.

When to disavow?

There are a number of important factors, but only two main ones to consider when assessing whether backlinks may be a liability for a website: the volume of backlinks received and their quality. The first indicator can often be measured almost instantly by looking at the total number of backlinks reported in trusted third-party tools, such as Majestic. For example, the minimum number of backlinks ever recorded for example.com is the fresh and historical totals combined. Although no single tool can provide an exact figure, roughly 300 million backlinks is a substantial number that, in the case of a commercial website, can justify reviewing and updating the disavow file. Similarly, the other data collection tools recommended later in this guide are unlikely to detect all the backlinks that have ever existed. Like all commercial tools, they may not be able to identify private blog networks (PBNs) created specifically to avoid third-party detection. PBNs, however, are a flawed concept from the start. Because the link building is done explicitly for Google, the backlinks must be detectable by Google, and they therefore always represent a clear liability for a site's rankings.

Google Search Console, while indispensable in this process, cannot be considered a definitive data source due to its reporting limitation, which caps samples at 100,000 links. That said, when looking for a tangible threshold, 100,000 links can serve as a rule of thumb. Fewer backlinks may not warrant the effort involved in disavowing; more might.

The second main indicator – quality – is significantly less simple to determine, much less to evaluate accurately. Backlink quality depends on the type of anchor text used, the distribution of anchor texts, the quality of the content surrounding the backlink anchors, and where else the same page links. In short, it can only be analyzed by experienced human experts, armed with powerful, purpose-built tools that help speed up the process. However, no tool can fully replace this labor-intensive and detail-oriented approach. Manual analysis and investigation of backlink quality requires crawling the backlink data as a critical first step.

Short of going through the entire process, there is an additional indicator that can help assess how acute a risk a backlink profile may pose: anchor text distribution. Although there are no hard thresholds to watch for, the rule of thumb is that the more the top ten anchor texts appear optimized for the specific products or services offered, the more likely it is that PageRank manipulation was attempted at some point, and therefore the more risk both legacy and fresh backlinks pose. Several tools provide information in this regard, including Ahrefs and Majestic, with varying depth. However, evaluating all anchor texts beyond the top 10 is usually superfluous, as commercial anchors tend to surface anyway.
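For illustration, here is a minimal sketch of how the top-10 anchor text distribution can be computed from a backlink export. It assumes a CSV with an "anchor" column; the file name and column name are hypothetical, as real exports differ from tool to tool.

```python
# A sketch: compute the top-10 anchor text distribution from a backlink
# export. Assumes a CSV with an "anchor" column (column names vary per tool).
import csv
from collections import Counter

def top_anchors(path, n=10):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchor = (row.get("anchor") or "").strip().lower()
            if anchor:
                counts[anchor] += 1
    total = sum(counts.values()) or 1
    # Report each anchor's share of all counted anchors.
    return [(a, c, 100 * c / total) for a, c in counts.most_common(n)]

for anchor, count, share in top_anchors("backlinks.csv"):
    print(f"{share:6.2f}%  {count:8d}  {anchor}")
```

If the resulting top entries are dominated by commercial, product-level phrases rather than brand names or URLs, that is the warning sign described above.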

A top 10 anchor text distribution that is heavily commercial, as in this example, is usually a strong indicator of backlink liability.

What to disavow?

Any backlink analysis needs to start with aggregating as much relevant backlink data as possible. Google Search Console backlink samples are an important first step in this regard. As mentioned earlier, they are limited, which effectively reduces their informational value for websites with substantial backlink profiles of 10 million backlinks or more. While it is possible to boost GSC sample output by verifying many properties in addition to domain ownership, such as with and without www., both https and http, and combinations of all of the above, plus possibly other subdomains or directories, these steps remain a workaround. Google's continued insistence that GSC samples are sufficient for any eventuality holds true, but only for relatively small websites. Sites that have accumulated substantial backlink profiles over time should not rely solely on GSC samples. An effective but time-consuming option is to collect backlink data samples from other search engines' webmaster tools.

Similar to Google Search Console, the Bing Webmaster Tools beta provides a free look at backlink data. On a side note, Bing also allows webmasters concerned about their Bing rankings to disassociate their sites from unwanted links, not unlike Google. However, this ultimate Google disavow guide focuses exclusively on Google best practices.

Other search engines' webmaster tools, such as Yandex or Baidu, can also be useful for collecting backlink data over a period of time. However, the former seems to take a long time to accumulate and display backlink data, while the latter poses a formidable language challenge for many site operators.

As mentioned, none of the search engine data samples, alone or even combined, is complete enough when reliable data is needed. In a critical situation, it is of the utmost importance that third-party data samples are also considered in order to generate a sufficient data sample for actionable results. Here again, ideally a multitude of different tools, such as Ahrefs, Majestic, SEMrush, Ryte and LinkResearchTools, should be used to accumulate and verify as much backlink data as possible. However, this redundant approach costs money. The few high-quality services available come at a price that can become substantial once individual backlink data exports exceed 100 million backlinks. This modest number also dispels any lingering doubts as to whether Google Search Console backlink samples might be sufficient for a backlink analysis. Capped at 100,000 backlink examples, Google Search Console provides a sample of just 0.1% of a 100 million backlink profile. Hardly enough to comprehensively clean a website of problematic links.

Once the data collection process is completed, the collected samples are deduplicated and filtered to expedite the manual review process that follows. Zero-impact backlinks, e.g. those pointing to landing pages that are excluded by the site's robots.txt, or backlinks carrying a rel="nofollow" attribute, can be dropped immediately. These are irrelevant from a spam risk assessment perspective.
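A minimal sketch of this deduplication and filtering step follows. It assumes each tool export is a CSV with "source_url", "target_url" and "nofollow" columns; these column and file names are hypothetical stand-ins, as real exports differ per tool.

```python
# A sketch: merge multiple backlink exports, deduplicate them, and drop
# zero-impact links (nofollow, or robots.txt-excluded landing pages).
import csv
from urllib import robotparser
from urllib.parse import urlparse

_robots = {}  # cache one robots.txt parser per origin

def robots_blocked(target_url):
    """True if the linked landing page is excluded by the site's robots.txt."""
    origin = "{0.scheme}://{0.netloc}".format(urlparse(target_url))
    if origin not in _robots:
        rp = robotparser.RobotFileParser(origin + "/robots.txt")
        rp.read()
        _robots[origin] = rp
    return not _robots[origin].can_fetch("Googlebot", target_url)

def merge_and_filter(paths):
    seen, kept = set(), []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                source = row["source_url"].strip()
                if source in seen:
                    continue  # deduplicate across tool exports
                seen.add(source)
                if row.get("nofollow", "").lower() == "true":
                    continue  # nofollow links pass no PageRank
                if robots_blocked(row["target_url"]):
                    continue  # robots.txt-excluded landing pages: zero impact
                kept.append(row)
    return kept

links = merge_and_filter(["gsc.csv", "ahrefs.csv", "majestic.csv"])
print(f"{len(links)} backlinks left for manual review")
```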

The rel="sponsored" and rel="ugc" attributes, which for the time being remain little-adopted novelties, do not require a differentiated approach. They are exclusive to Google and not recognized by any other search engine, which is why, if used at all, they are best applied in tandem with the industry-wide nofollow standard.

Similar to zero-impact backlinks, there are websites that can legitimately cross-link, using even the most commercial anchor texts imaginable, because they share ownership. In other words, a website operator or organization is free to link among its own websites without risking a violation of the Google Webmaster Guidelines. Hence, the operator's entire portfolio of websites/domains should be excluded from further analysis.

In contrast, subjectively high-authority sites or reputable brands should not be given a similar pass. There are many misconceptions around authority, most revolving around an elusive DA or domain authority value. The term, so often thrown around, is actually irrelevant to Google. Experience shows that sites and brands presumed to be compliant frequently violate Google's linking policies. A fact that has not been lost on the Google team. Therefore, backlinks that come from what may be considered reputable sources should not be evaluated any differently.

Equally, high-risk backlinks that have since been removed, changed or set to nofollow must nevertheless be included in the assessment process. If they are deemed a threat, they should all be included in the disavow file, despite the fact that they may already have been changed. Google may not have recrawled them yet. As a result, the changed signals may not be reflected in Google's data and algorithms, making these backlinks a persistent threat. Whether and when these backlinks are revisited depends on individual crawl budget allocation and management. And that can take a while, especially for low-quality sites.

Once the analysis begins, the backlinks' purpose, why they came into existence, and not their ownership or origins, remains at the center of the investigation. Google does not ask or care how exactly the spam links came about, who created them, or when. Therefore, there is no need to record such additional information for further processing or documentation.

With fresh backlink data at hand, high-probability spam links can be grouped. While every site's backlink profile is different and in constant flux, there are backlinks that can be considered a liability with 100% certainty and safely disavowed. Template spam, including automatically generated sites with no content, scraped content, or spun content, falls firmly within this group and should be disavowed.

High-risk TLDs

No TLD can be considered 100% spam. None should be disavowed merely because of the TLD of the linking sites. However, there are some TLDs that tend to attract disproportionate amounts of spam. The Spamhaus Project's 2020 Top 10 statistics are quite relevant in this regard, even if the list is not exhaustive. Backlinks from websites hosted on domains such as .tk, .gq, .top, .ml or .loan, to name just a few of the many TLDs, cannot be rejected outright as spam, but can be filtered and grouped in the backlink review process. Frequently repetitive patterns in their domain names or URL structure, identical templates and content, and easily recognizable boilerplate help to carry out a rapid risk assessment.
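As a minimal sketch of this grouping step, the snippet below flags links from a TLD watchlist for closer manual review. The watchlist mirrors the TLDs named above and should be extended as needed; the sample URLs are made up for illustration.

```python
# A sketch: group backlinks by the linking site's TLD to surface clusters
# worth a closer manual look. Flagging is not the same as disavowing.
from collections import defaultdict
from urllib.parse import urlparse

WATCHLIST = {"tk", "gq", "top", "ml", "loan"}

def group_by_watchlist_tld(source_urls):
    groups = defaultdict(list)
    for url in source_urls:
        host = urlparse(url).hostname or ""
        tld = host.rsplit(".", 1)[-1].lower()
        if tld in WATCHLIST:
            groups[tld].append(url)
    return groups

sample = ["http://cheap-pills.example.tk/p1", "https://news.example.com/story"]
for tld, urls in group_by_watchlist_tld(sample).items():
    print(f".{tld}: {len(urls)} link(s) queued for manual review")
```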

The situation is similar, but a little more nuanced, with spam-ridden free hosting services. Every free host inevitably includes a few low-quality websites. However, some notoriously fail to get rid of automatically generated spam, which in the past has triggered collective penalty responses from Google. A few publicly observed cases prove that Google did and does care about free hosts. It is not necessary to preemptively include free hosting services in the disavow file; however, when dodgy hosted websites are identified during the backlink analysis, it pays to err on the side of caution. Free-hosted spam backlinks justify including the entire service in the disavow file using the domain: operator.

Some TLDs are prone to attracting spam websites that need to be disavowed.

Expired domain backlinks

While investigating backlink risk levels, some sites, or in this case domains, are more easily recognized as spam, and therefore harmful, than others. Expired domains are previously legitimate websites that were abandoned by their original operators, only to be revived with scraped or template content in the hope of benefiting from the reputation built in the past. These are a clear violation of the Google Webmaster Guidelines and smoking-gun black hat SEO. Consequently, all expired domain backlinks must be disavowed. Such sites are almost universally automatically generated and are therefore easy to spot and filter. Where there is even a shadow of a doubt, the Internet Archive offers an invaluable, free service showing most previous incarnations of a site.

An example of a legitimate website before the domain expired.
And the same site after the domain expired. Such websites are generally disavowed.

Links from hacked sites

Similar to expired domains, all legitimate but compromised, i.e. hacked, sites that unwittingly link through injected code without the legitimate operators' consent must be disavowed. Since this method is declining as a trend, even very large backlink profiles will usually contain only a small number of hacked websites. This is the one group that can be periodically reviewed to reassess the situation. Sites that have been cleaned of injected content and backlinks can be safely removed from the disavow file, but this is an optional step.

Press release backlinks

PageRank-passing press release links, especially those containing commercial anchor texts, are specifically called out in Google's link guidelines and must all be included in the disavow file. Google has repeatedly highlighted its position on press release link building and maintains that it is a clear violation of the Webmaster Guidelines. Veteran Google employees, including John Mueller, have repeatedly reiterated this specific point.

Affiliate backlinks

Google does not look unfavorably on affiliate websites in general. However, PageRank-passing affiliate backlinks are considered a liability because they are not earned in Google's terms. Consequently, when managing backlink risks, all affiliate backlinks must be included in the disavow file. Sites with coupons or special offers, as well as price comparison platforms, deserve special attention. By far not all of these legitimate services choose to deliberately ignore Google's linking guidelines, but the ultimate responsibility for verification rests, as with all backlinks, with the operator of the site whose Google rankings are at stake.

Directory backlinks

All SEO directory links must be disavowed. There are currently no legitimate reasons to make exceptions. There are countless giveaways betraying the sole purpose of these directories' existence, which is passing PageRank. These include the fact that almost no directory is moderated, that they lack topicality or any oversight, and that often even the domains used highlight SEO or links rather than any editorial value. Although frequently and correctly viewed as a legacy issue, directory links remain a liability for websites even many years later.

Directories have lost their credibility and should be disavowed.

Off-topic forum backlinks

As mentioned earlier, PageRank-passing spam links have no expiration date. As long as they exist, they remain a threat. This is why even truly antique black hat link building methods, such as SEO directories and off-topic forum spam, need to be included in the analysis. The latter in particular offers serious damage potential, with effective software solutions such as XRumer on the market for over a decade. Of course, not all forum backlinks are harmful to the targeted website. On the contrary, relevant, community-driven and moderated forum references can be wonderful. And most importantly, they drive and convert traffic. However, in a manual review they are easily separated from spam entries posted by junior members with little substance and no standing in the community. This latter type of spam backlink must be disavowed.

Paid blog backlinks

While the types of link spam mentioned above are usually quickly identified based on similarities, it is paid blog posts that require closer examination. The decisive factor in determining the risk that accumulated paid blog posts present is intent. In other words, where the quality of the linking site, the depth of its content, its unrestricted linking policies, the type of anchor texts used, and the landing pages the backlinks point to suggest an intent to benefit from PageRank passing, such backlinks present a risk. They must be disavowed. While paid blogs rarely stand out in terms of quality, those relying on proven CMS and reliable, inexpensive hosting, such as paid WordPress blogs, do not necessarily display the level of uniformity that allows efficient filtering in the review process. Like all the spam types specifically mentioned in this guide, paid blog backlinks require manual review in order to prepare a risk assessment and build a conclusive disavow file.

If in doubt, there are a few questions that can help determine whether a backlink is a contributing risk factor and should be disavowed:

  • Is the backlink passing PageRank?

If it is not, either because of a rel="nofollow" attribute or because the landing page is excluded by robots.txt, it is not a problem. All subsequent questions become obsolete. If the backlink does pass PageRank, asking the following questions can help:

  • Was the link paid for?
  • Did you create and control the backlink?
  • Was the backlink generated automatically?

If any of these questions is answered in the affirmative, the backlink is likely a risk factor.

Finally, the key question that can help determine whether a backlink is legitimate in Google's eyes or not is simply: Would you be comfortable sharing this link with a competitor or a Google Search employee? This last question is not infrequently able to sort the wheat from the chaff.
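For illustration, here is a minimal sketch that encodes the checklist above as a triage helper. The Link type and its fields are hypothetical stand-ins for the flags a manual review would produce; the final "comfortable sharing" question remains a human judgment call.

```python
# A sketch: triage a backlink against the decision questions above.
from dataclasses import dataclass

@dataclass
class Link:
    passes_pagerank: bool   # False for nofollow / robots.txt-excluded targets
    paid: bool              # Was the link paid for?
    self_created: bool      # Did you create and control the backlink?
    auto_generated: bool    # Was the backlink generated automatically?

def is_risk(link: Link) -> bool:
    if not link.passes_pagerank:
        return False  # all subsequent questions become obsolete
    return link.paid or link.self_created or link.auto_generated

print(is_risk(Link(True, True, False, False)))   # True: paid, passes PageRank
print(is_risk(Link(False, True, True, True)))    # False: passes no PageRank
```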

How to disavow?

Once the analysis is complete, a few important steps remain. Google provides some guidance regarding formatting and disavow file limitations. The two most obvious limits, a maximum file size of 2 MB and a total of 100,000 lines, are rarely reached by all but the most link-spammed websites. Avoiding individual, granular URL patterns and always disavowing at site level using the domain: operator (e.g. domain:example.com) is best practice. The correct file format, a .txt document that must be UTF-8 or 7-bit ASCII encoded, remains the last important point to consider. As long as it ends in .txt, the file name used is irrelevant.
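The limits above lend themselves to an automated pre-submission sanity check. The sketch below is one such minimal check, based solely on the constraints just described; the file name is an example.

```python
# A sketch: sanity-check a disavow file against the documented limits:
# a .txt file, UTF-8 or 7-bit ASCII, at most 2 MB and 100,000 lines,
# preferring domain: entries over individual URLs.
import os

def check_disavow_file(path):
    problems = []
    if not path.endswith(".txt"):
        problems.append("file name should end in .txt")
    if os.path.getsize(path) > 2 * 1024 * 1024:
        problems.append("file exceeds the 2 MB limit")
    with open(path, encoding="utf-8") as f:  # raises if not valid UTF-8
        lines = [l.strip() for l in f if l.strip() and not l.startswith("#")]
    if len(lines) > 100_000:
        problems.append("file exceeds the 100,000 line limit")
    urls = [l for l in lines if not l.startswith("domain:")]
    if urls:
        problems.append(f"{len(urls)} individual URL entries; prefer domain:")
    return problems

for issue in check_disavow_file("disavow.txt"):
    print("WARNING:", issue)
```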

Website operators can check the validity of their disavow file using the free Disavow File Testing Tool, built by former Google engineer and fellow Search Engine Land contributor Fili Wiese, a colleague of the author. However, no tool can detect contextual errors, such as disavowing one's own domains, which can and should legitimately interlink. This is why a final check should ensure that relevant, legitimate domains belonging to the same site operator are not included.

A new disavow file should be tested before submission.

Another last-step hack is to submit the disavow file individually for all verified Google Search Console properties, including all combinations, such as with and without www., https and http, as well as any other subdomains and directories. While Google officially recommends focusing exclusively on the primary property, this seemingly redundant effort can help protect a site from unwanted links in the rare case where Google's systems fail, whether temporarily or critically.

A completed and double-checked disavow file is best submitted without delay. All websites' backlink profiles are constantly evolving and changing. New backlinks appear, desirable ones as well as spam backlinks. This is why disavow files degrade over time, a process that happens faster for websites with already sizable backlink profiles. A completed disavow file is therefore a temporary remedy, the value of which decreases over time.

A new or updated disavow file is best submitted individually to all verified GSC properties.

What's next?

For Google, an individual disavow file is a recommendation from the website operator, not a directive. Whether Google chooses to follow this recommendation in whole or in part is not disclosed through Google Search Console. For the website operator, the same disavow file is a temporary protective shield. How long it can provide a certain level of protection depends on subsequent backlink growth and its quality. While there is no hard rule to follow, a disavow file needs to be reviewed and updated based on new data at least once a year, as part of an annual maintenance cycle. When that iteration is due, previously disavowed and newly detected spam backlink patterns must be combined into one new disavow file before uploading. Simply uploading the new patterns alone would inevitably and irreversibly overwrite the previously submitted spam backlink patterns and thus nullify the good work of the past.
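A minimal sketch of this merge step follows, so an upload never silently drops earlier entries. The file names are examples.

```python
# A sketch: combine the previously submitted disavow file with newly
# detected patterns into one deduplicated file before uploading.
def merge_disavow(old_path, new_path, out_path):
    entries, seen = [], set()
    for path in (old_path, new_path):
        with open(path, encoding="utf-8") as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or line in seen:
                    continue  # skip blanks, comments and duplicates
                seen.add(line)
                entries.append(line)
    with open(out_path, "w", encoding="utf-8") as out:
        out.write("\n".join(sorted(entries)) + "\n")

merge_disavow("disavow_last_year.txt", "disavow_new_patterns.txt", "disavow.txt")
```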

At the same time, a submitted disavow file has no impact on the traffic and conversions coming through the included links. In other words, while the backlink risks are mitigated, any traffic the same suspicious backlinks may drive remains unaffected.

Updating a disavow file can lead to an increase in traffic.

There are no reliable methods for predicting the traffic trajectory after disavowing spam backlinks in Google. Three scenarios are conceivable: website traffic can stagnate, increase sharply as backlink ballast is shed, or decrease significantly. Disavowing questionable links alone does not guarantee a traffic increase, however. That is what a technical on-page and off-page audit is for. Disavowing merely helps keep a site's off-page signal input in check. At the same time, the very fact that Google penalizes websites for building PageRank-passing links demonstrates beyond doubt that link building, including risky link building in violation of the Google Webmaster Guidelines, can work. Above all, PageRank-passing links as a ranking factor are here to stay and should be considered an important SEO signal, especially for content discovery, search crawler prioritization, user navigation and traffic conversion. There are also risk-averse alternative approaches to acquiring backlinks, as outlined in the author's article on how to build links, which offers a new perspective. One that, if fully embraced, can also help reduce the need to disavow backlinks in the future.


The views expressed in this article are those of the guest author and not necessarily those of Search Engine Land. Staff authors are listed here.


About the author

Kaspar Szymanski is a founding member of Search Brothers and a well-known search expert specializing in recovering websites from Google penalties and helping websites improve their rankings through SEO consulting. Prior to founding SearchBrothers.com, Kaspar was part of the Google Search Quality team, where he was a driving force behind global initiatives to combat web spam. He is the author of the ultimate guide to Google penalties and part of the Ask the SMXperts series.



