On October 17th, Google released the much-discussed ‘Link Disavow’ tool in Google Webmaster Tools (GWT).
This seemingly innocent tool allows webmasters to report links that they don’t want ‘counted’ towards their SEO/organic ranking, so that they can get rid of ‘unwanted’ links. The webmaster community seems to be raving about it, but I don’t think many people have really considered the huge ramifications.
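For reference, the tool doesn’t take individual link reports through a form; you upload a plain-text file, one entry per line, where a bare URL disavows a single page and a ‘domain:’ prefix disavows every link from that domain (lines beginning with ‘#’ are comments). A minimal example file (with made-up entries) might look like:

```text
# Links I believe are harming my ranking (hypothetical entries)
http://spammy-directory.example.com/listing/123
domain:link-farm.example.net
```

Of course, compiling a file like this is precisely the sort of ‘confession’ I discuss below.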
There are several disturbing issues that this tool raises. First: how does any webmaster know WHICH links to disavow?
Knowing which links are contributing to your ranking, and which may be hurting it, is an almost impossible question. In reality, most webmasters will pick the ones that they suspect ‘could’ be bad. Innocent webmasters everywhere will now be disavowing whole swathes of links that they’re worried might have caused their ranking drops since Penguin & Panda. They’ll inadvertently trash their own rankings by removing many of the links that were actually supporting their position. You’ll notice that Google doesn’t actually tell you WHICH links are causing problems; they want you to lash out in the dark and ‘own up’ to all the link-building you’ve been doing.
If Google REALLY wanted to help webmasters, they’d stop this ridiculous practice of reverse-weighting links, which caused a whole new industry of negative SEO to spring up. Then you wouldn’t need to disavow links, as they just wouldn’t count in the first place. Essentially, Google have added a tool to ‘fix’ a problem that they created (but one which they can now claim is combating negative SEO, and thus earns them political brownie-points).
Now, not only will they know that you’re doing SEO (which they don’t like at all) and be able to track your sites more closely (especially if you disavow a lot of links), but they’ll also know all the sites you’ve been obtaining links from. In one swift move, they’ll have used ordinary webmasters to highlight most of the SEO resources that were available to them.
The only real argument for the tool is to disavow links that you haven’t created yourself, if you’ve been the victim of SEO vandalism or negative SEO; but this wouldn’t even be necessary if they didn’t enable negative SEO in the first place!
The key issue here is that in using this tool, you’re owning up to using SEO and link-building for rank manipulation. And most webmasters won’t be disavowing a couple of links, but hundreds or thousands. Just think of the ‘SEO intelligence’ this will provide Google on a plate. I strongly suspect that we’ll see another run of manual ‘unnatural link’ penalty notices being sent out by Google over the coming months, as Google analyse your link-building history (which you’ve now owned up to!). The manipulation will be so obvious at this point that Google will possibly ‘tag’ sites that have disavowed volumes of links for full manual review.
I believe the sites that will be hit the hardest are those with a lot of links from spammy, content-less pages featuring large volumes of outbound links, i.e. blog comments, forum mini-posts etc. These will be the links that people disavow the most, as they’ll believe them to be the low-quality ones. Proper articles, well-written content, Web 2.0 properties and press releases of a decent size (400-500 words or more) with only a few contextual links will be seen as the higher-quality links.
My honest advice is to completely ignore this ‘tool’ (which is actually a ‘honey-pot’ or trap); it gives way too much information to Google about you and your link-building history and habits…
And to protect yourself going forward, do what I’ve been preaching for years now: use large Web 2.0 properties, press releases and public sites (which can’t be penalised en masse, as they have so many legitimate uses) as 1st-tier web assets with links that point to your sites. Then build a bunch of your usual links at these assets. By inserting a ‘buffer’ tier, you’ve removed the necessity for such large-scale direct linking, which is invariably what causes ranking/SEO issues. This should have been your priority ever since Panda 3.3 at the end of February, so nothing has really changed in that respect.
…And of course, ALL your direct linking should come entirely from good-quality content (at least 400 words) with minimal outbound contextual linking, and it should be from highly diverse sites and IP addresses with naturalised anchor text.
I’ve generally taken the stance of avoiding any use of GWT, or registering any of my sites with the service, as the minimal benefits are far outweighed by the information you’re freely providing Google. I, for one, don’t want to make it any easier for Google to determine my site footprints and ownership. If you wish to use the ‘disavow links’ tool, you’ll need to be registered with GWT; which is yet another reason to avoid the service as far as I’m concerned.
Good luck with your SEO!