Learning From The Panda 3.3 Update

Written on: March 19, 2012

We’re still learning about the effects of the Panda 3.3 update, and every day seems to bring new and sometimes painful news. Our Panda 3.3 Update post covered some of the background and the basic rules that need (or ‘needed’, since many people have already fallen foul of these problems) to be followed to protect yourself.

First, understand that Google are a commercial organisation who make billions of dollars from advertising. Whatever anyone says, it’s NOT in Google’s commercial interests to continue to allow natural rankings to perform as well as they have in the past. Their PPC revenue directly increases every time there is a Google algorithm ‘slap’, as marketers scramble to regain their traffic. And the changes to their algorithm seem to be getting ever more heavy-handed – exactly at the time when their financial targets are not being met.

The scariest effect of all these changes is the movement towards accepting a kind of ‘negative’ SEO. That is, in their haste to stop heavy spamming, Google have instigated algo’ changes which indiscriminately attempt to add ‘reverse-weighting’ to links they don’t like. This has already spawned a new Black-Hat industry where you can hire teams of unethical spammers to try and ‘knock your opponent out’ by huge-scale mass-spamming of links to their site (hundreds of thousands and even millions of links), usually using exactly the same techniques that we recommend you avoid in our other posts (over-optimisation, excessive home-page linking, etc.). This can be particularly effective against newer and more fragile sites, although it does sometimes backfire on well-established sites – by further strengthening their rankings.

Google never used to work this way; if they didn’t like a link, they just ignored it. This was the safe and responsible thing to do. By adding reverse-weighting, they’ve opened up a can of worms that we can only hope will be closed by market pressure very soon.

So, what can we do to protect ourselves going forward, in light of what we’ve discovered about Panda 3.3? And what advice do we have for those dealing with penalisation, ranking drops and Google Notices?

Using Buffer Sites & Web 2.0’s

The simplest and most direct way of adding a layer of protection is to build an ever-increasing number of buffer sites to sit between any commercial link-building you do and your main sites.

If you build 10-20 ‘asset-quality’ Web 2.0’s each month, and have a couple of links on each that point to your key sites, then these act as ideal buffer sites. You can then fire a large quantity of links at them, because they’re public properties.

And when we say ‘asset-quality’, we mean on-topic, with good-quality content and construction, and low outbound linking. You should consider these as extensions of your own sites, and treat them with the same care and attention. Remember, you want these to stick long-term and to project the kind of image that will easily pass any manual Google inspection. They will also obviously need to be built manually to meet these standards. They don’t have to be huge – 1,000-1,500 words of quality original copywriting spread over 3 posts is ample.

Any other high-PR or high-trust links that you’ve had for a long time (1 year plus) are also good candidates, as these pages are well and truly tested by Google’s algorithm.

We’ve always highly recommended the use of 2.0’s and buffer sites, as our long-term clients will know only too well! But now, we feel it’s very important to point out that this probably provides the only remaining simple level of protection from Google’s algo’ changes.

From now onwards, we very strongly urge everyone to start using 2.0’s or other authority pages as ‘insulation’ from the vagaries of Google’s decisions. This will also automatically protect you from a range of other SEO ‘mistakes’, and should therefore become part of EVERYONE’s future SEO strategy. This leads us onto…

Expanding Your Web-Estate & Lead/Sales Generation Funnels

You should aim to add to these Web 2.0’s and authority pages over time; and don’t see them as one-offs. Building more and more of these will expand your web-estate, and you’ll find that these actually start to rank ‘in their own right’ over time. Expanding this collection of external assets will eventually provide a wide base of mini-sites that you can freely back-link to, without any worry of future problems.

NOTE: You can also use lower-quality ‘disposable’ 2.0’s (that are essentially glorified articles) as backlinks to power-up your asset 2.0’s. And then these disposable 2.0’s can have a small shot of 50-100 links pointed at them for indexing and juicing-up. These disposable 2.0’s can have more posts and more outbound links, and you’ll worry less about deletion, as you won’t be building many links to them.

Then you need to start expanding your ‘owned’ assets. One site is always at risk; ten sites spreads that risk greatly. Put it this way: if you generated 10% of your leads or traffic from each of 10 separate sites, how much more protected are you? Contrast that with owning a single site, where one URL generates all the traffic for a keyword or long-tail. A single Google slap will wipe you out in one move.

You probably have a good idea of which keywords are worth ranking for, but you should constantly revisit and expand this list; it’s surprising how things change in the marketplace. Then set about building some smaller themed information sites that cover a portion of your services (maybe a small subset, written from an external perspective rather than as the business ‘owner’), and SEO them as individual assets.

It’s really nothing more complicated than having multiple online SEO assets for each of the keywords you want to rank for; and they can cross over and overlap. So you could have 20 mini-sites whose deep pages overlap. Each one is targeted at a slightly different area (from the site/home-page and ‘theme’ perspective), and each looks different and has its own content – a unique site in its own right, albeit a small one. But their inner pages target similar keywords across them. So maybe 5 of them (from different sites) target the same long-tail, and another 5 (from another 5 different sites) target another long-tail. Then, over 20 sites, you have 5+ pages that you’re SEO’ing for each term.

In other words: keyword redundancy and asset protection.

In time you’ll find yourself ranking with multiple slots on pages 1-3; meaning you get a bigger bite of the available traffic as well. The ideal situation is to have several listings on page 1 and the same on page 2. And maybe a few lurking around pages 3-5. Now THAT’s redundancy!

And the sites should be mixed up and individual; not just ‘cloned’ sites targeting the same keyword list. They all need to look and feel like individual professional sites. And it’s just a matter of ‘coincidence’, from Google’s perspective, that some of these sites have pages that target and seek to rank for some of the same keywords. And of course they should be across different hosts, IP’s and registrars, so that as much as possible they look like unconnected sites. (You should probably also avoid Google Analytics, AdSense, Webmaster Tools etc., so as not to make it obvious!) Basically, you end up competing with yourself, rather than with other marketers.

When someone clicks for a quote/more information/contact/booking/sale etc. it can take them to either your main site (possibly risky for obvious reasons), or a private page on a private server which is not indexed by Google – to keep it all separate and unconnected. These private servers are fully Google and SE ‘blocked’; and they have hundreds of landing pages that match the theme and style of the site they come from. It’s only the URL that changes – and it’s a ‘general’ URL that just looks like a corporate data/payment-server – not a keyword-type URL – something like ‘GRZAnalytics.com’. (The home-page, if ever visited, would just be a simple login screen.) Most people won’t even notice that they’ve jumped servers/sites because the theme stays the same. This is where you’ve got to start being clever to maintain a consistent user experience.
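If you go down this route, it’s worth verifying that the private server really is invisible to crawlers. Below is a minimal Python sketch of such a check – the domain grzanalytics.example is hypothetical, standing in for your own private server – using the standard library’s robots.txt parser; the comments show the robots.txt rules and meta tag you’d want in place.

```python
# Minimal sketch: confirm a private landing-page server blocks all crawlers.
# "grzanalytics.example" is a hypothetical domain standing in for your own.
import urllib.robotparser

# The private server's robots.txt should deny everything:
#
#     User-agent: *
#     Disallow: /
#
# Belt and braces: each landing page should also carry
# <meta name="robots" content="noindex, nofollow"> in its <head>.

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://grzanalytics.example/robots.txt")
rp.read()  # fetch and parse the live robots.txt

for path in ("/", "/landing/blue-widgets", "/landing/checkout"):
    url = "https://grzanalytics.example" + path
    if rp.can_fetch("Googlebot", url):
        print(f"{url}: CRAWLABLE - fix your robots.txt!")
    else:
        print(f"{url}: blocked, OK")
```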

Remember: large corporates do this all the time. You’re frequently and ‘unknowingly’ bounced through several payment/data-collection/tracking servers – many of them on secure, non-indexed HTTPS sites; Google expects this. There are lots of things you can do here if you get creative and put in some thought.

We never said it would be easy or without effort. In fact, as with all successful business models, it will be a lot of work. But that’s where the people who do this win: they could be bothered to build their online business in the same way a successful offline business would – by expanding their marketing estate and sales funnel. Please see our other related post: A Wake-up Call For SEO.

Dealing With Penalisation & Ranking Drops

You first need to ensure you understand the difference between de-indexing, penalisation and other ranking drops.

De-indexing usually only happens where a site is doing massive outbound linking (link farms), or where there is a combination of negative factors: very low-quality content and layout, linking to and from ‘bad/spammy neighbourhoods’, bad advertising/legal practices, and so on.

If you enter ‘site:xxx.com’ into Google (replacing xxx.com with your site name – and don’t leave a space after the site: command) and you get nothing back… you’re either not indexed yet (a new site) or your entire site has been de-indexed by the Google team. This is severe – and getting it back will be a long uphill struggle, usually involving large-scale changes to your site. (It’s often not worth it – so you start again with another domain and learn a lesson.)

Note: See the Google ‘Search Operators’ Guide for more useful Google commands.

Penalisation normally happens from over-optimisation of keyword terms, and usually only affects those specific search terms. For example: you’ve built hundreds or thousands of links to a new site, all with the same exact anchor text, which is also in the site URL (an EMD/PMD – or Exact/Phrase Match Domain), in the Page Title, in the H1 tags, and loaded into the content too. This is the classic newbie mistake in online marketing – thinking that ‘more’ is better – and it’s one of the catalysts for Google’s Panda 3.3 update: over-optimised content and anchors (they work as a pair).

You will often see that other pages on that domain are either unaffected, or are even ranking better, due to an increase in overall domain links. Sometimes that same page will also move up the rankings for another long-tail keyword, indicating that you’ve been penalised for a specific set of keywords; usually from over-optimisation.

Google can apply a number of automatic or manual penalties (and it’s anyone’s guess how this really works behind the scenes), but an automatic penalty will usually be seen as a ‘fixed’ and fairly stable drop of X positions. If you suddenly drop 5, 40 or 50 positions and stay there, then you MAY have an automatic penalty. Manual penalties often come with a notice/email from Google.

Automatic penalties can often be removed fairly quickly (sometimes days to weeks, but it can also take months) by fixing the issue. The problem is that you’ll rarely have any clear indicator of what the issue is!

Running Market Samurai (or similar) link reports to analyse the anchor texts and spread of inbound linking will often tell the story. If you see a massive predominance of home-page or exact-match keyword anchor linking, then it’s highly likely that you need to de-optimise. You can build more links over time with a wider array of anchors, and build them deep into your site to spread the link profile.
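If you’d rather eyeball the numbers yourself, here’s a rough Python sketch of the kind of tally these reports give you. It assumes you’ve exported your inbound links to a file called links.csv with anchor_text and target_url columns – real exports vary, so adjust the column names to match your tool.

```python
# Rough sketch: tally anchor-text spread and home-page bias from an
# exported link report. Column names are assumptions; adjust to match
# whatever your link tool actually exports.
import csv
from collections import Counter

anchors = Counter()
targets = Counter()

with open("links.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor_text"].strip().lower()] += 1
        targets[row["target_url"].strip().lower()] += 1

total = sum(anchors.values())
print("Top anchors (one anchor dominating is a red flag):")
for anchor, count in anchors.most_common(10):
    print(f"  {count / total:6.1%}  {anchor}")

# Crude heuristic: "https://example.com/" stripped of its trailing slash
# has only the two slashes from "https://", so it counts as a home page.
home = sum(n for url, n in targets.items() if url.rstrip("/").count("/") <= 2)
print(f"\nLinks pointing at the home page: {home / total:.1%} of {total}")
```

If one exact-match anchor accounts for the bulk of the profile, or nearly everything points at the home page, you’ve found your de-optimisation target.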

You can also make minor modifications to your actual pages so they don’t look so optimised for those keywords. (Page Titles could have your company name put in front, for example, to move your keywords away from the first word, and they could be extended to a full 7-8 words to further ‘dilute’ the phrase.)
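As a trivial illustration of that dilution (the company name and keyword here are invented for the example):

```python
# Trivial illustration of 'diluting' an over-optimised page title:
# push the exact keyword away from the first word and pad the phrase
# out to 7-8+ words. Names are invented for the example.
def dilute_title(keyword: str, company: str) -> str:
    return f"{company} | Quality and {keyword} for Every Budget"

# Before: "Cheap Blue Widgets"
print(dilute_title("Cheap Blue Widgets", "Acme Ltd"))
# After:  "Acme Ltd | Quality and Cheap Blue Widgets for Every Budget"
```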

The most important factor is not to get ‘freaked out’ about a ranking drop. It may just be a few days or weeks of ranking ‘bounce’, and if your site is less than 6 months old, then this is very likely. If your site is young and you’ve only had the ranking for a short time, this may just have been a ‘honeymoon’ period allowed by Google to quickly rank news-worthy items. Then you’ll plummet and slowly start to work/bounce your way up the rankings over time.

Also, consider that you may have dropped for a number of perfectly ‘legitimate’ reasons:

  • Maybe your competitors have knocked their SEO up a gear and pushed you down.
  • Maybe there are new rankings that have come in above you.
  • Maybe you’ve changed something on-page (perhaps inadvertently) which has caused that page to drop.
  • Maybe you just lost some (or a lot of) important inbound ranking links. This one can catch a lot of people out, and looks like you’ve been penalised; but you’ve simply lost many of the links that were giving you your ranking.
  • Or maybe your content isn’t being seen as fresh or useful, and your site has become stagnant. (We mean this purely from Google’s algorithmic standpoint; don’t even get us started on the stupidity of this requirement. Why on earth should every business have to constantly update its site and supply completely superfluous content, just to create ‘freshness’ and ‘social buzz’?)

…But you see what we mean. There really is no one-size-fits-all solution. You’ll have to work at it and analyse whatever information you can get hold of. That, or hire a professional SEO to do it for you… but most people will obviously choose to do this themselves.

Dealing With Google Notices

If you get one of these, understand that Google sent out some 700,000 of them during the first quarter of 2012 – you’re not alone! The first step: take a deep breath and start collecting data. Try to get some specifics out of Google (not easy, unfortunately).

You’ll normally receive a completely arbitrary ‘blanket’ reason like ‘unnatural linking practices’ – as though that tells you anything, except to point the finger even more squarely at Google for having the audacity to claim they can determine what ‘natural’ linking practices even are!

You’ll need to look at reports, as detailed above, and try to locate what the issue(s) could be. This will take time and effort; and if the site is just one of a whole slew of assets (as we suggested above), then it may be easier to just walk away and dump the site/pages in question.

Google will sometimes ask you to ‘remove’ external links that have been built, as though this were an easy thing to do (and they’re quite aware that it isn’t; it’s just their way of tying you up in another ‘quest’ to distract you from pointing the finger at them). The reality is that the vast majority of links you ever build for ranking purposes will be on sites that you have absolutely no control over. We’re in the same boat. When we build links, we can’t just go back and ask the site owner to remove the pages/links. 99% of the time, you either won’t be able to locate the links, or there won’t be anyone to contact. And even if you could locate and contact the sites, the likelihood of them caring enough to make the changes is ‘minimal’, to say the least.

And don’t ever use an SEO/link-building tool, product, network, company or service as an excuse to Google. You may as well admit that the reason you were speeding is that you were drinking…

“Sorry occifer… hic!…I was only doing 130 mph ’cause I wanted to get home quickly… on account of my inebriation…(Big smile)”

…This will only exacerbate the situation, and they’ll have even less time for you, because now they can have no illusions about your ‘innocence’.

The Google ‘drones’ are indoctrinated with the shiny Google philosophy of how socially responsible they are – and how they’re fighting the good fight against a world of spammers. None of these service representatives will ever have attempted to run their own business, or build something for themselves, so don’t expect them to have any sympathy with you whatsoever. Most of them seem to see it as a holy crusade, in our experience. You just can’t argue against it; they’ll adopt a moral superiority that will take your breath away sometimes. Your best option is to be humble, sincere, and plead ignorance – and then hope for a break from someone in a good mood.

Conclusions

The bottom line is, when you decide to use SEO, you take on the responsibility of using that technique. You’re effectively saying that you’re attempting a ‘short-cut’ to rankings. In Google’s eyes, you may as well have just signed your own execution order. It’s black and white to them.

In business, nothing is without risk. Whether that risk is pushed on us ‘fairly’ or otherwise is irrelevant. We have to deal with the market conditions as they are, or else move on and do something else.

SEO is one of those risks: It can provide great rewards, but it can also cause us problems and heartache.

The key is to mitigate/dilute these risks through diversification and scale.
