Has the Penguin 4.0 update had an effect?

Our blog post on Google’s shiny new Penguin 4.0 update provided an overview of what SEOs can expect from this latest Google release. Now that Penguin 4.0 has been out in the wild for a couple of months, we take a look at how you can tell whether your domain has been affected by this cute aquatic bird of a search engine quality update.

No news is good news…

In the past, Google updates have affected website rankings to varying degrees. There have been a few instances where the SERPs were a markedly different place the day after implementation – not to mention the odd time when SEOs would be feeling a mixture of fear, uncertainty and doubt in the wake of an update.

The updates that make the ground shake are, however, the exception rather than the rule. Despite some hyperbolic-sounding names, the vast majority of updates a) aren’t intended to pull the rug from under us, and b) generally only affect a relatively small number of sites.

A case in point was Mobilegeddon. With a name like that you would think that we’d been heading for a SERPs sea-change, a whole new world. In the event, it was pretty much business as usual.

And in late September 2016 the Penguin – now a four-year-old Penguin – waddled towards us once again. Did anyone feel the wrath? So far, it seems not. Only a very small percentage of sites reported adverse effects, with the vast majority reporting nothing at all. Tumbleweed in the Antarctic? A lot of the Penguin 4.0 conversation had that kind of feel.


Page-level Penguin – the lowdown

One reason for the low volume of noise around the latest update is that the new Penguin operates at page-level rather than domain-level. Essentially what this means is that – much like an Antarctic iceberg – there’s more beneath the surface than we may realise.

And with an update that’s working at page level, it may not immediately be apparent that anything’s wrong. But it’s important to be granular. Only by looking at specifics will it be possible to ensure that you’ve managed to glide unscathed through the latest (ahem) Antarctic episode.

So … what do you do? It’s time to start investigating.

Exploration time!

Not polar exploration though, sadly.

Oftentimes your rankings will go up and down within a short space of time, and this can happen for plenty of reasons. Maybe the algorithm just got sharpened. Maybe a test of some kind was run and some of your rankings felt a bit of a bump. Maybe you just dropped some freshly crafted, compelling content, mixed it with some cool UX improvements, and the search gods took a shine to you. There is always a reason why stuff happens in the SERPs.

The trick – as all SEOs know – is to be able to spot a trend. The sooner you spot one, the sooner you can act to mitigate adverse effects or support positive ones.

The first thing we need to do is gather up all the rankings for your domain. You track all of your valuable keywords, right? And you look at them regularly? Of course you do. This is what SEOs do. Compulsively. Religiously. Or at the very least, habitually.

Now take a look at your keyword pool grouped into its constituent areas. These will depend on what content your domain provides, but usually it will be groupings such as the following:

  • Product type e.g. ‘dresses’, ‘jackets’, ‘jeans’, ‘shirts’
  • Service type e.g. ‘flight to schiphol’, ‘holidays in inverness’
  • Subject area e.g. ‘cardiovascular exercise’, ‘preparing for a 10k’

Once you’ve done this, look at the average ranking for each of the keyword groups you’ve chunked out. See any drops in these averages? It’s now time to drill a little deeper.
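If your rank-tracking tool can export its data, this chunking-and-averaging step is easy to make repeatable with a short script. Below is a minimal sketch in Python using pandas, assuming a CSV export with keyword, group, week and position columns – the file name and column names are illustrative, not those of any particular tool:

```python
import pandas as pd

# Assumed export: one row per keyword per week, with the keyword
# group already assigned. Column names are illustrative.
ranks = pd.read_csv("rankings.csv")  # columns: keyword, group, week, position

# Average position per keyword group per week (lower = better).
weekly = ranks.groupby(["group", "week"])["position"].mean().unstack("week")

# Compare the latest week's average with the earliest one and flag
# any group whose average has slipped by more than three places.
first, last = weekly.columns[0], weekly.columns[-1]
change = weekly[last] - weekly[first]
print(change[change > 3].sort_values(ascending=False))
```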

In the example below, three keyword groups are performing well – with their averages climbing over time (along with some ups and downs on the way).

One keyword group has seen its average drop by eight places over the ten weeks of averages we’re looking at, with 60% of that drop happening in the past two weeks. Since these are keyword groups, individual terms within the group may have gone up or down in this time – but we can see here that, as a whole, our jackets group looks like it’s taking a bit of a buffeting.

[Image: average rankings over ten weeks for four keyword groups, with the presumed update date marked by a blue vertical line]

The next thing we need to do is fire up Google Analytics and see how individual URLs within that group have fared since the presumed date when the update dropped (illustrated by a blue vertical line in the above graph).

In the keyword group above, we haven’t just been tracking how we rank for a particular set of keywords – we’ve also been tracking which URLs rank for which terms.

GA lets us look at traffic to the domain as a whole (unfiltered), then drill down further through various filtered views – for example, excluding all traffic from IPs associated with the domain, such as head office workers who look at the site daily but aren’t prospective customers. Looking at traffic from organic search, we can then single out specific landing pages. A look at each of the landing pages that rank within the jackets keyword group above should give us a better picture of how we’ve been affected.

In this case, organic traffic to three out of the five URLs associated with the jackets keyword group has dropped by a significant percentage. And it just so happens that these URLs are also the ones that are seeing a drop in rankings. 
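If you export the organic landing-page data from GA as a CSV (daily organic sessions per landing page), a few lines of pandas will show which URLs dropped after the presumed update date. A rough sketch – the file name and column names are assumptions, not GA’s exact export labels:

```python
import pandas as pd

# Assumed export: one row per landing page per day, organic sessions only.
ga = pd.read_csv("organic_landing_pages.csv", parse_dates=["date"])
# columns: date, landing_page, sessions

# Presumed rollout date (Penguin 4.0 was announced on 23 September 2016).
update_date = pd.Timestamp("2016-09-23")

before = ga[ga["date"] < update_date].groupby("landing_page")["sessions"].mean()
after = ga[ga["date"] >= update_date].groupby("landing_page")["sessions"].mean()

# Percentage change in average daily organic sessions per URL,
# biggest losers first.
change = ((after - before) / before * 100).round(1)
print(change.sort_values().head(10))
```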

So, what happened?

The drop in rankings + the drop in traffic = Penguin 4.0 effect? There is no way of being 100% sure of this, but let’s look at the facts in context:

Rankings are down for a specific area of the site – and not even all of that area, just a subset of URLs. Meanwhile, traffic is down for those same pages. OK, so sometimes the SERPs are more volatile than at other times (leading to varying levels of traffic), but what we can see here is a trend. Our URL subset is behaving differently from the rest of the URLs in its section, and from the rest of the domain. So it’s a fair assumption that Penguin has seen something it doesn’t like and has acted accordingly.

What do I do to solve the problem?

Get all your backlinks in a list

First things first. Get a list of ALL the links that point to these pages. To be sure you’re getting as accurate a profile of your links as possible, don’t just stick to one tried and trusted backlink research tool. Use a couple of them, and ensure that you have as full a list as possible.
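Exactly how you merge the exports will depend on the tools you use, but the principle is simple: combine, normalise and de-duplicate. A rough pandas sketch, assuming each tool gives you a CSV with a linking-page URL and a target URL – file names, column names and the /jackets path are all illustrative:

```python
import pandas as pd

# Exports from two different backlink tools, each reduced to a common
# pair of columns: source_url (the linking page) and target_url
# (the page on our domain being linked to). Names are illustrative.
tool_a = pd.read_csv("tool_a_backlinks.csv")[["source_url", "target_url"]]
tool_b = pd.read_csv("tool_b_backlinks.csv")[["source_url", "target_url"]]

# Combine, lightly normalise (lower-case, strip trailing slashes)
# and drop duplicates so each link is only counted once.
links = pd.concat([tool_a, tool_b], ignore_index=True)
for col in ("source_url", "target_url"):
    links[col] = links[col].str.lower().str.rstrip("/")
links = links.drop_duplicates()

# Keep only the links pointing at the affected jackets URLs.
affected = links[links["target_url"].str.contains("/jackets")]
print(f"{len(affected)} unique links point at the affected pages")
```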

Scrutinise your backlinks

Depending on how many backlinks you have, you may be able to eyeball them manually. Does anything keep cropping up? Can you see a pattern emerging?

Remember that the algorithm isn’t just looking for links on horrible spammy sites with migraine-inducing graphics and a horrible font. It’s also looking for anything unnatural – so the following things should probably ring alarm bells (a rough sketch of how you might script a couple of these checks follows the list):

  • A large number of links all coming from the same type of site (e.g. budget fashion blogs)
  • Unnatural link velocity (the rate at which sites accrue backlinks), regardless of the linking domain’s authority
  • Anchor texts that people IRL would rarely use, e.g. “buy cheap jackets” or “cheap jackets online” popping up with too much frequency to be genuine
  • Zero value/toxic links. You’ll know these if/when you see them. If it looks low value, it usually is. Check out the domain authority of the linking site, whether the linking site actually ranks for any terms, and the linking site’s presumed monthly traffic. Most times your initial hunch is bang on.
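As promised, here is a rough Python sketch of two of those checks – anchor-text frequency and link velocity – run over the merged backlink list. The column names (anchor_text, first_seen) and the 5% threshold are assumptions for illustration, not figures from Google:

```python
import csv
from collections import Counter
from datetime import datetime

# Assumed CSV of backlinks with anchor text and a first-seen date;
# column names are illustrative, not any specific tool's export.
with open("backlinks.csv", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

# 1. Anchor texts that appear suspiciously often.
anchors = Counter(row["anchor_text"].strip().lower() for row in rows)
for anchor, count in anchors.most_common(10):
    share = 100 * count / len(rows)
    flag = "  <-- looks unnatural?" if share > 5 else ""
    print(f"{anchor!r}: {count} links ({share:.1f}%){flag}")

# 2. Link velocity: how many new links were first seen in each month.
months = Counter(
    datetime.strptime(row["first_seen"], "%Y-%m-%d").strftime("%Y-%m")
    for row in rows
)
for month in sorted(months):
    print(month, "#" * months[month])
```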

And if you have too many backlinks to analyse manually? Don’t panic. If you hook up your backlink research tool to data intelligence software such as Kerboo, you can get a picture of how your backlink estate is affected. In the case below, we have a domain with 22.26% of backlinks falling into the ‘bad’ category. While that is still a not inconsiderable amount, the categorisation lets us concentrate our manual analysis on the problematic links – and cuts down the time it would take to go over every single one (the majority of which are benign anyway).

[Image: Kerboo link profile breakdown, with 22.26% of the domain’s backlinks falling into the ‘bad’ category]

Neutralising harmful links

The next step is to make a list of all the backlinks that appear to be harming the domain. In the example outlined above, we found that the jackets section of a site had seen a drop in rankings. When we go through the bad links the domain has accumulated, a number – surprise, surprise – are pointing to the jackets URLs we’ve seen decreased traffic and rankings for.

What we need to do now is neutralise the effect of the bad links. In some cases getting them removed may be as easy as firing off an email to the webmaster of the site where the offending link resides. This can be a very hit-or-miss affair, and the responses tend to fall into three very distinct categories:

  • “Sure, I’ll remove the link immediately…”
  • “Sure, I’ll remove the link immediately. For a fee.”
  • Silence

Luckily, there’s an easier way. Step forward the Disavow Tool. Launched in late 2012, it has become one of the essential tools in any SEO toolkit – and it’s difficult to believe now that there was a time when it didn’t even exist! What the Disavow Tool does is signal to Google that the domain owner would like a specified set of links to be effectively ignored – in much the same way as if they were nofollow links.

Below is an example of a valid disavow file. You can include comment lines to let Google know that you have previously attempted to get a link (or links) removed, and you can disavow an entire domain by prefixing the entry with “domain:”.

[Image: example of a valid disavow file]
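In plain text, a disavow file of that shape might look something like the following – the domains and dates here are purely illustrative:

```text
# Contacted the webmaster of spammyblog.example on 01/11/2016
# asking for these links to be removed; no response received.
http://spammyblog.example/cheap-jackets-roundup/
http://spammyblog.example/more-cheap-jackets/

# Disavow every link from this domain.
domain:linkfarm.example
```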

An important note on the Disavow Tool, though: use it with caution. As Google’s page on backlink disavowal clearly states, “If used incorrectly, this feature can potentially harm your site’s performance in Google’s search results”.

In other words, if your site (or pages within it) is underperforming in the SERPs or has recently taken a tumble, get under the bonnet first and see what’s happening. Don’t assume the worst and go nuclear with the Disavow Tool – all it can do is mitigate toxicity in your backlinks. What it cannot do is make up for things that may be deficient onsite:

  • Thin content
  • Duplicate content
  • Structural issues
  • Unnecessary redirects
  • Lack of relevant Schema markup
  • Etc.

The Brave New Real-Time AI-driven Auto-updating World

So there we have it. Google’s Penguin now walks among us – and it was announced in mid-October that the update had fully rolled out. Now that it’s real-time, it can see where things are starting to look out of whack and take any necessary action immediately. All of which means that hygienic SEO is more important than ever – because any straying from best practice will likely have an adverse effect sooner rather than later.

With Penguin 4.0 what we are seeing (at least arguably) is another step towards the era where the algorithm – Google’s beating heart – makes its own decisions, evolving beyond rules defined and developed by humans. Or as Wired puts it: deep learning has arrived on Google Search.

Penguin 4.0 may only be a small step in that direction – but the implications are obvious: attempts at gaming the system will soon mean fighting neural networks as well as rules set by humans. So stay within the rules, be creative, produce content with passion, wit, insight and user focus. Oh, and monitor your backlinks…

By Ian McCartney, SEO Manager. 