Panda 3.3 update - some thoughts from our SEO team

Just over a year ago, Google launched its much-talked-about Panda algorithm update. Panda was initially designed to filter out websites with ‘thin’ content that added little to the search experience. Over the course of the last year, there have been various tweaks to the Panda system, culminating in late February with the roll out of version 3.3. In a post on its Inside Search blog, Google announced that over 40 changes to search quality have now been implemented.

Among the many updates, a few have elicited strong reactions from the SEO industry. The first is the unnamed link evaluation method that Google has 'turned off' (in our opinion, unlikely to be any of the major signals such as 'NoFollow' or anchor text). The second is the 'Venice' update, which 'improves the triggering of Local Universal results by relying more on the ranking of our main search results as a signal'. The principal lesson from this is to ensure that Places pages have a good level of content and as many 'citations' as possible.

However, there is one section of the update that could have the greatest impact on Google's SERPs, namely:

"Improved local results. We launched a new system to find results from a user's city more reliably. Now we're better able to detect when both queries and documents are local to the user."

What this means is that for certain keywords (it is unclear how these are selected), organic results will change depending on the location you are searching from. For example, search for 'coffee shops' from Glasgow and four local listings are pushed into the top 10.

If the same search is performed from Manchester, three local results from that location are included instead.

This update potentially allows smaller businesses with a local 'bricks and mortar' presence to gain a foothold in the SERPs where they would not otherwise have featured. However, there is a strong argument that, in many cases, the results returned are not of the best quality and are not what users expect to see for their queries. For example, when someone searches for 'SEO', should it be assumed that they are looking for a localised service, or are they simply looking for authoritative information on the topic itself?

Whatever the case, for businesses of all sizes it is now crucial, in order to maximise organic presence, to include as much detailed content as possible on their websites about their physical business locations.
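
As a rough way of sanity-checking this, the short sketch below (our own illustration, not part of the Google update; the URLs and location names are placeholders) fetches a handful of pages and flags any that never mention the physical locations a business serves. It is no substitute for proper content planning, but it is a quick way to spot pages that are thin on location detail.

```python
import re
import urllib.request

# Purely illustrative: swap in your own pages and the locations your business serves.
PAGES = [
    "http://www.example.com/",
    "http://www.example.com/contact/",
]
LOCATIONS = ["Glasgow", "Manchester"]


def page_text(url):
    """Fetch a page and crudely strip tags to leave the visible text."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)  # drop script/style blocks
    return re.sub(r"(?s)<[^>]+>", " ", html)                   # drop remaining tags


for url in PAGES:
    text = page_text(url).lower()
    missing = [loc for loc in LOCATIONS if loc.lower() not in text]
    status = "OK" if not missing else "missing: " + ", ".join(missing)
    print(f"{url} -> {status}")
```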

We have to assume that Google constantly evaluates all of these updates, and that if the results are felt not to be working well for users, action will be taken to address them.