Google’s Penguin update has sent the SEO industry into something of an ironic flap.

Almost a fortnight after Penguin went live, webmasters are still moaning and complaining that Google got it wrong.

And in some cases, the detractors have a point.

Penguin was designed to destroy the dark arts of SEO. Black hat tactics – such as keyword stuffing, or paying for spammy inbound links – have been utilised by cheeky webmasters attempting to manipulate the rankings. And Google’s had enough.

Penguin aimed to torpedo sites which used keyword stuffing to trick spiders into thinking their content was relevant, whilst also blacklisting sites which had paid for dodgy incoming links in an attempt to falsify credibility.

But Penguin clearly hasn’t worked exactly as Google planned.

Search industry forums are reporting examples of apparently ‘white hat’, honest websites being downranked, alongside horrific, spammy ‘black hat’ sites suddenly appearing on the front page of Google for completely unrelated search terms (see below).

The thing is, though, Penguin hasn’t altered much in reality. It’s certainly not a game changer.

Google has always had good practice guidelines. But it hasn’t always had a way to police whether sites are adhering to those guidelines.

Penguin, therefore, is essentially a Google search copper, plodding the everlasting beat that is the results pages, looking for traces of the notorious Internet crimelords Webspam and Spammy Links, and attempting to bring the perpetrators of online offences to justice.

Now, the problem. It’s a bit like a search version of RoboCop – Penguin appears to have been appointed judge, jury and executioner by Google.

And that, it seems, is where the problem lies. Penguin is programmed – it has been coded to look for telltale signs of black hat SEO. It’s not a human, and it’s not capable of rational thought (unless Google’s keeping something from us).

As such, Penguin was always going to be prone to mistakes – especially after first launching. That in itself should account for the ‘funny behaviour’ reported by webmasters immediately after Penguin went live. Some went as far as to claim Penguin “broke Google” – others petitioned for the update to be reversed.

Google realises that placing arbitrary decision-making into the hands of a dumb robot isn’t going to reap foolproof results and is prepared to reinstate accidentally downranked websites.

Webmasters who feel unfairly punished by Google can flag up their complaint. Those meeting Google’s good practice guidelines should be reinstated as a result. Conversely, you can also report instances where spammy, rotten sites are returning high in the results when they really shouldn’t be.

This level of teething problems and fall-out clearly wasn’t part of Google’s intention with the Penguin update.

Google wanted Penguin to go some way towards levelling the playing field for search engine optimisation. Some feel it’s an attempt to push people towards paid ad-based Internet marketing.

It’s clear Google still has some work to do, either way.

And until then, those genuine sites which have lost rankings – and business as a result – are going to have to weather the storm.

  • Have the Rules for White Hat SEO Changed After Penguin?
    A Sideways View On The News, by Ali Harris, content manager for ClickThrough Marketing

Penguin wasn’t a game-changer – it was simply a means to enforce the ‘rules’ already laid out by Google.

Those who had got away with breaking, or just bending, the rules for some time have now been penalised. That may have meant some previously top-ranking sites suddenly plummeting.

Anyone adhering to Google’s best practice guidelines, on the whole, will have avoided a hammering from Penguin.

The cases where legitimate sites got downranked are few and far between, and Google has set up the right channels to rectify this. If Google refuses to reinstate a site, chances are there’s spam, dodgy links or some other ‘black hat’ problem lurking somewhere.

That said, respondents on a handful of search forums have provided examples of spammy, rubbish websites which are now appearing on page one of Google.

One great example is to search “Paypal France”. The first page of results for this search returns no fewer than three websites selling Viagra.

Not only are these sites totally unrelated to the search term “Paypal France” – they’re also stuffed with keywords.

In terms of content, it’s nasty. Really nasty.

In fact, these search returns are exactly what Google engineer Matt Cutts said Penguin would wipe out.

Yet, it hasn’t.

Even the page description, displayed directly under the website URL, shows how badly stuffed some of these pages are.

One reads: “During relative of pele’s observing sugar in brazil there was no rheumatoid film viagra paypal france.”

It’d be hard for anyone to argue that this search result:
i) makes any sense
ii) is of any use to anyone, ever
iii) is not blatant, keyword-stuffed spam
iv) should be on page one of Google for any search term other than “examples of ridiculous spammy content”

It’ll be interesting to see how these anomalies iron out in the coming weeks, and whether Google refreshes Penguin so it looks a lot more closely at the factors which might separate a genuine site from a fraud.

If they do, may I suggest this level of closer inspection should henceforth be known as ‘observing sugar in Brazil’?

  • Why call the Google algorithm update Penguin?

The last big algorithm change from Google was called Panda. This one was Penguin.

Speculation is rife that Google is following a pattern with its search engine updates.

The obvious bits are: animals (cute ones at that), which begin with a ‘p’, and are black and white.

Guesses for the next update’s name include Panther and Pigeon (the latter derived from vowel progression: pAnda, pEnguin, pIgeon etc…).

On the black and white theme, some have posited that Google is separating ‘white hat’ tactics from ‘black hat’ tactics via the use of bestial metaphor.

I have my own theory behind the name, which takes us back to the ironic flapping of the SEO industry.

Take Google’s ‘average user’ – someone with little knowledge of anything. Google plays to the lowest common denominator.

If you don’t know anything about quantum theory, and you Google it, you’d want something reputable, trustworthy and reliable to return on the front page of the results. The same goes for any search term.

With webspam sites, you might get a top search result which says “Quantum Theory” on the page name, and includes the phrase in the description too. But on closer inspection, a bunch of other, unrelated words are in the description. This is known as keyword stuffing.
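To make the idea concrete, here’s a toy sketch of the sort of signal a spam filter might compute – a naive keyword-density check. This is purely illustrative and assumed for the example; Google’s actual ranking signals are undisclosed and far more sophisticated.

```python
# Naive keyword-density check: a toy illustration of one possible
# keyword-stuffing signal. NOT Google's actual algorithm, which is
# undisclosed and considers many more factors.

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text: str, keyword: str, threshold: float = 0.1) -> bool:
    """Flag pages where a single keyword exceeds the density threshold."""
    return keyword_density(text, keyword) > threshold

honest = "Quantum theory describes the behaviour of matter and energy at atomic scales."
stuffed = "quantum theory quantum theory buy quantum theory cheap quantum theory deals"

print(looks_stuffed(honest, "quantum"))   # False
print(looks_stuffed(stuffed, "quantum"))  # True
```

A real filter would weigh many signals together, precisely because a single crude threshold like this can misfire – which is the same trade-off behind Penguin’s false positives.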

Click the link, and you won’t find a repository of sparkling information about relativity, worm holes, or physics. No. You’ll probably get a bunch of bad links, nonsense sentences, and the odd advert for a miracle diet instead.

In search terms, that result is useless.

Now imagine you’re looking for a bird. You’d expect a bird to fly, right?

Only, penguins can’t fly.

So perhaps Penguin was designed to root out sites which seem genuine, which look like they are fit for purpose, but, on closer inspection, are actually technically useless. Like a penguin’s wings.

Or maybe I’ve overcomplicated it.

News brought to you by ClickThrough – experts in SEO, Pay Per Click Services, Multilingual Search Marketing and Website Conversion Enhancement services.


About the author:

ClickThrough is a digital marketing agency, providing search engine optimisation, pay per click management, conversion optimisation, web development and content marketing services.