We’re always keeping a close eye on the world of SEO. Here are a few big stories our team picked out over the last week or so:

Google Updates Penguin for the First Time in Over a Year

On Sunday October 19, Google confirmed to Search Engine Land that it had begun rolling out a new version of its Penguin algorithm the previous Friday (October 17).

[Image: penguins. Lots of penguins. Source: son_gismo at Flickr.]

Penguin is designed to target ‘spammy’ sites, especially those seen as violating the linking best practices laid out in Google’s Webmaster Guidelines. The update has been dubbed ‘3.0’ by the SEO community, though that name doesn’t necessarily match Google’s internal release numbering.

On Monday morning, Google provided more information. The update would affect less than one per cent of search queries (far fewer than some previous updates, which have hit up to 3.1 per cent of queries). Google also said the update would have a “slow worldwide rollout”, which could take “a few weeks”.

Moz Releases Local Ranking Factors Study; Nothing Much Changes

OK, this one’s from quite a while back in ‘news’ terms, but we thought it was worth mentioning anyway.

On Monday October 13, Moz announced the release of its annual study into local ranking factors. The results? Well, nothing much has changed since last year.

"Things can change"

But in this case, they haven’t. Source: Marie-Chantale Turgeon at Flickr.

The post’s author, David Mihm, summarised the key takeaways, the most interesting of which I’ve paraphrased here:

  1. Behavioural signals such as click-through rate appear to be increasing in importance.
  2. Domain authority also seems to be increasing in value as a signal.
  3. Google seems to be getting better at detecting users’ locations, and delivering relevant results.

Moz’s Open Site Explorer Gets an Update; Includes New ‘Link Opportunities’ Feature

In more Moz-related news, Rand Fishkin posted on Wednesday October 15 to announce some updates to Moz’s uber-useful Open Site Explorer tool.

As well as getting a visual refresh, the tool has also gained a brand-new feature called ‘Link Opportunities’. This feature is designed to remove some of the manual grind from link building.

Using the tool, you can now easily look for unlinked mentions of your brand and find pages that are linking to your competitors, but not to you. You can also export a list of lost links, ranked in order of importance.
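If you’re curious what the ‘unlinked mention’ idea actually boils down to, here’s a minimal sketch in Python. To be clear, this isn’t Moz’s implementation, just an illustration of the concept; the page URL, brand name, and domain below are hypothetical placeholders.

```python
# A minimal sketch of the "unlinked mention" idea behind Link Opportunities.
# Not Moz's implementation -- just an illustration of the concept.
import requests
from bs4 import BeautifulSoup

def has_unlinked_mention(page_url, brand_name, your_domain):
    """Return True if the page mentions the brand but never links to it."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Does the visible text mention the brand at all?
    mentioned = brand_name.lower() in soup.get_text().lower()

    # Does any anchor on the page already point at your domain?
    linked = any(your_domain in (a.get("href") or "")
                 for a in soup.find_all("a"))

    return mentioned and not linked

if __name__ == "__main__":
    url = "https://example.com/some-article"  # hypothetical page
    if has_unlinked_mention(url, "ClickThrough", "clickthrough.co.uk"):
        print(f"Outreach opportunity: {url} mentions the brand but doesn't link.")
```

The feature, of course, does this at the scale of Moz’s own link index rather than one page at a time, which is what makes it a time-saver.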

Google Explains Best Practices For XML Sitemaps and RSS/Atom Feeds, Recommends Using Both

A week ago today, Google published a Webmaster Central Blog post detailing best practices for RSS/Atom Feeds and XML Sitemaps.

[Image: detail of a London tube map. Source: Annie Mole at Flickr.]

The post describes the differences between the two types of sitemap, explains which fields are most important in each, and shows how best to optimise your sitemaps for Google.

The biggest takeaway, though, is Google’s recommendation to use both – an XML sitemap and an Atom or RSS feed. As Google’s John Mueller summed up when he shared the post on Google+:

“Here’s a blog post that covers the different aspects [of sitemaps], and the important fields to specify. In short: why not use both?”
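To make that concrete, here’s a minimal, hypothetical sketch of each format; the URLs and timestamps are placeholders, not taken from Google’s post. The sitemap lists your URLs with their last-modified times, while the feed covers just the most recently updated pages so new content gets picked up quickly.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- XML sitemap (one file): lists site URLs with last-modified times -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/penguin-update/</loc>
    <lastmod>2014-10-20T10:30:00+01:00</lastmod>
  </url>
</urlset>

<!-- Atom feed (a separate file): covers only recently changed pages.
     A minimal sketch; real feeds also carry id, author and content. -->
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Blog</title>
  <updated>2014-10-20T10:30:00+01:00</updated>
  <entry>
    <title>Penguin Update Rolls Out</title>
    <link href="https://www.example.com/blog/penguin-update/"/>
    <updated>2014-10-20T10:30:00+01:00</updated>
  </entry>
</feed>
```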

Enjoyed this post? Want to dive further into SEO? Download your FREE eBook Technical SEO Best Practices: A Marketers’ Guide.


About the author:

Tom joined ClickThrough in 2011. Since then, he has developed an expertise in the technical side of search engine optimisation. He’s Google Analytics-qualified and, in his current role as Digital and Technical Executive, carries out monthly SEO activities and provides technical consultancy for several of the company’s largest accounts.