Lisa Coghlan reveals part three in our SEO Jargon Buster series, filled with terms that you may come across in the complicated SEO industry. Missed parts one and two? Check out the links at the end of this post.

Discover the remaining phrases, terms and buzzwords from the SEO industry in this third and final instalment of our SEO jargon buster series.

PageRank

PageRank is a metric that measures the importance of a web page. Invented by Google’s Larry Page and Sergey Brin, it assigns a score from 0 to 10, taking into account the page’s authority, reliability and importance. PageRank used to be a determining factor of where a page would rank in Google’s search results. However, it is now just one of hundreds of ranking factors that Google uses, and it will not directly determine the ranking position of a particular page on its own.

Panda

This cute fluffy creature refers to Google’s ranking algorithm update regarding the quality of on-page content. The Google Panda update refers to a search filter added to the algorithm in February 2011 intended to stop websites with little or no content, or content that is of low quality or duplicated from other sources, from reaching high ranking positions in the SERPs. Panda has since been updated in order to target sites that may have abused keyword usage in the past, deliberately scattering keywords all over the content in order to rank, but not actually benefitting their users with the information they’ve put on the page.

Penguin

Google introduced the Penguin update in April 2012. This addition to the algorithm saw Google targeting unnatural backlink profiles, such as those belonging to sites that had paid for backlinks in order to improve their organic position in search results. This practice is frowned upon, as it creates unnatural links that do not reflect the quality of the content on the site. The Penguin algorithm assesses sites continually to ascertain whether unnatural links are pointing to the domain, and if you’re found to have links that appear unnatural, your site could be penalised and its organic visibility could drop dramatically.

Redirect

A redirect is when one URL forwards you onto another URL – a way of sending users and search engines to a different URL from the one they originally sought out. In accordance with SEO best practice, if you want to permanently send users to a different URL from the original one they sought out, you should use a 301 redirect, which means the content has “moved permanently”. This makes it clear to users and to search engines that the content you want them to see is on the URL you have redirected them to, and it will pass any link strength that the original page had onto the new one. A 302 redirect (temporary redirect) acts in the same way, but should only be used for temporary redirection. An example of this could be when your site is down for a length of time for maintenance; the URLs can be 302 redirected to a ‘sorry, we’re undergoing maintenance’ page.
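How a redirect is set up depends on your server. As a minimal sketch, assuming an Apache server with mod_alias enabled and purely illustrative paths, the two redirect types could be added to your .htaccess file like this:

# Permanent move – passes the old page's link strength to the new URL
Redirect 301 /old-page/ https://www.example.com/new-page/

# Temporary move – used while the original page is unavailable
Redirect 302 /shop/ https://www.example.com/maintenance/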

Rich Snippet

A rich snippet is an enhanced search result that can be displayed when structured data markup is added to a page’s HTML, giving search engines more information about the page they’re crawling. This additional information allows Google to display richer search results for that particular page, giving users more information directly in the SERPs, and can enhance click-through rate (CTR). Without structured data markup, only a page’s basic information will usually be displayed in SERPs, such as the meta title and description you allocated to that particular page. With additional markup in the HTML you could include reviews and star ratings, biographical information, addresses and phone numbers, as well as video content and much more.
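As an illustrative sketch, one common way of adding this markup is with schema.org structured data in JSON-LD format; the product name and rating figures below are invented placeholders:

<!-- Example schema.org markup for a product with star ratings -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "89"
  }
}
</script>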

Robots.txt

A robots.txt file is used by website owners to communicate with web robots and search engine crawlers. Also referred to as the robots exclusion standard, or robots exclusion protocol, the information put into the file instructs web crawlers to either crawl or not crawl certain sections of the site. Unless you instruct search engines not to, they will crawl all the pages they can find on your site, but if there are certain sections of your site which do not need to be crawled, the following directives can be used to block them:

User-agent: *

Disallow: /folder/

You can also specify that you want your whole site to be accessed by web bots, as in the example below. For an in-depth look at the correct way to use your robots.txt file, read our SEO blog: A Guide To Using The Robots Exclusion Standard.
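As a small illustrative example, an empty Disallow directive tells every crawler that nothing on the site is off limits:

User-agent: *

Disallow: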

SEO

Search engine optimisation is the practice of affecting the visibility of a website within a search engine’s organic or unpaid results. SEO is sometimes referred to as organic search, as there is no payment made to the search engine to include the website or web page within its search results. Instead, the site’s visibility is improved naturally by adding original content, fixing any technical problems users may be experiencing on the site, and gaining natural backlinks from people who share your content.

Site Speed

This refers to how fast or slow your website loads, so that all content is completely visible to users. Site speed and page speed are significant factors in determining how your website performs organically. A slow site will also deter users, especially if large images or Flash files are dragging down load times. You can check your site speed using Google’s PageSpeed Insights tool. Once you have a better idea of your site’s speed you can begin to identify what could be slowing down your load time, and how to rectify it.

Social Signals

These signals are the views, likes, shares and pins you receive from users who are checking out your social media accounts. They act as a vote of confidence for your brand and your website, and this vote of confidence is what SEOs call a social signal, helping you to gain organic visibility. Increase your social signals by presenting users with an easy opportunity to like and share your content – add share and like buttons to your blog or your newsfeed. Encourage social sharing in your content, and let Google know that people are giving you a vote of confidence. That way, Google will recognise your content as an authority in your industry, and hopefully improve your site’s organic visibility.

Tiered Link Building

We took you through the fundamentals of link building in part two of the SEO jargon buster. Now, let’s take a look at tiered link building. This is the process of building several tiers of links to your website. How does this work exactly? Well, let’s say you’ve built up a nice set of links to your website – all of them on decent domains, featuring high-quality content that is relevant to your offering. In turn, this is boosting your backlink profile and sending some good-quality link juice to your pages. In order to increase page authority, you then work on building links to your backlinks (these tier-two links will then enhance the power of your tier-one links). Of course, this process can continue with further tiers, but you wouldn’t want to do it in a way that could be considered an unnatural linking practice.

User Generated Content

Quite simply, user generated content is any content on your site that has been created by visiting users. This could be a blog comment, a review or a feedback form – all of which could play a significant part in your SEO strategy, especially for e-commerce sites that don’t feature a lot of informational content. A review of your product acts as an indicator to search engines that your website is active and in use, providing a valued service to users. Moreover, UGC can be a great source of long-tail keywords that you haven’t already got in your content, helping to increase ranking positions around certain keywords. And more importantly, this information is an invaluable insight into your customers’ desires and fears, helping you to answer any questions they may have with more valuable content such as blog posts or FAQs.

XML Sitemap

This sitemap is a file that lists all the web pages on a site, helping search engines gain a better understanding of its overall structure and aiding future crawls. The XML sitemap can also include additional information about how often the information on a page changes and how important particular pages are within the site. Although XML sitemaps are not obligatory, they can help the crawling process, especially if you have a particularly large site where it’s likely that a crawl depth limit will be reached before the entire site is crawled via the linking structure. In this case, you can use your XML sitemap to show crawlers which pages are vital to your website and your users.
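As a minimal sketch following the sitemaps.org protocol (the URL, date and values below are placeholders), a one-page XML sitemap looks something like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; lastmod, changefreq and priority are optional -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2016-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>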

Learn more vital industry terms in our previous SEO jargon busting blogs – part one and part two – running through terms A to N.

About the author:

Lisa Coghlan is a digital and content coordinator at ClickThrough Marketing. She writes SEO content for many of the company’s larger clients, and assists in the implementation of content strategy. She writes live gig reviews and has a mild obsession with finding new documentary podcasts.