Firms need to be aware of search problems that can arise from structural issues with their websites, one expert has claimed.

Writing for Search Engine Watch, Josh McCoy said a number of common coding issues can create a search engine optimisation (SEO) "disaster".

He singled out the robots.txt file, which tells search engine crawlers which parts of a site they should not access, as a frequent culprit of SEO faults.

However, Mr McCoy said: "This file also has the potential to wipe an entire site out of the search engine's indices."

Applying a Disallow instruction to a top-level directory or an images folder will prevent that folder and all of its sub-folders from being picked up, he explained, while a blanket Disallow: / rule blocks the entire site.
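As a brief illustration (the /images/ path below is only a placeholder), a robots.txt file along these lines would keep crawlers out of an images directory and everything beneath it; the commented-out line shows the blanket rule that would shut out the whole site:

    User-agent: *
    # Blocks /images/ and every sub-folder and file beneath it
    Disallow: /images/
    # A bare slash, by contrast, would block the entire site:
    # Disallow: /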

Other common mistakes include misuse of the nofollow link attribute and the robots meta tag.
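By way of example (the link target below is purely a placeholder), a nofollow attribute applies to one individual link, while a robots meta tag placed in a page's head applies to the whole page, so a stray noindex value can remove that page from search results entirely:

    <!-- Applies only to this single link: asks search engines not to pass authority through it -->
    <a href="https://example.com/partner-offer" rel="nofollow">Partner offer</a>

    <!-- Applies to the whole page: asks search engines not to index it or follow its links -->
    <meta name="robots" content="noindex, nofollow">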

Recent research from Econsultancy showed companies are ramping up their investment in SEO as the economy emerges from recession, with 60 per cent of businesses reporting they will increase spending on natural search this year.

News brought to you by ClickThrough – experts in Search Engine Marketing & Internet Marketing.
 


About the author:

ClickThrough is a digital marketing agency, providing search engine optimisation, pay per click management, conversion optimisation, web development and content marketing services.