Quality Content & Achieving Success in the Search Engines
When a client approaches Direct Submit, or any other SEO / Internet services provider, the goal is almost always the same: they want their website to rank as high as possible in the search engine results pages. Most businesses and individuals seeking a higher ranking on Google or the other search engines have specific keywords in mind for which they hope to achieve top, or at least near-top, positions.
How these rankings should be achieved is frequently discussed, with many clients ‘suggesting’ they don’t mind what methods are used. As a result, many website owners have resorted to questionable practices such as keyword stuffing, copying content from other sources, and similar shortcuts. Google set out to keep such sites from reaching the highest rankings by changing its algorithms, most notably with the now famous (or infamous, depending on your point of view) Panda and Penguin updates.
The Panda update came first. Its purpose was to promote sites with genuinely good quality content and to demote those that leaned on SEO techniques without offering anything really useful. Since people tend to prefer sites that are actually useful, rather than pages of adverts or endless links, Panda was designed to serve up the kind of pages they wanted.
Panda had a weakness, however: once people realised what it was looking for, some began padding their pages with freely available public domain text. This made the pages look content rich without actually offering anything unique or of real value to visitors. Google responded with further updates: the Penguin update, which targeted manipulative ‘webspam’ techniques such as link schemes, and the related Page Layout update, which examined the content shown “above the fold,” the area at the top of a page that is visible without scrolling.
When a page offered unique content above the fold and a minimum of dubious copy (adverts, links and so on), its ranking improved. If a page carried duplicate content (such as the public domain text mentioned earlier) or an excess of adverts or links, it was ranked lower in the search results.
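To make the idea concrete, the toy Python sketch below scores a block of raw HTML by comparing the amount of visible text against the number of links and ad-like elements. It is emphatically not Google’s actual ranking logic: the ad-related class-name hints, the weighting of adverts against links, and the scoring formula are all assumptions invented for this illustration, and a real “above the fold” analysis would require rendering the page, which this sketch ignores.

```python
# Toy "content vs. dubious copy" heuristic. NOT Google's algorithm:
# the class-name hints, weights and formula are assumptions for illustration.
from html.parser import HTMLParser

AD_HINTS = ("ad", "advert", "sponsor", "banner")  # hypothetical ad markers


class ContentScorer(HTMLParser):
    """Counts visible text characters versus link and ad-like elements."""

    def __init__(self):
        super().__init__()
        self.text_chars = 0
        self.link_count = 0
        self.ad_count = 0
        self._skip_depth = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1
        elif tag == "a":
            self.link_count += 1
        classes = (dict(attrs).get("class") or "").lower()
        if any(hint in classes for hint in AD_HINTS):
            self.ad_count += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.text_chars += len(data.strip())


def content_score(html: str) -> float:
    """Higher scores mean more real text relative to links and ad markers."""
    parser = ContentScorer()
    parser.feed(html)
    dubious = parser.link_count + 5 * parser.ad_count  # arbitrary weighting
    return parser.text_chars / (1 + dubious)


if __name__ == "__main__":
    thin = "<div class='advert'>Buy now!</div><a href='#'>Click</a>" * 10
    rich = "<p>A genuinely useful, unique article paragraph.</p>" * 10
    print(f"thin page score: {content_score(thin):.1f}")
    print(f"rich page score: {content_score(rich):.1f}")
```

Run on the two sample fragments, the advert-heavy page scores far lower than the one dominated by unique text, which is the intuition behind demoting pages that are heavy with adverts above the fold.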
The most effective way for any website owner to optimise a site for these algorithms is to offer unique and useful content throughout the website. When a site has valid, useful content on its pages, more visitors will find it helpful, return more often and view it in a positive way. They are also likely to tell friends and colleagues about it, and the more people share the website, the better its ranking is likely to be, due in part to quality inbound links.
Creating a site that becomes a valued resource for its end users is one of the cornerstones of successful SEO. It demonstrates the worth of the site both to other users and to the search engines.