It was inevitable; it had to come to this sooner or later. It's a cliché that necessity is the mother of invention, but clichés are clichés for a reason. As bloggers and web-based businesses ramped up their domains with extensive link-building tactics to climb the SERPs, search engines had to develop programs that would meticulously judge the inherent worth of content and deliver comparatively coherent search results, as opposed to earlier ranking criteria that placed heavily marketed sites well above sites offering worthwhile content.
It is an unspoken convention that if Google doesn't notice you, chances are no one else will either. To call a spade a spade, Google is the demigod among search engines; the others (even highly capable ones like Alexa or Technorati) draw but a fraction of the traffic Google boasts of. And Google, being the ocean of innovation that it is, came up with Google Panda. Readers yet to discover the weight of the phenomenon that is Google Panda can gather from the first paragraph that it has something to do with delivering content-heavy search results and downgrading advertisement-driven sites in the rankings.
To the technical cognoscenti, it's a ranking algorithm added to the list of criteria Google already employs, one that filters content based on the worth of the data provided. (Just out of sheer curiosity: why did Google decide to christen an update and spread the word around? Marketing, perhaps?)
In the months after its release, a string of rolling updates followed (Version 2.3 was recently unveiled), notching up its efficiency. The saving grace may be that the updates have been tuned to take into account a site's popularity as well as the audience's feedback signals.
This turns the rules of SEO around quite significantly (a topic to be discussed in subsequent blog entries). Plain link-building just doesn't suffice anymore. It is now essential that a site have worthy and substantial content to boast of (not mere mass text, but quality content). While existing SEO techniques may still apply, the added emphasis on content-heavy pages with extensive data on a given subject raises the value of such domains, and this is what radically alters traffic patterns. This was well exhibited by the drop in traffic that hit almost 12 percent of all sites.
This will give an edge to web-content writing services and other content-providing businesses. The update does have specific targets: it instantly downgrades low-quality websites and cuts down on spam (which is called for, considering the generation of 4.5 million URLs per month in 2011). Given the 12 percent that has been affected, it will be easy to find plenty of sites hit by the update, and I'm sure one can also notice a certain homogeneity among them.
My guess is that the "12 percent" will mostly consist of sites with poor usability and structure, as well as sites designed specifically for advertising, with thin product pages. One hardly need mention that most of these sites carry uniformly low-quality content, in which case it was only a matter of time until their rankings were demoted.
For the rest, it is essential that they elevate the worth of their content, lest they be relegated to SERP purgatory. Ye be warned: this Panda has claws!