
Taking another small step toward a larger goal, Google announced yesterday, April 24th, the rollout of a “webspam algorithm update” that Matt Cutts, a leading Google engineer, described as “another step to reward high-quality sites”.

After numerous past attempts to fight spamming techniques such as link schemes, keyword stuffing, and cloaking, this update was long anticipated, particularly following the last major update, “Panda”, which dates back to February of last year.

The new algorithm is expected to produce ranking changes that “provide a great user experience” and reward webmasters who rely on white hat SEO techniques, as opposed to those who artificially inflate their rankings with “black hat webspam”.

Matt Cutts’s initial announcement, which referred to the need to block “over-optimization”, was corrected yesterday with the clarification that Google is not cracking down on SEO itself, but on “bad” SEO, that is, spam-driven SEO practices.

To avoid disclosing information that would let interested parties game the search results once again and blunt the effectiveness of the new measure, Google refrained from detailing which kinds of websites will be caught or what specific spam-filter adjustments were implemented. Nevertheless, one important figure was shared: the algorithm should impact approximately 3% of queries in English, German, Arabic, and Chinese, and about 5% of queries in Polish and other languages where spamming is especially widespread.

For those still confused, there are four important things to keep in mind following yesterday’s move:

  • There is no real strategic novelty: Google has been fighting web spam techniques for a long time, and webmasters have always been advised to rely only on fair SEO tactics;
  • The real targets are manipulative optimization strategies;
  • Traditional SEO is, in fact, still encouraged, with Matt Cutts himself stating that Google’s “advice for webmasters is to focus on creating high quality sites that create a good user experience and employ white hat SEO methods”;
  • The change is expected to be even less noticeable than the “Panda” update, which affected four times as many queries – 12% as opposed to only 3%.


In other words, as long as pages do not stuff keywords at the expense of content quality, duplicate content, artificially multiply back-link exchanges, or resort to JavaScript redirects, cloaking, and doorway pages, the algorithm will have no negative effect on them.

However, less than 24 hours after the announcement, some voices are already claiming that Google now displays even poorer results. Considering that the same thing happened right after the “Panda” rollout, it appears that, as with any other change, we will all have to wait and see what happens next and what new adjustments Google brings to light.