Monday, February 15, 2010

Cloaking Bad or Not?

Cloaking is a simple technique that customizes the pages displayed to each person browsing a website. Location, user restrictions, gender, and age are some of the criteria used to limit, or at least control, the viewing experience of every visitor. Google itself applies cloaking. For example, you will see different search results for a given keyword from www.google.com.us and www.google.com because Google applies cloaking techniques. There is nothing wrong with this technique.
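To make the "good" kind of cloaking concrete, here is a minimal sketch of serving a localized page variant by country. The page filenames and country codes are made up for illustration, and the country lookup is assumed to happen elsewhere (e.g. a GeoIP step) before this function is called.

```python
# Hypothetical mapping from visitor country to a localized page.
LOCALIZED_PAGES = {
    "US": "index_en_us.html",
    "FR": "index_fr.html",
    "PH": "index_en_ph.html",
}
DEFAULT_PAGE = "index.html"

def pick_page(country_code):
    """Return the page variant to serve for a visitor from this country."""
    return LOCALIZED_PAGES.get(country_code, DEFAULT_PAGE)
```

Two visitors asking for the same URL simply get different files back, which is exactly the kind of customization described above.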

But in the blackhat world, this technique can be very useful for dramatically improving the SERP of a website. Cloaking is exploited to hide garbage webpages that are only meant to be seen by search engine crawlers. What I mean is that cloaking is employed by spamdexers to hide garbage HTML webpages from the viewing public. For example, I could load thousands of keyword-stuffed HTML webpages, but when people try to view them they are simply redirected to the main webpage, so visitors would not see anything wrong with the site. When search engine crawlers check those same webpages, however, there is no redirect, and they just continue crawling the content preloaded with phrases targeting a particular keyword. What happens is that even though the main website contains very few sentences, it still gets a very high ranking on search engines.
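The blackhat scheme above boils down to branching on who is asking. Here is a rough sketch of how such a script decides, using the User-Agent header; the bot signatures and page bodies are hypothetical, and real spamdexers may also match crawler IP ranges. This is shown only to illustrate the abuse being described, not as something to deploy.

```python
# Hypothetical substrings that identify major search engine crawlers.
CRAWLER_SIGNATURES = ("googlebot", "slurp", "msnbot")

def respond(user_agent):
    """Return an (HTTP status, body) pair depending on who appears to ask."""
    ua = user_agent.lower()
    if any(sig in ua for sig in CRAWLER_SIGNATURES):
        # Crawler: no redirect, serve the preloaded keyword-stuffed page.
        return (200, "<html>keyword keyword keyword ...</html>")
    # Human visitor: bounce straight to the real main page.
    return (302, "Location: /index.html")
```

A person's browser only ever sees the 302 redirect to the main page, while the crawler indexes the stuffed content it was served with a 200.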

Well, this is bad, because search engine crawlers are not really intelligent enough to recognize webpages that make no sense at all. It would be so much better if they improved the crawling algorithm using Artificial Intelligence technologies, just like the Intelligent Sentence Analyzer algorithm that was blogged about by a friend of mine.
