Black SEO

The page the spider sees is a bare-bones HTML page optimized for the search engines. It won't look pretty, but it is configured exactly the way the search engines want it in order to rank highly. These 'ghost pages' are never actually seen by any real person, except of course the webmasters who created them.
When real people visit a site that uses cloaking, the cloaking technology (usually based on Perl/CGI) sends them the real page, which looks good and is just a regular HTML page.
The cloaking script can tell the difference between a human and a spider because it keeps a list of the search engines' IP addresses, and no two IP addresses are the same. When a visitor requests a page from a site that uses cloaking, the script compares the visitor's IP address against its list of search engine IPs. If there is a match, the script knows a search engine is visiting and sends out the bare-bones HTML page set up for nothing but high rankings.
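As a rough illustration, here is a minimal Python sketch of that IP lookup. The IP addresses and file names are placeholders; a real cloaking script would maintain a much larger, regularly updated list of crawler addresses.

```python
import os

# Hypothetical crawler IP list; real cloaking scripts track many IP ranges.
KNOWN_SPIDER_IPS = {
    "66.249.64.1",
    "157.55.39.1",
}

def page_for_visitor(remote_ip: str) -> str:
    """Pick which file to serve based on the visitor's IP address."""
    if remote_ip in KNOWN_SPIDER_IPS:
        # Looks like a crawler: serve the bare-bones, keyword-heavy ghost page.
        return "ghost_page.html"
    # Otherwise assume a human visitor and serve the normal page.
    return "real_page.html"

# In a CGI setting, the caller's address arrives in the REMOTE_ADDR variable.
print(page_for_visitor(os.environ.get("REMOTE_ADDR", "")))
```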
There are two types of cloaking: User Agent cloaking and IP-based cloaking. IP-based cloaking is the stronger method, because IP addresses are very hard to fake, so your competition won't be able to pretend to be one of the search engines in order to steal your code.
User Agent cloaking is similar to IP cloaking, except that the script compares the User-Agent string sent with each page request against its list of search engine names and then serves the appropriate page. The problem with User Agent cloaking is that agent names are easily faked. A search engine can beat this kind of cloaking simply by changing its name and pretending to be a normal visitor using Internet Explorer or Netscape; the cloaking software will then show the spider the non-optimized page, and your search engine rankings will suffer.
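The same idea applied to the User-Agent string instead of the IP address, again as an assumed sketch with illustrative crawler names and placeholder file names:

```python
# Substrings that would identify common crawlers; illustrative only.
SPIDER_UA_TOKENS = ("googlebot", "bingbot", "slurp")

def page_for_user_agent(user_agent: str) -> str:
    """Serve the ghost page whenever the User-Agent looks like a spider."""
    ua = user_agent.lower()
    if any(token in ua for token in SPIDER_UA_TOKENS):
        return "ghost_page.html"
    return "real_page.html"

# The weakness described above: the string is entirely under the client's
# control, so a crawler can simply claim to be an ordinary browser.
print(page_for_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # ghost_page.html
print(page_for_user_agent("Mozilla/4.0 (compatible; MSIE 6.0)"))       # real_page.html
```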
To sum up, search engine cloaking is not as effective as it used to be. The search engines are increasingly aware of the different cloaking techniques webmasters use, and they are gradually introducing more sophisticated technology to combat them. If not used properly, it may also be considered unethical by the search engines.
Invisible Text – Black SEO
Invisible text is content on a website that is coded in a way that makes it invisible to human visitors but readable by search engine spiders. This is done to artificially inflate a site's keyword density without affecting its visual appearance. Hidden text is a recognized spam tactic, and nearly all of the major search engines recognize and penalize sites that use it.
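To see why the tactic is easy to catch, here is a hypothetical sketch of the kind of pattern check that could be run against a page's markup; the patterns are illustrative and are not any engine's actual rules.

```python
import re

# Illustrative hidden-text patterns only; real checks are far more involved.
HIDDEN_PATTERNS = [
    r'display\s*:\s*none',          # text removed from the layout entirely
    r'visibility\s*:\s*hidden',     # text hidden but still occupying space
    r'color\s*=\s*"?#?ffffff"?',    # white text, invisible on a white background
]

def looks_like_hidden_text(html: str) -> bool:
    """Return True if the markup contains common hidden-text tricks."""
    return any(re.search(p, html, re.IGNORECASE) for p in HIDDEN_PATTERNS)

sample = '<div style="display:none">keyword keyword keyword keyword</div>'
print(looks_like_hidden_text(sample))  # True
```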
Tiny Text – Black SEO
This is the technique of placing text on a page in a very small font size. Pages that are predominantly made up of tiny text may be dismissed as spam, or the tiny text may simply not be indexed. As a general guideline, avoid pages where the font size is predominantly smaller than normal, and make sure you are not spamming the engine by repeating keyword after keyword in a very small font. If your tiny text is just a copyright notice or your contact information at the very bottom of the page, that's fine.
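A crude way to gauge what "predominantly" tiny might mean: the sketch below counts inline font-size declarations below an assumed 8-pixel threshold. Both the parsing and the threshold are simplifications for illustration, not a documented search engine rule.

```python
import re

def tiny_text_ratio(html: str, min_px: int = 8) -> float:
    """Fraction of inline font-size declarations smaller than min_px pixels."""
    sizes = [int(s) for s in re.findall(r'font-size\s*:\s*(\d+)px', html, re.IGNORECASE)]
    if not sizes:
        return 0.0
    return sum(1 for s in sizes if s < min_px) / len(sizes)

page = ('<p style="font-size:14px">Normal copy for visitors.</p>'
        '<p style="font-size:4px">keyword keyword keyword keyword</p>')
print(f"{tiny_text_ratio(page):.0%} of styled text blocks use tiny fonts")  # 50%
```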
No Spamming – Black SEO
A couple of years ago, spamming may have worked wonders for your website. However, with the sophisticated algorithms now used by all the popular search engines, spamming can only backfire. Today's algorithms can easily detect spam and will not only ignore your website but may also ban it.
Besides, instead of spending considerable time and effort on spamming, you can always follow other proven strategies and earn a higher rank with most search engines. Spamming also irritates readers. Think about it: if your homepage needlessly repeats a particular keyword, it is bound to frustrate a reader. Consequently, your site, instead of being content-rich, would be junk-rich. This can have nothing but a negative impact on your business.
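One quick self-check for that kind of repetition is keyword density, the share of a page's words taken up by a single keyword. The sketch below computes it; any threshold you judge it against is your own call, not a published search engine rule.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Share of the page's words taken up by a single keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

copy = "cheap shoes cheap shoes buy cheap shoes online cheap shoes"
print(f"'cheap' density: {keyword_density(copy, 'cheap'):.1%}")  # 40.0%
```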