The fight against dishonest webmasters has also turned bitter. In general, however, the attitude of the search engines towards website administrators is not hostile. On the contrary! These huge databases would like to know about every website out there! However, the speed at which the Internet expands always outstrips their efforts: once the crawlers are "back in the barracks", no one can guarantee that this or that website, identified and explored along the way, is still online. Every Internet user has, at some point, run a search, clicked one of the links offered, and come across the message that the page no longer exists.
" Of course, the crawlers are designed to go back and verify that the web sites "still there", but this information also has to "go back to the barracks", to be processed and made available to customers, and then no one can guarantee this or that web site continues online. A vicious circle. The crawlers are not only designed to verify the validity of web sites, we also measure the length and frequency of updates. In fact, there are multiple policies that come into play, and not far from stable criteria. (Recently, for example, the tendency to "punish" the pages that change or delete its contents too fast, then, think, contain "ephemera" has not been relevant). + Market immense market still minority The search engines are far from the omniscience for various reasons: no evil or spambots crawlers missing, which, like a virus loose, travel through the site and plant damage and distrust.