Cloaking

Cloaking (English for "veiling" or "concealing") is a search engine optimization technique in which the web crawlers of search engines are presented with a different page under the same URL than human visitors. It is used to improve ranking and indexing in search engines.

The aim of cloaking is to serve, under one URL, a page optimized for search engines and a page optimized for visitors at the same time. Since search engines primarily process and index text content, they are delivered a text-based, structurally optimized HTML page. Human visitors using a browser, on the other hand, receive a page optimized for them at the same URL. This page can, for example, contain multimedia content such as Flash films or videos whose content is invisible to search engines. Another reason for cloaking is to hide optimization methods from competitors and thus prevent content or techniques from being copied.

Cloaking is also used to disguise fraudulent advertising. While a user who follows the advertisement is shown the fraudulent landing page, the web crawler sees a harmless page.

Techniques

User-agent cloaking is present when different content is delivered depending on the user agent reported by the client. IP cloaking (also called IP delivery), by contrast, differentiates the content according to the requesting IP address.
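
The following minimal Python sketch illustrates both variants as a server-side decision. The crawler tokens and the IP range used here are placeholder assumptions for illustration, not the configuration of any real system.

    import ipaddress

    CRAWLER_TOKENS = ("googlebot", "slurp", "bingbot")        # assumed example tokens
    CRAWLER_NETWORK = ipaddress.ip_network("66.249.64.0/19")  # assumed example range

    def select_page(user_agent: str, client_ip: str) -> str:
        """Decide which version of the page a single request receives."""
        ua = (user_agent or "").lower()

        # User-agent cloaking: decision based solely on the self-reported user agent.
        if any(token in ua for token in CRAWLER_TOKENS):
            return "text_optimized.html"

        # IP cloaking / IP delivery: decision based on the requesting IP address.
        if ipaddress.ip_address(client_ip) in CRAWLER_NETWORK:
            return "text_optimized.html"

        # All other requests receive the multimedia page intended for human visitors.
        return "visitor_version.html"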

Search engines

Cloaking is prohibited under the guidelines of most search engines. Violations that are discovered usually result in permanent exclusion from the index. Search engine operators can expose cloaking by requesting the website with an ordinary browser identification and a neutral IP address and comparing the result with the content seen by the search engine. A trademark and competition violation occurs when third-party trademarks are used on doorway pages that are hidden by means of cloaking.
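
Such a comparison can be sketched in a few lines of Python: the same URL is requested once with a browser-like and once with a crawler-like User-Agent header, and the responses are compared. The URL and header strings below are placeholders, and a byte-level comparison is only a rough hint, since dynamic pages can differ between requests for harmless reasons.

    import hashlib
    import urllib.request

    URL = "https://example.com/"  # placeholder URL

    def fetch_hash(user_agent: str) -> str:
        """Fetch the URL with the given User-Agent and return a hash of the body."""
        request = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        with urllib.request.urlopen(request, timeout=10) as response:
            return hashlib.sha256(response.read()).hexdigest()

    browser_hash = fetch_hash("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
    crawler_hash = fetch_hash("Mozilla/5.0 (compatible; Googlebot/2.1)")

    print("possible user-agent cloaking" if browser_hash != crawler_hash
          else "no difference detected")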

To make cloaking more difficult, search engine operators do not publish the names of all their web crawlers. This is intended to prevent a crawler from being assigned to a specific search engine. In addition, several web crawlers are often used: an official one, such as Scooter from AltaVista or Slurp from Inktomi, and unknown, unofficial ones, which may even identify themselves as browsers. In this way, search engine operators can find out whether simple user-agent cloaking is being used.

Professionally executed IP cloaking is difficult for search engines to unmask, because the web crawlers are recognized on the basis of their IP addresses, reverse DNS lookups and Class C networks. Up-to-date IP lists of the web crawlers are a prerequisite for functioning IP cloaking.
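
A common form of such recognition is a reverse DNS lookup followed by a forward confirmation, sketched below in Python. The googlebot.com and google.com suffixes are the ones Google publicly documents for its crawler; other search engines use different host names, and the sample IP address is only illustrative.

    import socket

    CRAWLER_SUFFIXES = (".googlebot.com", ".google.com")  # documented for Googlebot

    def is_known_crawler(client_ip: str) -> bool:
        """Recognize a crawler IP via reverse DNS plus forward confirmation."""
        try:
            # Reverse lookup: map the IP address to a host name.
            host, _, _ = socket.gethostbyaddr(client_ip)
            if not host.endswith(CRAWLER_SUFFIXES):
                return False
            # Forward confirmation: the host name must resolve back to the
            # same IP, otherwise the reverse record could simply be forged.
            return client_ip in socket.gethostbyname_ex(host)[2]
        except (socket.herror, socket.gaierror, OSError):
            return False

    print(is_known_crawler("66.249.66.1"))  # illustrative address only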
