Search engine optimization

from Wikipedia, the free encyclopedia

Search engine optimization (SEO) refers to measures that increase the visibility of a website and its content to users of a web search engine. Optimization concerns the improvement of unpaid results in the organic search engine ranking (natural listings) and excludes direct traffic and the purchase of paid advertising. Optimization can target different kinds of search, including image search, video search, news search, and vertical search engines.

Search engine optimization is a branch of search engine marketing .


The first search engines began cataloging the early web in the mid-1990s. Site owners quickly recognized the value of a preferred listing in the search results, and companies that specialized in optimization soon emerged.

In the beginning, pages were often included by submitting their URL to the various search engines, which then sent a web crawler to analyze and index them. The web crawler loaded the website onto the search engine's server, where a second program, the so-called indexer, extracted and cataloged information (the words it found, links to other pages).
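The division of labor between crawler and indexer can be illustrated with a minimal sketch in Python. The page content, class name, and word/link extraction are simplified assumptions for illustration; real indexers also handle normalization, deduplication, and crawl scheduling.

```python
from html.parser import HTMLParser

class Indexer(HTMLParser):
    """Minimal indexer: collects words and outgoing links from one page."""
    def __init__(self):
        super().__init__()
        self.words = []   # cataloged words
        self.links = []   # links for the crawler to follow next

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

# A hypothetical page, as the crawler might have fetched it.
page = """<html><body>
<h1>Search engine optimization</h1>
<p>See also <a href="/wiki/Web_crawler">web crawler</a>.</p>
</body></html>"""

indexer = Indexer()
indexer.feed(page)
print(indexer.words)   # words read out of the page
print(indexer.links)   # outgoing links discovered on the page
```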

The early versions of the search algorithms relied on information supplied by the webmasters themselves, such as meta elements, or on index files in search engines such as ALIWEB. Meta elements give an overview of the content of a page, but it soon turned out that this information was unreliable, since the keywords chosen by the webmaster could misrepresent the page content. Inaccurate and incomplete data in the meta elements could thus cause irrelevant pages to be listed for specific searches. Page creators also tried to manipulate various attributes within a page's HTML code so that the page would be listed higher in the search results.

Since the early search engines depended heavily on factors that were solely in the hands of the webmasters, they were also very susceptible to abuse and ranking manipulation. To deliver better and more relevant results, the search engine operators had to adapt to these circumstances: because the success of a search engine depends on returning relevant results for the terms it is asked about, unsuitable results could drive users to look for other ways to search the web. The search engines' answer consisted of more complex ranking algorithms that included factors which were difficult or impossible for webmasters to influence. Larry Page and Sergey Brin developed "Backrub", the forerunner of Google: a search engine based on a mathematical algorithm that weighted pages according to their link structure and incorporated this into the ranking. Other search engines also included the link structure in their algorithms, for example in the form of link popularity.


Page and Brin founded Google in 1998, and its simple design and better result lists made the search engine a huge success. Google used off-page factors (such as PageRank and hyperlink analysis) as well as on-page factors (keyword frequency, metadata, page structure, etc.) for the ranking, in order to avoid the kind of manipulation seen in search engines that relied on on-page factors alone. The weight that the PageRank algorithm computes for a page is a function of the quantity and strength of its backlinks. PageRank gives the probability that a user surfing the web and clicking on links at random will land on a particular page.
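The random-surfer idea can be made concrete with a toy power-iteration over a tiny, invented link graph (three pages A, B, C; the graph and damping factor 0.85 are illustrative assumptions, not Google's actual parameters):

```python
# Toy PageRank via power iteration. The damping factor d models a surfer
# who follows a random outgoing link with probability d and jumps to a
# random page otherwise.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}  # hypothetical link graph
pages = list(links)
d, n = 0.85, len(pages)
rank = {p: 1.0 / n for p in pages}  # start with a uniform distribution

for _ in range(50):  # iterate until the ranks stabilize
    new = {}
    for p in pages:
        # mass flowing into p: each linking page q shares its rank
        # equally among its outgoing links
        incoming = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new[p] = (1 - d) / n + d * incoming
    rank = new

print(rank)  # C receives links from both A and B, so it ranks highest
```

Because C collects link weight from two pages while B receives only half of A's weight, C ends up with the highest rank, matching the intuition that the quantity and strength of backlinks determine the weight.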

Although PageRank was more difficult to influence, webmasters had already developed link-building strategies and tools for influencing the ranking of the Inktomi search engine, and these methods proved successful for PageRank as well. Many sites focused on buying, selling, and trading links, often on a large scale. These link farms often created thousands of pages with the sole aim of link spamming.

In 2005, Google began personalizing search results based on previous searches by registered users. In 2008, Bruce Clay argued that rank had become meaningless because of personalized search and that discussions about manipulating search engine rankings were obsolete, since the same query could yield different results for different users.

In 2007, Google was already using more than 200 different signals to determine the listing of specific websites in search results.

In 2007, Google announced a campaign against purchased links that influence PageRank. In 2009, Google announced measures to limit the effects of so-called PageRank sculpting: previously, by setting the "nofollow" attribute on links, it had been possible to channel PageRank to selected pages, give them more weight, and thus have the link targets listed higher in Google. The attribute had originally been introduced by Google to combat link spam. In response to the change, other techniques that deliberately hid links soon came into use, making PageRank sculpting possible again. Also in 2009, Google announced that it would use the search history of all users for popular search results.

In 2010, Google introduced the new index system Google Caffeine . It allowed users to search for news, forum posts, and other content shortly after their publication date. Google Caffeine represented a change in the way Google updates its index.

The real-time search Google Instant was introduced in 2010 to make search results appear more timely and relevant. The growing popularity of social networks and blogs was accommodated by the real-time search and given greater emphasis.

With the Panda update in 2011, websites that copied content from other pages (so-called duplicate content) were penalized. With Google Penguin, pages that used manipulative techniques to improve their rankings (webspam) were downgraded as well from April 2012.

Working method

Search engine optimization is roughly divided into two areas of work, on-page and off-page optimization. The categorization is based on whether you edit your own page or influence other websites.

On-page optimization

On-page optimization comprises all content-related adjustments to one's own website. This includes optimizing the page content in terms of quality and structure, formatting, keywords, and headings, but also technical aspects such as the header, tags such as the page title, page speed, and the internal link structure of the page. User-friendliness is also optimized in this context, since it affects how long visitors stay on the page. The domain name and the page URL are likewise analyzed by search engines and included in the ranking. As a rule, on-page optimization takes place before off-page optimization.

Crawling and reading in the content of websites follows well-known HTML standards of the web, which is why complying with them is the first step of optimization when creating a website. According to Google, the validity of an HTML page has no influence on its ranking; nevertheless, pages that are particularly HTML-compliant can be loaded faster by the browser, which search engines do reward.

The meta element "keywords", which was originally designed for search engines, is no longer taken into account by Google. The same applies to the meta element "description"; however, the latter may (depending on the search query) be displayed as the text excerpt in the SERPs and should therefore be carefully formulated, especially for selected pages.
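A simple check of the description element can be sketched in Python. The ~160-character limit used here is a common rule-of-thumb for snippet truncation, not a published Google limit, and the sample page is invented:

```python
# Extract a page's meta description and check it against a common
# snippet-length heuristic (~160 characters; the exact truncation
# point is not published by Google and varies by query and device).
from html.parser import HTMLParser

class MetaReader(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

html = '<head><meta name="description" content="Concise summary of the page."></head>'
r = MetaReader()
r.feed(html)
ok = r.description is not None and len(r.description) <= 160
print(r.description, ok)
```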

As search engine algorithms continue to develop, mobile devices (smartphones, tablets) are playing an increasingly important role in search engine optimization. A website that is not adapted for mobile devices is listed significantly worse, especially in mobile search, than websites with, for example, a responsive design. Since July 2019, as part of its "mobile first" initiative, Google has used the mobile version of the content as the primary source for indexing and ranking.

User signals collected by search engines are also taken into account when calculating rankings: how often a user clicks on a result, stays on the hit website, or returns to the result list influences the ranking. A positive signal is sent to Google, for example, when a user remains on a website for a long time instead of leaving it again immediately.


In the course of search engine optimization, a distinct writing style has developed in this field, which can be described as a search-engine-optimized style. It follows "rules" that are to some extent dictated by the search engines' search mechanisms. These rules are implicit, reconstructed from the success factors of optimization, since a search engine provider usually does not disclose its criteria for the qualitative classification of indexed pages. Characteristics of this style include placing search terms at the beginning of headings, sections, and sentences, as well as lists and highlighted words.
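Two of the features named above, keyword frequency and keyword placement at the start of a heading, can be measured with a toy script (the heading, body text, and keyword are invented examples; real analyses would also weight position, markup, and synonyms):

```python
# Toy check for two features of "search-engine-optimized" copy:
# keyword density in the body and keyword placement in the heading.
import re

def keyword_density(text, keyword):
    """Share of words in `text` that exactly match `keyword`."""
    words = re.findall(r"\w+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

heading = "Backlinks: how link building works"
body = "Backlinks remain a ranking factor. Good backlinks come from relevant sites."

print(round(keyword_density(body, "backlinks"), 2))
print(heading.lower().startswith("backlinks"))  # keyword leads the heading
```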

The search engine's requirements for page content can completely contradict the rules of classic text production. For example, grammatical rules play hardly any role in the search algorithms, so a frequently misspelled keyword can contribute more to the ranking than a technically correct term. This approach is declining, however, as Google and other search engines increasingly recognize and resolve misspellings on their own. Moreover, search engines now take correct spelling as well as correct grammar into account.

Since these mechanisms are in constant flux, this writing style is adapted just as often in order to keep delivering the best possible result. This means a page is never optimized just once; rather, the relevance of the keywords used must be checked continually, since user behavior changes as well.

Google itself recommends optimizing websites for people, not for the search engine.

Off-page optimization

Off-page optimization takes place away from the page itself and covers all measures outside the website being optimized.

It is not enough to increase relevance through on-page measures alone. A good listing in search engines and good link popularity are influenced by the quantity and quality of a website's incoming links (backlinks). Comprehensive, high-quality content is the first step toward "earning" backlinks. Off-page optimization involves, among other things, developing a link structure with other websites in order to position oneself better in certain subject areas. To this end, one can, for example, look for thematically appropriate websites and win them over for a link partnership. Another way to find possible link partners is the link analysis of competitors. The use of so-called link research tools can also be worthwhile for searching through the best subpages of a domain. In addition, the design of the link text (anchor text) of the backlinks is essential for the ranking. Free web applications can be used to examine websites for their potential; often, small changes are enough to greatly increase the ranking in search engines.
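A backlink analysis as described above often starts by tallying anchor texts. A minimal sketch, using an invented backlink list (the source domains and anchors are hypothetical):

```python
# Hypothetical backlink audit: tally the anchor texts of incoming links
# to spot skewed, over-optimized anchor-text distributions.
from collections import Counter

backlinks = [
    {"source": "blog.example.org", "anchor": "cheap shoes"},
    {"source": "forum.example.net", "anchor": "cheap shoes"},
    {"source": "news.example.com", "anchor": "Example Store"},
]

anchors = Counter(link["anchor"] for link in backlinks)
print(anchors.most_common(1))  # the most frequent anchor text
```

A profile dominated by a single commercial anchor text is the kind of pattern that link-research tools surface, since it can look unnatural to search engines.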

Some methods run counter to the guidelines of most search engines and can lead to a temporary or permanent ban.

It is disputed whether, in addition to backlinks, so-called "social signals" are taken into account for the ranking. These are mentions of the website in the form of "likes", "shares", and "comments" (Facebook) as well as "tweets" and "retweets" (Twitter). The background is that search engines not only analyze backlinks but also check algorithmically how intensively users interact with a website within selected social networks such as Facebook or Google+.

Search terms

One step in search engine optimization is the selection of suitable search terms (keywords). For this, one can use freely available databases, such as a keyword database or the MetaGer Web Associator. The Google AdWords keyword tool is also recommended, since it lists not only related search terms but also the approximate number of monthly search queries per term.

Often an extensive page is divided into several individual pages so that each can be optimized for different search terms. Main and secondary keywords (also called primary and secondary keywords) are defined for the respective pages, and the search terms are combined with the corresponding content.

One type of optimization takes place through so-called landing pages: by clicking a link, the user is taken to a page optimized specifically for SEO purposes. Another method is to create pillar content, that is, one or more overview pages for important individual pages.

Reverse engineering

The leading search engines Google, Bing and Yahoo do not disclose the algorithms for the ranking. These are also often changed to make abuse more difficult or to improve the results.

Search engine optimization also examines the techniques behind the search engines' ranking algorithms and attempts to decipher them by reverse engineering the search results. It analyzes how search engines rate and sort websites and their content. Often, statistical studies attempt to find correlations between variables (such as page title or headings) and the search results, and thus to draw conclusions about ranking factors.
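The statistical approach can be sketched with a Spearman rank correlation between one page feature and the observed positions. The feature (title length) and the five data points are hypothetical; and, as critics of such studies note, a correlation found this way does not prove that the feature is a ranking factor:

```python
# Sketch of correlating a page feature with observed rank positions
# using Spearman rank correlation (pure Python, no SciPy; assumes
# no tied values in either list).
def spearman(xs, ys):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for pos, i in enumerate(order):
            r[i] = pos + 1.0
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mean = (n + 1) / 2  # mean of the ranks 1..n
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)  # equal for rx and ry without ties
    return cov / var

# Hypothetical data: title length vs. search position for five results.
title_lengths = [66, 58, 70, 40, 35]
positions = [1, 2, 3, 4, 5]
print(spearman(title_lengths, positions))
```

A value near -1 would suggest that longer titles coincide with better (lower-numbered) positions in this sample, but only an association, not a ranking factor.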

Some search engine optimizers study these correlations, run their own tests or other experiments, and share their findings. Patents relating to search engines have also been used to obtain information about how they function.

Special cases

Search engine optimization is applied not only to websites but also to image, video, and news search. The basic principle is the same as in classic web search engine optimization, but additional ranking factors apply while others are dropped.

Academic search engine optimization (ASEO) is another special case. Here, an attempt is made to optimize PDF files for academic search engines such as Google Scholar and CiteSeer. In addition to other factors, Google takes into account the search term density and, instead of (hyper)links, the mentions in other scientific publications.

Rules and manipulation

Methods that bring irrelevant web pages to the top of search engine results pages are known as search engine spamming ; they violate rules that search engines set up to protect against manipulation of their search results.

One technique used for manipulation is to create text pages specifically for search engines and to set up automated redirects to other content. This method, working with so-called doorway pages, contradicts the guidelines of most search engines. Cases uncovered by search engine operators often result in the banning of the relevant page, i.e. the target pages in question are excluded from the search index.

At the beginning of 2006, BMW had to accept at short notice that the automobile company's website was completely removed from Google because a number of automatically redirecting doorway pages had been created. After BMW removed the offending pages, its site was added to the Google index again.

A variant is the creation of shadow domains. Here, the content optimized for search engines and the redirects are outsourced to separate domains in order to protect the primary domain from possible penalties.

Ethical search engine optimization (white-hat search engine optimization) avoids spamming: it refrains from prohibited practices such as the use of doorway pages or link farms and follows the guidelines of the individual search engines, thereby avoiding the risk of being banned or downgraded in the search results pages. In contrast to ethically sound "white hat" optimization, optimization that uses methods undesired by the search engines is called "black hat" optimization.

Technical limits

Pages that are purely graphically oriented, designed with films, images, and graphically embedded text, such as those that programming in Flash makes possible, offer search engines text code that is difficult to evaluate. Since 2008, Adobe has provided Google and Yahoo with technology that lets them access previously inaccessible content in Flash files. However, dynamically loaded text is not captured in this way, nor can link texts or different subpages be identified within Flash. Programming a website exclusively in Flash is therefore not recommended from a search engine optimization perspective.

There are also areas that are completely invisible to search engines; this is known as the deep web. These can be deliberately blocked pages, for example areas that require user login, but also content that cannot be reached without interaction, such as user input in the form of a search. The latter is made accessible to search engines in the course of optimization.
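One common mechanism for deliberately blocking pages from crawlers is a robots.txt file, and Python's standard library can evaluate such rules offline. The rules and URLs below are invented for illustration:

```python
# Pages can be deliberately hidden from crawlers via robots.txt;
# urllib.robotparser evaluates such rules without network access.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",   # hypothetical blocked area
])

print(rp.can_fetch("*", "https://example.com/private/report.html"))  # blocked
print(rp.can_fetch("*", "https://example.com/public/page.html"))     # allowed
```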


An SEO competition or SEO contest is a way of observing and trying out strategies, which is rarely possible with the necessary freedom on production systems. Participants try to rank as high as possible in a search engine for a fictitious term. The winner is the website that is in first place on Google or another specified search engine on a given date. The best-known competition terms include "schnitzel with potato salad" and "Hommingberger cheetah trout".

International Markets

In international markets, search optimization techniques are tailored to the dominant search engines in the target market. Both the market share and the competition between the various search engines vary from market to market.

In 2003, Danny Sullivan, an American journalist and technologist, stated that 75% of all searches were carried out on Google. Outside the US, Google's market share is even larger, which made Google the leading search engine worldwide as of 2007.

Since 2009, there have been only a few markets in which Google is not the leading search engine; there, Google lags behind a local market leader. Examples are China (Baidu), Japan (Yahoo! Japan), South Korea (Naver), Russia (Yandex), and the Czech Republic (Seznam).

In 2006, Google's market share in Germany was 85-90%. With a market share of over 95%, Google was by far the dominant search engine in Austria in 2018.

Web links

Wiktionary: Search engine optimization  - explanations of meanings, word origins, synonyms, translations
