Filter bubble


The filter bubble, or information bubble, is a term from media studies coined by the Internet activist Eli Pariser in his 2011 book of the same name. According to Pariser, the filter bubble arises because websites try to algorithmically predict which information a user wants to find, based on the information available about that user (e.g. location, search history and click behavior). The result is isolation from information that does not match the user's point of view. The isolating effect of filter bubbles is the subject of scientific research and is not generally considered proven.

Concept

Through such algorithms, websites tend to show users only information that matches their previous views. In this way, the user is very effectively isolated in a "bubble" that tends to exclude information contradicting those views.

Prime examples are Google's personalized search results and Facebook's personalized news feed. In Pariser's view, the user is thus less "burdened" by opposing views and thereby intellectually isolated in an information bubble.

Pariser gives an example in which one user searched Google for "BP" and received news about investment opportunities in British Petroleum, while another user entering the same query got information about the oil spill caused by the Deepwater Horizon disaster; the identical searches thus produced completely different results. Pariser believes this isolating bubble effect can have negative consequences for civil discourse. There are, however, opposing views that consider the effect minimal and manageable.

Eli Pariser, 2012

The use of amplification algorithms to capture users' attention with selectively filtered information that confirms their views is intensified when the algorithms additionally boost information that, beyond confirmation, arouses fear and anger. Examples are hate speech and conspiracy theories.

Personalization

Personalization can be defined as follows:

In web personalization, the content and structure of a web application are adapted to the specific needs, goals, interests and preferences of each user. To this end, a user model is created that captures the assumptions and the information the system has about the user. The system can thus predict what will be relevant to the user. It filters out information it deems irrelevant and thereby increases its personal relevance for that user.
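The following minimal sketch illustrates this idea under simplified assumptions; the data structures, topic labels and the relevance threshold are hypothetical and stand in for whatever relevance model a real system would use:

```python
# Minimal sketch of user-model-based personalization, assuming a toy
# relevance model: topic labels, weights and the threshold are hypothetical.
from dataclasses import dataclass, field


@dataclass
class UserModel:
    # The system's assumptions about the user: learned topic weights.
    topic_weights: dict[str, float] = field(default_factory=dict)

    def record_click(self, topic: str) -> None:
        # Each click strengthens the assumed interest in that topic.
        self.topic_weights[topic] = self.topic_weights.get(topic, 0.0) + 1.0

    def relevance(self, item_topics: list[str]) -> float:
        # Predicted relevance: sum of learned weights over the item's topics.
        return sum(self.topic_weights.get(t, 0.0) for t in item_topics)


def personalize(items: list[dict], model: UserModel, threshold: float = 1.0) -> list[dict]:
    # Keep only items predicted to be relevant; everything below the
    # threshold is filtered out, which is what can isolate the user.
    return [item for item in items if model.relevance(item["topics"]) >= threshold]


model = UserModel()
for clicked_topic in ["oil markets", "oil markets", "investing"]:
    model.record_click(clicked_topic)

items = [
    {"title": "BP investment outlook", "topics": ["oil markets", "investing"]},
    {"title": "Deepwater Horizon spill report", "topics": ["environment"]},
]
print([item["title"] for item in personalize(items, model)])
# -> ['BP investment outlook']  (the environmental story is filtered away)
```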

According to Pariser, Google, for example, uses various "signals" (previous search keywords, location, status updates of contacts on social networking sites, etc.) to tailor search results and the advertisements shown (targeted advertising) to the user. Facebook, by contrast, observes a user's interactions with other users and filters the posts of certain users. User activities (click history) are thus condensed into a single user identity, and certain information is filtered out on the basis of this identity. Until 2011 Facebook used the so-called EdgeRank algorithm, which was subsequently replaced by a far more complex machine-learning system.
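EdgeRank has been publicly described as summing, for each interaction ("edge") attached to a post, the product of a viewer-to-creator affinity score, an edge-type weight and a time-decay factor. The sketch below illustrates that reported scheme; the weights, the decay function and the data layout are illustrative assumptions, not Facebook's actual parameters:

```python
# Illustrative EdgeRank-style scoring (affinity x edge weight x time decay,
# summed over a post's interactions). All numbers and the decay function are
# assumptions for illustration, not Facebook's actual parameters.
import math
import time

EDGE_WEIGHTS = {"comment": 4.0, "share": 6.0, "like": 1.0}  # assumed weights


def time_decay(age_seconds: float, half_life_hours: float = 24.0) -> float:
    # Older interactions count less; exponential decay is one common choice.
    return math.exp(-age_seconds / (half_life_hours * 3600.0))


def edgerank_score(post_edges: list[dict], affinity: dict[str, float]) -> float:
    # Sum affinity * weight * decay over all edges attached to the post.
    now = time.time()
    score = 0.0
    for edge in post_edges:
        u = affinity.get(edge["actor"], 0.1)        # viewer-to-actor affinity
        w = EDGE_WEIGHTS.get(edge["type"], 1.0)     # edge-type weight
        d = time_decay(now - edge["timestamp"])     # recency factor
        score += u * w * d
    return score
```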

Pariser describes his concept of the filter bubble somewhat more formally as "the personal information ecosystem that is created by these algorithms". Other terms used to describe the phenomenon are "ideological frames" and "the figurative sphere surrounding you as you search the Internet".

A search history accumulates over time as an Internet user signals interest in certain topics by clicking links, visiting friends' pages, queueing certain films, reading selected headlines, and so on. Website operators often use tracking services such as Google Analytics to collect and analyze this data. Internet companies then use the information to tailor advertising to the needs and tastes of the specific user, or to place suitable advertisements more prominently in the search results.

Pariser's concerns resemble those expressed by Tim Berners-Lee in 2010 about the Hotel California effect (i.e., you can check in but never leave), which occurs when online social networks block out and lock out content from competing sites in order to keep a larger share of the online community within their own network. The more you enter, the more you become locked into the information within a specific website. The site becomes a closed "concrete bunker", and there is a risk of fragmentation of the World Wide Web, says Berners-Lee.

Facebook users, for example, are in a sense "trapped" there forever. If they decide at some point to leave the network, their user profile is deactivated but not deleted. All personal information and the log of all activities on Facebook remain on Facebook's servers. One can never leave the site completely.

In his book The Filter Bubble, Pariser warns that a potential disadvantage of filtered search is that it "closes us off to new ideas, topics, and important information" and "creates the impression that only those things exist that lie within our narrow self-interest". In his view this is potentially harmful both for individuals and for society. He criticizes Google and Facebook for offering "too much candy and not enough carrots" in terms of content. He warns that we are "exposed to new information only to a limited extent" and that our point of view is narrowed because "invisible algorithms editorially edit the web". Pariser believes the detrimental effects of the filter bubble also harm society as a whole, in that it may "undermine civil discourse" and make people more receptive and susceptible to "propaganda and manipulation".

He writes:

"A world constructed from the known is a world in which there is nothing more to learn ... [because] there is an invisible car propaganda that indoctrinates us with our own ideas ."

- Eli Pariser, The Economist, 2011

Measurement

The diversity of available news and information, which determines the size of a filter bubble, is difficult to capture and quantify because of individual differences in news consumption. It is therefore important to first define which aspect of media diversity is to be examined. According to Webster and Ksiazek (2012), the focus can be on one of three areas:

  1. The media, i.e. the variety of content on offer
  2. Individuals and their personalized selection of information
  3. The structure and diversity of a readership or audience

It must also be clarified which methods are used to examine news diversity. Besides self-reports, data collections are used in particular, because it is assumed that individuals cannot fully survey the complete range of news theoretically available to them. Such data collections therefore contain the theoretically available information in addition to the information actually selected. For further analysis this information must first be grouped into categories, since individual links, images or files are not meaningful on their own. After this classification, conclusions about the existing diversity of information can be drawn by means of semantic analysis. It should be noted that the algorithms that actually cause the personalization are often not available for direct analysis, because the online companies do not publish them.
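As an illustration of the measurement step described above, the following sketch summarizes the diversity of a user's categorized news diet with Shannon entropy; the metric and the category labels are chosen for illustration and are not prescribed by the studies cited here:

```python
# Sketch of a diversity measure over categorized news consumption: Shannon
# entropy is one common diversity index; the categories are hypothetical.
import math
from collections import Counter


def topic_entropy(consumed_topics: list[str]) -> float:
    # 0 bits if everything comes from one category; grows as the consumption
    # spreads more evenly across categories.
    counts = Counter(consumed_topics)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())


narrow_diet = ["politics"] * 9 + ["sports"]
broad_diet = ["politics", "sports", "science", "culture", "economy"] * 2
print(round(topic_entropy(narrow_diet), 2))  # ~0.47
print(round(topic_entropy(broad_diet), 2))   # 2.32
```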

Discussion

There have been conflicting reports about the extent to which personalized filtering is used and whether filtering is beneficial or disadvantageous to the user.

The analyst Jacob Weisberg, who writes for the online magazine Slate, carried out a small, non-representative experiment in 2011 to test Pariser's theory: five people with different political attitudes searched the Internet for exactly the same terms. For four of the queries the results were nearly identical across all five people. From this he concludes that there is no filter bubble effect and that the theory, according to which we are all "fed at the trough of a Daily Me", is exaggerated. For his book review, Paul Boutin undertook a similar experiment with people with different search histories; his result resembled Weisberg's: nearly identical search results. The Harvard computer science and law professor Jonathan Zittrain doubts the amount of distortion that Google's personalized filter produces in search results; he says the impact of personalized search on results is only minor. There are also reports that users can bypass personalized search on Google if they wish, for example by deleting the search history or by other methods. A Google spokesperson said additional algorithms have been built into Google's search engine to "limit personalization and increase the variety of search results". To protect against tracking by website operators in general, users can install browser plug-ins such as Adblock Plus.

In one of the best-known scientific studies on filter bubbles, Bakshy et al. (2015) examined the effects of algorithmic personalization on the variety of information presented to Facebook users. Through a cooperation with Facebook they gained access to a large amount of anonymized data. The results of their investigation show that the selection of information presented to a Facebook user is influenced only to a small extent by algorithms. Instead, social homophily appears to be at work: people, online as well as offline, like to surround themselves with others who resemble them. The result is a social network made up of individuals with similar views, attitudes and convictions. The selective exposure effect, according to which people consciously and unconsciously filter which information they notice and read at all, also seems to have a major influence.

Nonetheless, there are reports that Google and other search engine operators hold large amounts of information that would allow them to personalize the user's "Internet experience" even further in the future if they chose to do so. One report suggests that Google can track a user's previous surfing behavior even if that user has no personal Google account or is not logged into one. Another report states that Google has collected vast amounts of data, spanning ten years, from sources such as Gmail, Google Maps and its other services beyond the search engine itself. This, however, contradicts a report according to which personalizing the Internet for each individual user is a great technical challenge for an Internet company, despite the huge amounts of available web data about the user. CNN analyst Doug Gross argues that filtered search seems more useful to consumers than to citizens: it helps a consumer searching for "pizza" to find local delivery options, appropriately filtering out pizza services that are far away. There are consistent reports that sites such as the Washington Post and the New York Times are striving to create personalized information services, which work on the principle of tailoring content to the user in such a way that they are likely to like it or at least agree with it.

One article looks more closely at the problem of electronic filters. According to it, the user has no influence on the criteria used for filtering. The same applies to the signals Google evaluates during a search: the user learns neither which of these data are used nor how to change them. There is also no transparency at all: the user neither knows how filtering works nor that filtering takes place in the first place. Because of the sheer amount of information on the Internet, however, filter mechanisms are indispensable. Personalization is seen as the main problem of electronic filtering: the weighting of information is adapted to the individual user, who has no option to switch the filters on or off or to control them according to self-chosen criteria. Finally, Pariser demands transparency and user control from the big filter operators such as Google and Facebook. A research group at Delft University of Technology recommends that developers of filter technologies pay more attention to user autonomy and transparency.

Critics consider the filter-bubble thesis to be an assertion made from a false perspective. Given the flood of information, there is no alternative to filtering techniques. Selection of information has always taken place, and it is unavoidable that other information is not selected. Moreover, the Internet in particular makes out-of-the-way discussions easily accessible by opening up digital spaces for them. The theory is also naive, the critics argue, since content is not simply filtered or unfiltered but is processed, enriched or rearranged by many actors in a variety of ways.

Better personalization

Paul Resnick, professor at the University of Michigan, sums up the discussion about the filter bubble as follows: personalization should not be rated as bad per se. In his view, accurate personalization is less of a concern than no personalization or poor personalization. Those who filter have power and therefore a responsibility towards the public. Their duties include, in particular, not performing hidden personalization and not manipulating users unilaterally through personalization.

Resnick suggests the following for better personalization:

  • Multi-dimensional preferences: topic, location, perspective/ideology, intended audience, and more.
  • Optimize the balance between exploring user interests and preferences and their commercial exploitation.
  • Portfolio preferences: enable a mixture of challenging and confirmatory information (a minimal sketch follows this list).
  • Delayed preference indicators: to distinguish between short-term and long-term preferences.
  • Nudges towards long-term preferences: entertainment (short-term interest) vs. education (long-term interest).
  • Common reference point feature: integration of popular topics that the user is otherwise less interested in.
  • Perspective-taking features: to make it easier to understand other people's opinions.
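A minimal sketch of the portfolio idea, with a hypothetical "agreement" score supplied by an upstream personalization model, could look like this:

```python
# Sketch of a portfolio-style feed: a fixed share of slots is reserved for
# items the upstream model predicts the user will disagree with. The
# "agreement" score in [0, 1] is a hypothetical model output.
def portfolio_feed(items: list[dict], slots: int = 10, challenge_share: float = 0.3) -> list[dict]:
    confirming = sorted(
        (i for i in items if i["agreement"] >= 0.5),
        key=lambda i: i["agreement"], reverse=True,
    )
    challenging = sorted(
        (i for i in items if i["agreement"] < 0.5),
        key=lambda i: i["agreement"],
    )
    n_challenge = int(slots * challenge_share)
    # Confirmatory items fill most slots, challenging items fill the rest.
    return confirming[: slots - n_challenge] + challenging[:n_challenge]
```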

Researchers at Delft University of Technology work on ethical questions of personalization and have made the following non-binding proposal on the subject:

Guide to the design of filter algorithms for personalization
  1. Ensure that different identities are possible for each user, and that these can differ depending on the context.
  2. Design [the filter algorithm] for autonomy, so that the user can adapt the filter to their needs and change the identity that was created on the basis of their previous interactions.
  3. Make [the filter algorithm] transparent, so that the user is aware that filtering takes place. The user must be able to see which criteria are used for filtering and which identity the system assumes for the user.
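A sketch of how guidelines 2 and 3 might surface in code, with assumed interfaces and a deliberately simple topic-matching criterion, is shown below:

```python
# Sketch of guidelines 2 and 3 in code: the filter exposes whether and how it
# filters, and the user can overwrite the inferred identity or switch the
# filter off. Interfaces and the topic-matching criterion are assumptions.
from dataclasses import dataclass, field


@dataclass
class TransparentFilter:
    identity: dict[str, float] = field(default_factory=dict)  # user-editable profile
    enabled: bool = True                                       # user-controlled switch

    def explain(self) -> dict:
        # Guideline 3: disclose that filtering happens and on which basis.
        return {
            "filtering_active": self.enabled,
            "criteria": ["topic match against the stored identity"],
            "identity": dict(self.identity),
        }

    def set_identity(self, profile: dict[str, float]) -> None:
        # Guideline 2: the user may replace the inferred identity.
        self.identity = dict(profile)

    def rank(self, items: list[dict]) -> list[dict]:
        if not self.enabled:
            return items  # unfiltered view on request
        return sorted(
            items,
            key=lambda it: sum(self.identity.get(t, 0.0) for t in it["topics"]),
            reverse=True,
        )
```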

Similar concepts

Relevance paradox

The concept of the filter bubble resembles another phenomenon described as the relevance paradox. According to it, people and organizations seek information that seems relevant from the outset but then turns out to be useless or of only partial interest, while information that is considered irrelevant but is actually useful is ignored. The problem arises because the real relevance of a particular fact or concept only becomes apparent once that fact is known; before that, the idea of learning about it is discarded because of a false perception of its irrelevance. As a result, the information seeker is trapped in a paradox and fails to learn the things they really need, becoming a victim of their "intellectual blind spot". The phenomenon of the relevance paradox has appeared in many situations throughout human intellectual development and is therefore an important topic in science and education. The 1984 book The IRG Solution dealt with this problem and suggested general solutions.

Echo chamber effect

A related concept in communication studies is the echo chamber effect (also simply echo chamber), which describes how increased virtual interaction with like-minded people in social networks leads to a narrowing of one's world view and can result in confirmation bias.

Spiral of silence

For the political sphere, Elisabeth Noelle-Neumann formulated the concept of the spiral of silence. It refers to the voluntary withholding of one's own opinion when one believes it contradicts the majority opinion, which pushes minority opinions ever further into the background. The effect is reinforced by the role of the media as gatekeepers, since they can suggest a majority opinion based on their own political views (see also the political attitudes of journalists).

Trivia

In 2016, “filter bubble” was chosen as word of the year in German-speaking Switzerland.

Literature

Web links

Wiktionary: filter bubble  - explanations of meanings, word origins, synonyms, translations

References

  1. Eli Pariser: The Filter Bubble: What the Internet Is Hiding from You. Penguin Press, New York, 2011, ISBN 978-1-59420-300-8.
  2. Judith Moeller, Natali Helberger: Beyond the filter bubble: concepts, myths, evidence and issues for future debates. Report of the University of Amsterdam, June 25, 2018.
  3. Lynn Parramore: The Filter Bubble. In: The Atlantic. Quote: "Google has been personalized for everyone. So when I had two friends this spring Google 'BP,' one of them got a set of links that was about investment opportunities in BP. The other one got information about the oil spill."
  4. Jacob Weisberg: Bubble Trouble: Is Web personalization turning us into solipsistic twits?
  5. Doug Gross: What the Internet is hiding from you. In: CNN. Quote: "I had friends Google BP when the oil spill was happening. These are two women who were quite similar in a lot of ways. One got a lot of results about the environmental consequences of what was happening and the spill. The other one just got investment information and nothing about the spill at all."
  6. Paul Boutin: Your Results May Vary: Will the information superhighway turn into a cul-de-sac because of automated filters? In: The Wall Street Journal. Quote: "By tracking individual Web browsers with cookies, Google has been able to personalize results even for users who don't create a personal Google account or are not logged into one. ..."
  7. Zhang Yuan Cao, Diarmuid Ó Séaghdha, Daniele Quercia, Tamas Jambor: Auralist: Introducing Serendipity into Music Recommendation. (PDF; 645 kB)
  8. Engin Bozdag, Job Timmermans: Values in the filter bubble. Ethics of Personalization Algorithms in Cloud Computing. In: C. Detweiler, A. Pommeranz, J. van den Hoven, H. Nissenbaum (Eds.): Proceedings of the 1st International Workshop on Values in Design. Building Bridges between RE, HCI and Ethics. September 6, 2011, pp. 7-15, accessed September 4, 2011.
  9. Rene Pickhardt: What are the 57 signals google uses to filter search results? (Memento of April 13, 2013 in the Internet Archive). Retrieved September 30, 2011.
  10. Dirk von Gehlen: How Google and Co. withhold other points of view from us. World without a contrary opinion. In: Süddeutsche. June 28, 2011, accessed September 14, 2011.
  11. Jörg Wittkewitz: The Intuition Pump “Filter Bubble”. In: netzpiloten.de , accessed on September 14, 2011.
  12. Peter Sennhauser: "Filter bubble". The new threat of stealth gatekeepers. In: netzwertig.com, accessed on September 14, 2011.
  13. Jeff Widman: EdgeRank. A guide to Facebook's newsfeed algorithm. In: edgerank.net , accessed March 19, 2013.
  14. Roman Tschiedl: The most powerful channel - social to the (algorithmic) Governmentality media with reference to Facebook's news feed. University of Vienna , 2015, p. 102. accessed on May 15, 2015.
  15. Shira Lazar: Algorithms and the Filter Bubble Ruining Your Online Experience? In: Huffington Post. Quote: "a filter bubble is the figurative sphere surrounding you as you search the Internet."
  16. Peter Neugebauer: Protection against tracking services that analyze user behavior. In: knowhow.euro-dom.info, accessed on September 30, 2011.
  17. Bianca Bosker: Tim Berners-Lee: Facebook Threatens Web, Beware. In: The Guardian. Quote: "Social networking sites are threatening the Web's core principles ..." Berners-Lee argued. "Each site is a silo, walled off from the others," he explained. "The more you enter, the more you become locked in ...."
  18. First Monday: What's on tap this month on TV and in movies and books: The Filter Bubble by Eli Pariser. In: USA Today. Quote: "Pariser explains that feeding us only what is familiar and comfortable to us closes us off to new ideas, subjects and important information."
  19. Note: in the English original "candy and carrots", i.e. sweets and carrots as a symbol of a balanced information diet: "The best editing gives us a bit of both. It gives us a little bit of Justin Bieber and a little bit of Afghanistan. ... some information vegetables and ... some information dessert."
  20. Bianca Bosker: Facebook, Google Giving Us Information Junk Food, Eli Pariser Warns. In: Huffington Post. Quote: "When it comes to content, Google and Facebook are offering us too much candy, and not enough carrots."
  21. Eli Pariser: Invisible sieve: Hidden, specially for you. In: The Economist . June 30, 2011; Quote: "Mr Pariser's book provides a survey of the internet's evolution towards personalization, examines how presenting information alters the way in which it is perceived and concludes with prescriptions for bursting the filter bubble that surrounds each user."
  22. Dimitar Nikolov, Diego F. M. Oliveira, Alessandro Flammini, Filippo Menczer: Measuring online social bubbles. In: PeerJ Computer Science. Vol. 1, December 2, 2015, ISSN 2376-5992, p. e38, doi:10.7717/peerj-cs.38 (peerj.com, accessed March 23, 2020).
  23. James G. Webster, Thomas B. Ksiazek: The Dynamics of Audience Fragmentation: Public Attention in an Age of Digital Media. In: Journal of Communication. Vol. 62, No. 1, February 1, 2012, ISSN 0021-9916, pp. 39-56, doi:10.1111/j.1460-2466.2011.01616.x (oup.com, accessed March 23, 2020).
  24. E. Bakshy, S. Messing, L. A. Adamic: Exposure to ideologically diverse news and opinion on Facebook. In: Science. Vol. 348, No. 6239, June 5, 2015, ISSN 0036-8075, pp. 1130-1132, doi:10.1126/science.aaa1160 (sciencemag.org, accessed March 23, 2020).
  25. Pablo Barberá, John T. Jost, Jonathan Nagler, Joshua A. Tucker, Richard Bonneau: Tweeting From Left to Right. In: Psychological Science. Vol. 26, No. 10, August 21, 2015, ISSN 0956-7976, pp. 1531-1542, doi:10.1177/0956797615594620 (accessed March 23, 2020).
  26. Tanja Messingschlager, Peter Holtz: Filter Bubbles and Echo Chambers. In: Markus Appel (Ed.): The psychology of the post-factual: About fake news, "Lügenpresse", Clickbait & Co. Springer, Berlin, Heidelberg 2020, ISBN 978-3-662-58695-2, pp. 91-102, doi:10.1007/978-3-662-58695-2_9 (accessed March 23, 2020).
  27. Eli Pariser: Filter Bubble: How we are incapacitated on the Internet . Hanser, Munich 2012, ISBN 978-3-446-43034-1 .
  28. Note: in the original English text "Daily Me"; the term was coined by Nicholas Negroponte in Being Digital (Alfred A. Knopf, 1995, ISBN 0-679-43919-6) and Cass Sunstein in Republic.com (Princeton University Press, 2002).
  29. Google Personalization on Your Search Results Plus How to Turn it Off (Memento of August 17, 2011 in the Internet Archive); NGNG; Quote: "Google customizing search results is an automatic feature, but you can shut this feature off."
  30. Andrei Boutyline, Robb Willer: The Social Structure of Political Echo Chambers: Variation in Ideological Homophily in Online Networks. In: Political Psychology. Vol. 38, No. 3, 2017, ISSN 1467-9221, pp. 551-569, doi:10.1111/pops.12337 (wiley.com, accessed March 23, 2020).
  31. Christoph Kappes: People, Media and Machines - Why the dangers of the »filter bubble« are overestimated. In: Merkur. 03/2012, text in the author's blog.
  32. Paul Resnick: Personalized Filters Yes: Bubbles No. UMAP slides with notes. In: presnick.people.si.umich.edu, accessed September 9, 2011.
  33. Paul Resnick: Personalized Filters Yes; Bubbles No. In: presnick.livejournal.com, accessed September 9, 2011.
  34. David Andrews: The IRG Solution - Hierarchical Incompetence and how to overcome it. Souvenir Press, London, 1984, ISBN 0-285-62662-0.
  35. Welcome to the Echo Chamber - Political Debates in the Age of the Internet. In: NachDenkSeiten. Quote: "The echo chamber describes the phenomenon that many people in social networks tend to surround themselves with like-minded people and thereby reinforce each other in their own positions. This creates a fatal dynamic within the networks: through the echo chamber, not only consensus-friendly content but also comments spread like wildfire. Whoever best matches the consensus of the group is 'shared' and 'liked' and receives friend requests from other, like-minded circles. The echo chamber grows, and with it the impression of being not a minority but a socially relevant majority." November 5, 2015, accessed November 2, 2016.
  36. Hannah Lühmann: The secret of the echo chamber. In: FAZ . May 6, 2013, accessed November 2, 2016.
  37. Julia Bähr: Who is the majority? In: faz.net , accessed on July 12, 2015.