Content moderation

From Wikipedia, the free encyclopedia

Content moderation refers to the moderation of media content in social networks, message boards, cloud services, content management systems and other information technology systems by the providers of these services. The tasks of a content moderator include deleting and blocking content (the so-called digital garbage dump) as well as filtering, organizing and evaluating information and data. Content moderation is part of content management and often also of online community management. It serves to prevent certain aspects of cybercrime and to remove content whose distribution is prohibited.

Content moderation is to be distinguished from criminal investigative work, for example in the darknet for the purpose of prosecution.

Moderated content

Large Internet service providers such as Facebook, Google, Twitter and YouTube employ so-called commercial content moderators to remove or block questionable content. Content slated for deletion includes depictions of crime, torture, violence, murder and suicide, as well as hate speech, cyberbullying, terrorism, animal cruelty, rape and child abuse. It is estimated that several hundred thousand content moderators worldwide are responsible for deleting content. In Germany, more than 1,000 content moderators work in Berlin and Essen, checking and, if necessary, deleting entries that violate Facebook's standards or the Network Enforcement Act. For the most part, however, corporations outsource this work to developing and emerging countries, where people perform it for low wages.

The work of content moderators is increasingly being replaced by artificial intelligence, or pre-sorted and reduced by automated systems such as upload filters.

Working conditions of the moderators

The documentary The Cleaners depicts the largely precarious working conditions and short-term contracts (temporary work) of moderators in the Philippines, where most content moderators work. The many monotonous tasks can lead to boredom at work (boreout syndrome), while offensive content can cause disgust and psychological stress. Content moderators must also be able to react quickly.

Employees must sign confidentiality agreements and are not allowed to talk about their work. Content moderators work in PR agencies, call centers, in the companies themselves, in dedicated deletion centers, or via crowdsourcing platforms such as Amazon Mechanical Turk. For legal reasons, however, permanent employment is preferred. Specialized companies in this field include TaskUs, MicroSourcing and Arvato.

Occupational health and safety

In 2006, major US IT companies such as Facebook, Microsoft and Twitter formed the Technology Coalition to take action against criminal child pornography and child sexual abuse on the Internet. Among other things, the coalition founded the Crimes Against Children Conference (CACC) and developed occupational health and safety guidelines (the Employee Resilience Guidebook) for employees who are confronted with criminal content in the course of their work.

An employee from San Francisco filed a class-action lawsuit against Facebook because she suffered post-traumatic stress disorder as a result of her work. The suit alleges that Facebook violated occupational safety regulations and that employees are not adequately trained and receive no psychological support; it demands that Facebook set up a fund and cover the costs of their medical treatment.

Internet regulation

Depictions of armed conflicts and acts of violence are often deleted or censored. Some see the privatization of this control by the major Internet platforms as a threat to freedom of expression and democracy.

Corporations are accused of shirking their digital responsibility by outsourcing the work to low-wage workers in developing and emerging countries and of trying to present a sanitized make-believe world. Content moderation can also be misused to spread fake news, for Internet censorship, or for other manipulative purposes.
