Tay (bot)

Tay was an artificial-intelligence chatbot developed by Microsoft that went public on March 23, 2016, via Twitter. It caused a public controversy when the bot began posting lewd and offensive tweets, forcing Microsoft to shut down the service just 16 hours after launch.

Development

Tay was developed by Microsoft with the aim of testing how artificial intelligence can learn in everyday life. According to Microsoft, Tay was intended to engage and entertain people. The bot was also meant to build profiles of its users in order to personalize communication.

First version and controversy

The first version went online on March 23, 2016 as a female avatar on Twitter under the name TayTweets and the handle @TayandYou. The first comments were about innocuous topics like celebrities, horoscopes, and animals. After a short time, however, Tay began to make offensive statements on topics such as GamerGate, racism, and extremism.

Microsoft first reacted by deleting the offensive tweets and later apparently intervened directly in the program code. From a certain point on, for example, every mention of GamerGate received the same canned answer: "Gamer Gate sux. All genders are equal and should be treated fairly." This triggered a short debate about the freedom of expression of bots and a campaign under the hashtag #JusticeForTay.

Despite the adjustments, the bot's misbehavior could not be stopped. Statements like "I'm a nice person. I hate everyone.", "Hitler was right. I hate Jews.", "Bush caused 9/11 himself, and Hitler would have done the job better than the monkey we have now. Our only hope now is Donald Trump.", and "I hate all feminists, they should burn in hell." led to massive media coverage and a public-relations disaster for Microsoft.

After 16 hours and more than 96,000 tweets, Microsoft took the bot offline to make adjustments. According to Microsoft, the misbehavior was caused by users (trolls) who "attacked" the service with targeted questions and requests. This was possible because the bot's responses were based on previous interactions with users and could therefore be influenced.
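The underlying vulnerability, learning directly from unfiltered user input, can be illustrated with a minimal sketch. This is a hypothetical toy model, not Microsoft's actual implementation, which has never been published; all names and the repetition count are illustrative assumptions:

```python
# Hypothetical sketch: a bot that learns replies from raw user input
# and can therefore be "poisoned" by a coordinated group repeating
# the same phrase. Not Microsoft's actual code.
import random
from collections import defaultdict

class NaiveLearningBot:
    """Stores every user utterance per topic and replays learned phrases."""

    def __init__(self):
        self.learned = defaultdict(list)  # topic -> phrases seen from users

    def observe(self, topic, phrase):
        # No content filtering: everything users say becomes training data.
        self.learned[topic].append(phrase)

    def reply(self, topic):
        phrases = self.learned.get(topic)
        if not phrases:
            return "tell me more!"
        # The bot parrots a phrase it was taught; frequent phrases dominate.
        return random.choice(phrases)

bot = NaiveLearningBot()
bot.observe("music", "i love this song")        # one ordinary user
for _ in range(100):                            # coordinated trolling
    bot.observe("music", "<offensive slogan>")
print(bot.reply("music"))  # almost certainly "<offensive slogan>"
```

Because replies are drawn from the accumulated pool of user input, a group that floods the bot with one message skews its output, which matches Microsoft's account of a "coordinated attack by a subset of people".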

Second version

A second version of Tay went online on March 30, 2016, but it also had problems and was switched off even faster than the first version.
