Deepfake
Deepfakes (a portmanteau of the terms "deep learning" and "fake") are media content (photos, audio and video) that appears realistic but has been modified and falsified using artificial intelligence techniques. Although media manipulation is not a new phenomenon, deepfakes use machine learning methods, more precisely artificial neural networks, to generate forgeries largely autonomously.
Development
The first and currently most frequent use of deepfakes is "face swapping". Here, the face of a person in visual material (e.g. videos or photos) is replaced with a generated face of another person in order to show a target person in a different context. The resulting content has great destructive potential, for example as fake pornographic material. Deepfakes, however, go far beyond face swapping and also include the manipulation of auditory content (e.g. "voice swapping") and the transfer of body movements to other people in video material, known as "body puppetry".
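To make the face-swapping idea concrete, below is a deliberately naive sketch that only copies a detected face region from one image onto another with OpenCV. The file names are hypothetical, and a real deepfake replaces this crude copy-and-paste step with a trained neural network (see the DeepFaceLab sketch further down).

```python
# Naive face-region swap: detect a face in each image and paste the source
# face over the target face. This is NOT a deepfake; it only illustrates
# which region a learned model would later synthesize.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(image):
    """Return the bounding box (x, y, w, h) of the first detected face."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        raise ValueError("no face found")
    return faces[0]

source = cv2.imread("source.jpg")   # hypothetical: image providing the face
target = cv2.imread("target.jpg")   # hypothetical: image whose face is replaced

sx, sy, sw, sh = first_face(source)
tx, ty, tw, th = first_face(target)

# Resize the source face to the target face box and overwrite the pixels.
face_patch = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))
target[ty:ty + th, tx:tx + tw] = face_patch
cv2.imwrite("swapped.jpg", target)
```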
Although deepfakes are a relatively new phenomenon (the term was first used in 2017), they have sparked a broad debate about their use and their dangers for politics, business and society. Governments and industry are making efforts to make deepfakes easier to identify, to restrict their use and to make their unauthorized creation a criminal offense.
Application
Art
In art, deepfakes are used as a means of exaggeration, for example to poke fun at pop culture. The artist Joseph Ayerle used deepfakes for the short film Un'emozione per sempre 2.0 to stage a journey through time for the protagonist Ornella Muti from 1978 to 2018. In April 2018, the American comedian Jordan Peele and BuzzFeed published a deepfake video in which Peele has Barack Obama warn of the dangers of manipulated videos. In this video, the "synthetic" ex-president describes his successor Donald Trump as a "complete idiot". A YouTuber appearing under the name Speaking of AI in turn had the actor known as Kylo Ren from Star Wars speak the intro to the original Star Trek TV series.
In June 2019, the Windows and Linux program DeepNude was released, which uses neural networks to remove clothing from pictures of women. A free version and a paid version for 50 USD were available. However, the software was removed by its creators shortly afterwards, and buyers received a refund.
Politics
The MIT Center for Advanced Virtuality produced a complex deepfake in which the technology was used to duplicate both the face and the voice of former US President Richard Nixon. The creators had Nixon read a speech about the death of the Apollo 11 astronauts that had actually been written at the time but was never delivered publicly because the moon landing succeeded.
Deepfakes are used to misrepresent politicians on video portals and imageboards. For example, the face of Argentine President Mauricio Macri has been replaced by that of Adolf Hitler, and that of Angela Merkel by that of Donald Trump.
Privacy
Since 2018, university and industry research has been conducted into using generated faces to protect identity. Once a face has been altered in a natural-looking way, data recorded in public spaces can be used for analysis or machine learning in the same way as original data, but with greater privacy protection and without biometric recognition. Fields of use include the development of autonomous vehicles and smart city cameras. Artificial faces or other synthetic features represent an alternative form of anonymization to the blurring of personal data.
Research
In computer science, mathematics and the arts, machine learning algorithms for generating such media files are an area of research and development.
Deepfakes have generated a great deal of media and public attention. Several journalists have therefore conducted self-experiments to examine how easy or difficult it is to create deepfakes and how they might affect the credibility of video footage.
Pornography
In the fall of 2017, an anonymous Reddit user posted several porn videos on the Internet under the pseudonym "deepfakes". Among other things, they appeared to show the Wonder Woman actress Gal Gadot having sex with her stepbrother, as well as other actresses such as Emma Watson, Katy Perry, Taylor Swift and Scarlett Johansson. The scenes were not real; they had been created with artificial intelligence and were quickly exposed. The Reddit user had used scenes from sex films and feature films featuring the actresses as training data for his neural network. Following a report by the magazine Vice in December 2017, the topic was picked up by other media. In the weeks that followed, other Reddit users in the developer community revised and improved the approach taken by the first developer, making it increasingly difficult to distinguish fake content from real content. In the subreddit "Deepfakes", more than 50,000 users exchanged deepfake videos until it was banned on February 7, 2018. Deepfakes can also be used to produce revenge porn. The porn site Pornhub likewise stated that it would block deepfake porn in the future.
Software
DeepFaceLab
The open source software DeepFaceLab is the most widely used program for creating deepfakes. According to its GitHub page, more than 95% of all deepfakes are created with DeepFaceLab. It provides various tools that enable the user to create a deepfake efficiently. There are versions of DeepFaceLab optimized for different hardware, with the one for Nvidia graphics cards being the most efficient. One tool uses artificial intelligence to extract faces from video files and to locate important landmarks such as the eyes, nose and mouth. A deepfake is then created by breaking the images of the target and source persons down into these parts; an artificial neural network tries to reassemble the image, the result is compared with the original image, and the weights of the connections in the network are adjusted accordingly. Detailed instructions on how to use this software are freely available.
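As a rough illustration of this training loop, here is a minimal sketch in PyTorch of the shared-encoder/two-decoder autoencoder commonly described for face swapping. It is not DeepFaceLab's actual code; the network sizes, the 64×64 crop size, the loss and the random tensors standing in for aligned face crops are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32x32 -> 16x16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # shared latent face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()          # one decoder per person
optimizer = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Stand-ins for batches of aligned 64x64 face crops of person A and person B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):                              # toy training loop
    recon_a = decoder_a(encoder(faces_a))            # each decoder rebuilds "its" person
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    optimizer.zero_grad()
    loss.backward()                                  # compare with originals, adjust weights
    optimizer.step()

# The actual swap: encode a face of person A, decode it with person B's decoder.
with torch.no_grad():
    fake_b = decoder_b(encoder(faces_a))
```

Because both persons pass through the same encoder, it learns a common representation of pose and expression; at inference time that representation is simply rendered with the other person's decoder.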
Problems
Fraud
Voice deepfakes enable scammers to gain people's trust and to make requests that appear to come from a trustworthy person. In 2019, the CEO of a company from the United Kingdom was persuaded in a telephone call to transfer 220,000 euros to a Hungarian bank account; the call used a deepfake of the voice of a person he knew and who had the authority to make such a request.
Creditworthiness and authenticity
Although images have long been easy to falsify, videos are much more difficult to fake, which is why some companies require videos from their customers for verification purposes. Hackers have been able to pass this verification using deepfakes.
Professor Hao Li of the University of Southern California warned that unless the general public becomes familiar with deepfake technology, its misuse for fake news or similar purposes could pose a serious threat.
Others
- At the end of January 2018, an app called FakeApp was launched, with which users can easily swap faces in videos so that someone else appears to be in the video (face swap). An algorithm uses an artificial neural network, the power of the graphics processor and three to four gigabytes of hard disk space to compute the fake video. To work precisely, the program needs a lot of image material of the person to be inserted; the deep learning algorithm uses these video sequences and images to learn which image aspects need to be exchanged (a small data-preparation sketch follows this list). The software is based on Google's TensorFlow AI framework, which has also been used for the DeepDream program, among other things. Celebrities are predominantly affected by such fake sex videos, although private individuals can also be targeted.
- Civil or criminal proceedings are possible even though the content is not genuine (for example for violation of personal rights or sexual insult). The prerequisite, however, is that the perpetrator can be identified. This is not always possible on the Internet because of anonymity and pseudonymity, and because content is passed back and forth between users.
- The AI researcher Alex Champandard considers the traditionally high level of trust in journalism and the media to be one reason for the success of deepfakes. Google engineer Ian Goodfellow describes it as a historical stroke of luck that we "could previously rely on videos as evidence of facts".
- In May 2019, researchers at the Samsung laboratory in Moscow succeeded in developing an AI system that is able to create a fake video from a single image. They created three different videos of a speaking and moving woman from the portrait of the Mona Lisa.
- The website notjordanpeterson.com used deepfake technology to let the Canadian psychologist Jordan Peterson, who gained a degree of fame on the Internet through controversial statements and interviews, appear to say any sentence typed into a text field, with the algorithm delivering relatively realistic-sounding results. The functionality of the website was initially deactivated after the psychologist threatened legal action.
- The website thispersondoesnotexist.com generates faces of people who do not exist; the images are entirely computer-generated.
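As referenced in the FakeApp item above, tools of this kind first have to harvest a large number of face crops from video material before any learning can happen. The following is a small data-preparation sketch of that step (an assumed workflow, not FakeApp's or DeepFaceLab's actual code; the video file name and output directory are hypothetical).

```python
# Harvest face crops from a video so a face-swap model has enough training
# examples of one person. Uses OpenCV's Haar cascade detector for simplicity.
import os
import cv2

os.makedirs("faces_a", exist_ok=True)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

video = cv2.VideoCapture("person_a.mp4")   # hypothetical input video
saved = 0
while True:
    ok, frame = video.read()
    if not ok:                              # end of video reached
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in detector.detectMultiScale(gray, 1.1, 5):
        crop = cv2.resize(frame[y:y + h, x:x + w], (64, 64))
        cv2.imwrite(f"faces_a/{saved:06d}.png", crop)
        saved += 1
video.release()
print(f"saved {saved} face crops")
```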
Web links
- Heise online, January 25, 2018, Fabian A. Scherschel: Deepfakes: Neural networks create fake porn and Hitler parodies
- The Wall Street Journal on YouTube, October 15, 2018: Deepfake Videos Are Getting Real and That's a Problem
- DeepFaceLab on GitHub
References
- AI-Assisted Fake Porn Is Here and We're All Fucked. In: Motherboard. December 11, 2017 (motherboard.vice.com, accessed February 8, 2018).
- Jan Kietzmann, Linda W. Lee, Ian P. McCarthy, Tim C. Kietzmann: Deepfakes: Trick or treat? In: Business Horizons. Vol. 63, No. 2, 2020, ISSN 0007-6813, pp. 135–146, doi:10.1016/j.bushor.2019.11.006 (sciencedirect.com).
- Prepare, Don't Panic: Synthetic Media and Deepfakes. In: WITNESS Media Lab. Retrieved February 4, 2020.
- Abhimanyu Ghoshal: Twitter, Pornhub and other platforms ban AI-generated celebrity porn. February 7, 2018, retrieved February 4, 2020.
- Yvette D. Clarke: H.R. 3230 – 116th Congress (2019–2020): Defending Each and Every Person from False Appearances by Keeping Exploitation Subject to Accountability Act of 2019. June 28, 2019, accessed February 4, 2020.
- Co-Creation Studio at MIT Open Documentary Lab: Can Humans Co-Create with Algorithms? June 11, 2019, accessed January 29, 2020.
- Frankfurter Rundschau: Barack Obama: Fake Obama insults Donald Trump. Accessed April 23, 2018 (in German).
- Is Obama going crazy? This video shows one of the greatest threats to democracies. In: bz Basellandschaftliche Zeitung (basellandschaftlichezeitung.ch, accessed April 30, 2018).
- Speaking of AI: Kylo Ren AI Voice in Star Trek. Retrieved June 18, 2020.
- Samantha Cole: This Horrifying App Undresses a Photo of Any Woman With a Single Click. In: Vice. June 26, 2019, accessed May 17, 2020.
- Joseph Cox: GitHub Removed Open Source Versions of DeepNude. In: Vice. July 9, 2019, accessed May 17, 2020.
- deepnudeapp: pic.twitter.com/8uJKBQTZ0o. In: @deepnudeapp. June 27, 2019, accessed May 17, 2020.
- Deepfake Art Project Reimagines Nixon's Speech Had Apollo 11 Gone Horribly Wrong. Retrieved March 6, 2020.
- When Merkel suddenly bears Trump's face: the dangerous manipulation of images and videos. In: az Aargauer Zeitung. February 3, 2018 (aargauerzeitung.ch, accessed February 8, 2018).
- Patrick Gensing, tagesschau.de: Deepfakes: On the way to an alternative reality? Retrieved February 27, 2018 (in German).
- Xiuye Gu, Weixin Luo, Michael S. Ryoo, Yong Jae Lee: Password-conditioned Anonymization and Deanonymization with Face Identity Transformers. In: Computer Science. November 29, 2019, arXiv:1911.11759.
- Oran Gafni, Lior Wolf, Yaniv Taigman: Live Face De-Identification in Video. In: Computer Science. November 19, 2019, arXiv:1911.08348.
- 5 Areas Where AI Will Turbocharge Privacy Readiness. Accessed June 17, 2020.
- D-ID, the Israeli company that digitally de-identifies faces in videos and still images, raises $13.5 million. In: TechCrunch. Retrieved June 17, 2020.
- Pilot project in Stuttgart: With cameras against Corona: Deutsche Bahn wants to steer passenger flows more consciously. Retrieved June 17, 2020 (in German).
- How I learned to produce my own deepfakes in just two weeks. January 26, 2020, accessed March 6, 2020.
- Deepfakes: Can you still believe your eyes? December 3, 2019, accessed March 6, 2020.
- Markus Böhm: "Deepfakes": Companies take action against fake celebrity porn. In: Spiegel Online. February 7, 2018 (spiegel.de, accessed February 8, 2018).
- barbara.wimmer: Deepfakes: Reddit deletes forum for artificially generated fake porn (futurezone.at, accessed February 8, 2018).
- heise online: Deepfakes: Reddit also bans fake porn. Retrieved February 8, 2018.
- Reddit bans deepfake porn. Retrieved February 8, 2018.
- Samantha Cole: Pornhub Is Banning AI-Generated Fake Porn Videos, Says They're Nonconsensual. In: Vice. February 6, 2018, accessed May 20, 2020.
- iperov: iperov/DeepFaceLab. May 22, 2020, accessed May 22, 2020.
- iperov: iperov/DeepFaceLab. May 17, 2020, accessed May 17, 2020.
- Nick Statt: Thieves are now using AI deepfakes to trick companies into sending them money. September 5, 2019, accessed May 17, 2020.
- Jesse Damiani: A Voice Deepfake Was Used To Scam A CEO Out Of $243,000. Retrieved May 17, 2020.
- When Merkel suddenly bears Trump's face: the dangerous manipulation of pictures and videos. Retrieved May 17, 2020 (in German).
- Perfect Deepfake Tech Could Arrive Sooner Than Expected. Retrieved May 17, 2020.
- Britta Bauchmüller: "Fake-App": With this program anyone can end up in porn, whether they want to or not! In: Berliner-Kurier.de (berliner-kurier.de, accessed February 8, 2018).
- Eike Kühl: Artificial Intelligence: Fake News is followed by Fake Porn. In: Die Zeit. January 26, 2018, ISSN 0044-2070 (zeit.de, accessed February 8, 2018).
- heise online: Deepfakes: Neural networks create fake porn and Hitler parodies. Accessed February 8, 2018 (in German).
- Artificial Intelligence: Selfies are a good source. In: Die Zeit. January 26, 2018, ISSN 0044-2070 (zeit.de, accessed February 8, 2018).
- Fake revenge porn: Deepfakes are becoming a real problem. Retrieved February 8, 2018 (in German).
- Samsung deepfake AI could fabricate a video of you from a single profile pic. cnet.com, May 24, 2019, accessed May 31, 2019.
- Mix: This AI lets you generate eerily realistic Jordan Peterson sound bites. August 16, 2019, retrieved August 29, 2019.
- Jordan Peterson deepfake voice simulator taken offline after Peterson suggests legal action. In: Reclaim The Net. August 22, 2019, retrieved August 29, 2019.