Audio injection

from Wikipedia, the free encyclopedia

Audio Injection ( German  audio introduction ) refers to the exploitation of a vulnerability in digital assistants such as Amazon Echo , Google Home or Apple's SIRI caused by lack of authentication originates from user input. The attacker tries to play the wake-up word and commands in the wizard's recording area in order to trigger actions that he intended, but generally undesirable for the legitimate user.

Action

Most digital assistants require an activation word that activates the complete audio evaluation and command recognition of the assistance system. Commands that are recognized in naturally spoken language after the activation  word are executed. The activation word differs from different manufacturers, but is usually the same for all models of the manufacturer and cannot be individualized.

Examples of activation words:

  • "OK Google" - Google Home
  • "Hey SIRI" - Apple SIRI
  • "Alexa" - Amazon Echo
  • "Hey Cortana" - Microsoft Cortana

At the same time, there is no analysis of the perceived voice or alternative form of user authentication with digital assistants. This means that every user who is in the recording area of ​​the digital assistant can trigger actions with full permissions. Depending on the range of functions of the assistant, this ranges from simple actions such as playing music and switching lights to security-critical and sensitive functions such as unlocking locks, executing transactions with costs and reading out private messages.

Attackers can exploit the vulnerability by either noticing or activating the digital assistant and allowing the desired actions to be carried out. For this it is not absolutely necessary for the attacker to be physically in the usual recording area of ​​the digital assistant.

Activation of freely accessible assistants

If a digital assistant is openly accessible, it is easily possible for any attacker to initiate actions.

Activation by expanding the recording area

Digital assistants that are usually not freely accessible can be activated by expanding the usual recording area. This can be done either by overcoming the usual acoustic barriers (e.g. through an open window or ventilation openings), or by reproducing the command at an unusually loud sound using amplifiers and loudspeakers. This makes it possible to control an assistance system from neighboring rooms or adjacent floors.

Activation via loudspeaker / telephone equipment

In principle, digital assistants react to voice commands without being able to distinguish whether speech is spoken directly or reproduced through a loudspeaker device. For this reason, attacks can be activated by almost any device with audio playback through loudspeakers. On the one hand, these can be radio and television sets, but also hands-free telephones, other digital assistants or systems with text-to-speech playback.

Devices that can be independently activated by external participants or have automatic loudspeaker playback are particularly critical here. So z. B. possible for every caller to activate a digital assistant specifically via an answering machine with automatic listening function.

Examples and notable incidents

  • June 2014: A Microsoft TV commercial accidentally activated the Xboxes of viewers via a broadcast voice command.
  • December 2016: YouTube user Adam Jakowenko demonstrates the possibility of mutual activation of virtual assistants via TTS processing.
  • January 2017: In a report for CW6 San Diego, moderator Jim Patton resolved nationwide orders for an inadvertent order by Amazon's digital assistant Alexa through the indirect speech "I love the little girl, saying 'Alexa ordered me a dollhouse," Dollhouses with spectators.

See also

Individual evidence

  1. 13 Jun 2014 at 5:59 pm, Shaun Nichols tweet_btn (): Yet another reason to skip commercials: Microsoft ad TURNS ON your Xbox One. Retrieved January 9, 2017 .
  2. Chris Smith: Google Home talking to Amazon Echo will entertain you all day long. In: BGR. December 1, 2016, accessed January 9, 2017 .
  3. Alex Leininger CNN: How to keep Alexa from buying a dollhouse without your OK. In: CNN. Retrieved January 9, 2017 .