Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant


by Dan Goodin, Ars Technica
(Image credit: Schonherr et al.)

As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned that their near-constant listening to nearby conversations could pose more risk than benefit to users. New research suggests the privacy threat may be greater than previously thought.

The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences, including lines from Game of Thrones, Modern Family, House of Cards, and news broadcasts, that incorrectly trigger the devices.

"The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans," one of the researchers, Dorothea Kolossa, said. "Therefore, they are more likely to start up once too often rather than not at all."
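The trade-off Kolossa describes can be illustrated with a toy sketch (this is not the researchers' code or any vendor's actual detector): a wake-word engine produces a confidence score for incoming audio, and the device activates when the score clears a threshold. Setting that threshold low makes the assistant "forgiving," so it rarely misses a real wake word, but similar-sounding phrases slip through as false triggers. The `wake_score` function below is a deliberately crude character-overlap stand-in for real acoustic scoring.

```python
# Hypothetical illustration only: a threshold-based wake-word check.
# Real assistants score audio features, not text, but the threshold
# trade-off works the same way.

def wake_score(phrase: str, wake_word: str = "alexa") -> float:
    """Toy similarity score: fraction of positions sharing a character."""
    matches = sum(1 for a, b in zip(phrase, wake_word) if a == b)
    return matches / max(len(phrase), len(wake_word))

def triggers(phrase: str, threshold: float) -> bool:
    """Device 'wakes up' when the score clears the threshold."""
    return wake_score(phrase) >= threshold

# An exact match triggers at any reasonable threshold.
assert triggers("alexa", threshold=0.9)
# A near-miss is rejected by a strict threshold...
assert not triggers("alexia", threshold=0.9)
# ...but accepted by a forgiving one: a false trigger.
assert triggers("alexia", threshold=0.5)
```

Under this framing, the 1,000-plus phrases the researchers found are inputs whose real-world acoustic score lands above the forgiving threshold the manufacturers chose.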

