Uncovered: 1,000 Phrases that Incorrectly Trigger Alexa, Siri, and Google Assistant

by martyb from SoylentNews on (#55CJM)

upstart writes in with an IRC submission:

Uncovered: 1,000 phrases that incorrectly trigger Alexa, Siri, and Google Assistant:

As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned that their near-constant listening to nearby conversations could pose more risk than benefit to users. New research suggests the privacy threat may be greater than previously thought.

The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers that cause the devices to turn on, sometimes sending nearby sounds to Amazon, Apple, Google, or other manufacturers. In all, researchers uncovered more than 1,000 word sequences, including those from Game of Thrones, Modern Family, House of Cards, and news broadcasts, that incorrectly trigger the devices.

"The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans," one of the researchers, Dorothea Kolossa, said. "Therefore, they are more likely to start up once too often rather than not at all."

[...] Examples of words or word sequences that provide false triggers include

  • Alexa: "unacceptable," "election," and "a letter"
  • Google Home: "OK, cool," and "Okay, who is reading"
  • Siri: "a city" and "hey jerry"
  • Microsoft Cortana: "Montana"
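
For a rough intuition of what "forgiving" means here, a minimal illustrative sketch in Python is shown below: it scores how close a heard phrase is to the wake word and accepts anything above a threshold. Real assistants score acoustic features with trained models rather than text similarity, so the wake word, threshold, and scoring method here are all assumptions made purely for illustration, not the vendors' actual detection logic.

```python
# Illustrative sketch only: a lenient wake-word check using text similarity
# as a crude stand-in for the acoustic scoring real devices perform.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"      # hypothetical wake word
THRESHOLD = 0.5          # hypothetical cutoff; lowering it makes detection more "forgiving"

def wake_word_score(heard: str) -> float:
    """Return a 0..1 similarity score between the heard phrase and the wake word."""
    return SequenceMatcher(None, WAKE_WORD, heard.lower()).ratio()

def looks_like_wake_word(heard: str) -> bool:
    """Accept anything scoring above the threshold, trading false triggers for fewer misses."""
    return wake_word_score(heard) >= THRESHOLD

# Print scores for the exact wake word and a few near-miss phrases.
for phrase in ["alexa", "a letter", "election", "unacceptable"]:
    print(f"{phrase!r}: similarity {wake_word_score(phrase):.2f}")
```

The trade-off Kolossa describes corresponds to where the threshold sits: set it low and the device wakes on near-misses like those listed above; set it high and it risks ignoring its owner.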

Original Submission

Read more of this story at SoylentNews.
