Adversarial examples: attack can imperceptibly alter any sound (or silence), embedding speech that only voice-assistants will hear

by Cory Doctorow


Adversarial examples have torn into the robustness of machine-vision systems: it turns out that changing even a single well-placed pixel can confound otherwise reliable classifiers, and with the right tricks they can be made to reliably misclassify one thing as another or fail to notice an object altogether. But even as vision systems were falling to adversarial examples, audio systems remained stubbornly hard to fool, until now.
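To make the core idea concrete, here is a minimal, self-contained sketch (not the attack from the article, and not any particular paper's code) of the gradient-sign trick behind many adversarial examples: nudge each input value a tiny amount in the direction that most changes the model's output, so the perturbation stays imperceptibly small while the prediction swings. The toy logistic "classifier" below is purely illustrative; real audio attacks apply the same principle iteratively against a speech-to-text model's loss so that the recognizer hears a chosen phrase.

<pre>
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=100)     # toy classifier weights (illustrative stand-in for a real model)
x = rng.normal(size=100)     # the "clean" input (think of it as a snippet of audio samples)

def predict(v):
    """Probability the toy model assigns to class 1."""
    return 1 / (1 + np.exp(-w @ v))

# For this toy model the gradient of the class-1 score w.r.t. the input
# points along w, so an FGSM-style step moves every sample by at most
# epsilon in the gradient's sign direction -- a tiny, bounded change.
epsilon = 0.05
x_adv = x + epsilon * np.sign(w)

print(f"clean score:       {predict(x):.3f}")
print(f"adversarial score: {predict(x_adv):.3f}")
print(f"max per-sample change: {np.max(np.abs(x_adv - x)):.3f}")
</pre>

Running it shows the model's score shifting sharply even though no individual sample moves by more than epsilon, which is the same property the audio attacks exploit: a perturbation too small for a human to notice can still steer the machine's output.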
