Researchers Created AI That Hides Your Emotions from Other AI
upstart writes in with a submission, via IRC, for SoyCow2718.
Researchers Created AI That Hides Your Emotions from Other AI
Humans can communicate a range of nonverbal emotions, from terrified shrieks to exasperated groans. Voice inflections and cues can convey subtle feelings, from ecstasy to agony, from arousal to disgust. Even in ordinary speech, the human voice is stuffed with meaning, and with a lot of potential value if you're a company collecting personal data.
Now, researchers at Imperial College London have used AI to mask the emotional cues in users' voices when they're speaking to internet-connected voice assistants. The idea is to put a "layer" between the user and the cloud their data is uploaded to by automatically converting emotional speech into "normal" speech. They recently published their paper, "Emotionless: Privacy-Preserving Speech Analysis for Voice Assistants," on the arXiv preprint server.
Our voices can reveal our confidence and stress levels, physical condition, age, gender, and personal traits. This isn't lost on smart speaker makers, and companies such as Amazon are always working to improve the emotion-detecting abilities of AI.
[...] Their method for masking emotion involves collecting speech, analyzing it, and extracting emotional features from the raw signal. Next, an AI program trains on this signal and replaces the emotional indicators in speech, flattening them. Finally, a voice synthesizer regenerates the normalized speech from the AI's outputs, which is then sent to the cloud. The researchers say this method reduced emotional identification by 96 percent in an experiment, although speech recognition accuracy decreased, with a word error rate of 35 percent.
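As a rough sketch of that collect/extract/flatten/resynthesize pipeline (this is not the authors' code; the "flattening" step below is a crude placeholder for their trained model, and librosa is only assumed to be available for feature extraction and inversion):

```python
# Hypothetical sketch of the pipeline described above -- NOT the paper's
# implementation. The "flattening" is a naive stand-in for the learned
# model; librosa is assumed for MFCC extraction and inversion.
import librosa

def mask_emotion(wav_path, sr=16000, n_mfcc=20, strength=0.5):
    # 1. Collect speech and extract features from the raw signal.
    y, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)

    # 2. Flatten emotion-correlated dynamics. The paper trains a model to
    #    replace emotional indicators; here each coefficient is simply
    #    pulled toward its own mean as a placeholder.
    mean = mfcc.mean(axis=1, keepdims=True)
    flattened = mean + strength * (mfcc - mean)

    # 3. Re-synthesize "normalized" speech, which is what would be sent
    #    to the cloud instead of the original recording.
    y_masked = librosa.feature.inverse.mfcc_to_audio(flattened, sr=sr)
    return y_masked, sr
```

Resynthesizing audio from reduced features is inherently lossy, which is presumably where the reported 35 percent word error rate comes from: the more aggressively emotional cues are stripped, the harder the resulting audio becomes for the speech recognizer.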
[...] Alexa speech engineers unveiled research on using adversarial networks to discern emotion in Amazon's home voice assistants. Emotion, they wrote, "can aid in health monitoring; it can make conversational-AI systems more engaging; and it can provide implicit customer feedback that could help voice agents like Alexa learn from their mistakes."
[...] Amazon's 2017 patent for emotional speech recognition uses illness as an example: "physical conditions such as sore throats and coughs may be determined based at least in part on a voice input from the user, and emotional conditions such as an excited emotional state or a sad emotional state may be determined based at least in part on voice input from a user," the patent says. "A cough or sniffle, or crying, may indicate that the user has a specific physical or emotional abnormality."
A virtual assistant such as Alexa would then use these cues, combined with your browsing and purchase history, to offer you hyper-specific advertisements for medications or other products, according to the patent.
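The flow the patent describes is simple enough to caricature. The sketch below is purely illustrative: every name and the cue-to-product mapping are invented for the example, and it is not based on any actual Alexa component.

```python
# Hypothetical illustration of the ad-targeting flow the patent describes.
# All identifiers and mappings here are invented for the example.

CUE_TO_CATEGORY = {
    "cough": "cold medicine",
    "sore_throat": "throat lozenges",
    "sad": "comfort items",
}

def suggest_ads(detected_cues, browsing_history):
    """Map detected vocal cues to product categories, ranked by prior interest."""
    candidates = {CUE_TO_CATEGORY[c] for c in detected_cues if c in CUE_TO_CATEGORY}
    # The patent combines vocal cues with browsing and purchase history,
    # so categories the user has already looked at are ranked first.
    return sorted(candidates, key=lambda cat: cat not in browsing_history)

print(suggest_ads({"cough", "sad"}, {"cold medicine"}))
# ['cold medicine', 'comfort items']
```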