How and Why Companies Will Engineer Your Emotions

by Richard Johnson, from IEEE Spectrum

This is a guest post. The views expressed here are solely those of the author and do not represent positions of IEEE Spectrum or the IEEE.

As technology has become more physically and psychologically intimate, demand has grown for systems that can infer people's emotional states. The term "affective computing" was coined in 1995 by Professor Rosalind Picard, founder and director of the Affective Computing Research Group at the MIT Media Lab. She recognized the extent to which emotions govern our lives and decided to drive forward the concept of "engineering emotion."

What is affective computing?

Affective computing systems are being developed to recognize, interpret, and process human experiences and emotions. They all rely on extensive human behavioral data, captured by various kinds of hardware and processed by an array of sophisticated machine learning software applications.

AI-based software lies at the heart of each system's ability to interpret and act on users' emotional cues. These systems identify and link nuances in behavioral data with the associated emotion.

The most obvious types of hardware for collecting behavioral data are cameras and other scanning devices that monitor facial expressions, eye movements, gestures, and postures. This data can be processed to identify subtle micro-expressions that a human assessor might struggle to spot consistently.
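
In code, such a pipeline might look something like the sketch below. It uses OpenCV's stock face detector, which is real, while the expression classifier is a hypothetical placeholder standing in for a trained machine-learning model.

    import cv2

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    def classify_expression(face_pixels):
        # Placeholder: a real system would feed the cropped face to a trained
        # model and return a label such as "joy", "surprise", or "contempt".
        return "neutral"

    capture = cv2.VideoCapture(0)        # default webcam
    for _ in range(100):                 # sample a short burst of frames
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
            print("expression:", classify_expression(gray[y:y + h, x:x + w]))
    capture.release()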

What's more, high-end audio equipment records variances and textures in users' voices. Some insurance companies are experimenting with call voice analytics that can detect whether someone is lying to their claims handlers. The team working on IBM's question-answering computer system Watson has developed a "tone analyzer" that uses linguistic analysis to detect three types of tones in text: language style, emotion, and social tendencies.
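
As a toy illustration of the idea, rather than IBM's actual API, the following sketch scores a snippet of text against small, invented word lists for each tone category:

    # Hypothetical mini-lexicons standing in for a trained linguistic model.
    TONE_LEXICON = {
        "emotion":  {"angry", "happy", "sad", "afraid", "delighted"},
        "language": {"therefore", "however", "certainly", "perhaps"},
        "social":   {"we", "together", "please", "thanks"},
    }

    def score_tones(text):
        words = [w.strip(".,!?") for w in text.lower().split()]
        counts = {tone: sum(1 for w in words if w in lexicon)
                  for tone, lexicon in TONE_LEXICON.items()}
        total = sum(counts.values()) or 1
        # Report each tone's share of the matched words.
        return {tone: counts[tone] / total for tone in TONE_LEXICON}

    print(score_tones("We are certainly happy to work together, thanks"))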

Virtual reality gear, such as head-mounted displays, is being developed to create increasingly realistic simulated experiences. The technology enables a game to adapt to the user's emotions, creating a more personal and exciting simulated experience.

How is affective computing used by companies?

Many companies are looking to affective computing to capture mass data about consumer reactions to their advertising campaigns. One of the most notable innovators in the retail space, Realeyes, works with big-name brands such as Coca-Cola, Expedia, Mars, AT&T, and LG, which deploy the technology to help them measure, optimize, and compare the effectiveness of their content.

The Realeyes software measures viewers' emotions and attention levels using webcams. It can show a brand's content to panels of consenting consumers all around the world and measure how audiences respond to a campaign by monitoring their attention levels and logging moments of maximum engagement. Marketers are provided with an overall score based on attention and emotional engagement, which enables them to compare multiple assets or benchmark them against previous campaigns.
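
A rough sketch of that kind of panel-level scoring might look like the code below; the per-viewer readings and the weighting are invented for illustration and do not reflect Realeyes' actual methodology.

    def campaign_score(viewers, attention_weight=0.5):
        # Each viewer is a list of (attention, emotion_intensity) samples in
        # [0, 1], one per frame of the ad shown to that panel member.
        per_viewer = []
        for samples in viewers:
            attention = sum(a for a, _ in samples) / len(samples)
            emotion = sum(e for _, e in samples) / len(samples)
            per_viewer.append(attention_weight * attention +
                              (1 - attention_weight) * emotion)
        return 100 * sum(per_viewer) / len(per_viewer)  # 0-100 benchmark score

    panel = [
        [(0.9, 0.4), (0.8, 0.6), (0.7, 0.7)],   # an engaged viewer
        [(0.5, 0.2), (0.6, 0.3), (0.4, 0.1)],   # a distracted viewer
    ]
    print(round(campaign_score(panel), 1))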

Microsoft's Human Understanding and Empathy team is working on various projects with the aim of building affective computing into the company's products. This includes developing a multimodal emotion-sensing platform that combines computer vision analysis of facial expression and body pose with audio processing that detects speech and sentiment. Together, they enable the system to generate computational models of conversation that better reflect emotions.
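
The general idea of combining modalities can be sketched as a simple late fusion of per-modality emotion estimates; the code below is a generic illustration with invented numbers and weights, not Microsoft's implementation.

    EMOTIONS = ("joy", "anger", "sadness", "neutral")

    def fuse(vision_probs, audio_probs, vision_weight=0.6):
        # Weighted average of the two modality estimates, renormalized so the
        # fused values again sum to one.
        fused = {e: vision_weight * vision_probs[e] +
                    (1 - vision_weight) * audio_probs[e] for e in EMOTIONS}
        total = sum(fused.values())
        return {e: p / total for e, p in fused.items()}

    vision = {"joy": 0.6, "anger": 0.1, "sadness": 0.1, "neutral": 0.2}
    audio  = {"joy": 0.3, "anger": 0.2, "sadness": 0.1, "neutral": 0.4}
    fused = fuse(vision, audio)
    print(max(fused, key=fused.get))  # most likely emotion across both channels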

What are the ethical issues in affective computing?

There's inevitable fear and uncertainty surrounding developments in AI technology, and affective computing is no different. Marketing companies may have trouble gathering large amounts of personal data from audiences, because they'll have to make sure that all participants have consented. That requirement makes it difficult to collect reaction data on the everyday advertising shown to mass audiences.

Yet it's easy to imagine that, one day soon, TV sets will have cameras and microphones that can pick up reactions to shows and commercials, and that those reactions will be monitored by the media industry. This possibility creates huge privacy and data protection issues, and it's the biggest obstacle for companies.

Affective computing pioneer Picard is fervently against the use of affective computing for unethical purposes. In a recent interview, she said that she has lost out on a significant amount of money by refusing to sell products to companies that planned to collect data without people's consent.

Picard's hopes for affective computing lie within its ability to help people communicate better. For instance, affective computing can help people with autism communicate the emotions they struggle to vocalize. Some years back, her research group made a glove that had sensors on the palm to monitor emotional responses. The glove could recognize when the person was experiencing moments of frustration, which could potentially help prevent emotional distress. This same device could be used to monitor stress during a child's school day.

Within the Black Dog Institute, an Australian mental-health clinical services group, and ReachOut, a similar group that aims to help young people, there is considerable interest in harnessing affective computing for a range of social benefits. Slawomir Nasuto, professor of cybernetics at the University of Reading in the UK, believes these technologies could be easily integrated with public sector infrastructures.

Nasuto envisions the introduction of computerized tutoring within schools. The technology could be used to recognize students' mental states, such as stress or attention levels, that signal whether learners are struggling, interested, or bored. On the basis of that input, the system could adjust the difficulty of the problem, the style of explanation, or the pace of delivery to keep students engaged.
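
A minimal sketch of such a feedback loop, with invented thresholds and signal names, might look like this; a real system would take its stress and attention estimates from the affective-sensing pipeline.

    def next_difficulty(current, stress, attention):
        if stress > 0.7:            # struggling: ease off
            return max(1, current - 1)
        if attention < 0.3:         # bored: raise the challenge or change pace
            return current + 1
        return current              # engaged: keep the current level

    difficulty = 3
    for stress, attention in [(0.8, 0.6), (0.2, 0.2), (0.4, 0.7)]:
        difficulty = next_difficulty(difficulty, stress, attention)
        print("serving problems at difficulty", difficulty)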

Nasuto has also explored how a similar system could be deployed in hospitals' intensive-care wards. Patients undergoing operations (particularly after a traumatic event) are highly stressed and face potential cognitive impairment post-operation. He says that "non-pharmacological interventions such as music may help to reduce the levels of anxiety that the patient is suffering. That, in turn, may enable the clinicians to lower the doses of medications that the patient is receiving."

Affective computing has faced, and will continue to face, aversion from those who question the intentions behind its use. However, if used safely and ethically, affective computing may become a significant part of our everyday lives.

As Nasuto says: "Affective computing is a tool, and any tool can be used for either good or nefarious purposes. What is specific here is the pervasiveness of affective computing via online connectivity and the emergence of cheaper and more networked sensing technologies. Together, they will open the way for the collection of unprecedented volumes of data on the human state."

Richard Johnson is a partner and European patent attorney at the IP firm Mewburn Ellis, where he works with the firm's electronics, computing, physics, and engineering patent teams. He has a particular interest in the patentability of software and business-related inventions. Johnson advises clients in the United Kingdom and abroad on the development and management of patent portfolios.
