
A Robot That Can Anticipate When a Human Is Going to Smile and Then Mimic Their Facial Expressions

by Lori Dorn, Laughing Squid

Researcher Yuhang Hu and colleagues working with professor Hod Lipson at the Creative Machines Lab at Columbia University have developed a machine learning robot named Emo with facial co-expression skills: it can make eye contact, mimic a human smile as or even before it happens, and produce a range of other facial expressions.
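To make the anticipation idea concrete, here is a minimal sketch, in Python, of predicting where a facial expression is heading from its first few frames so a robot could start mimicking before the smile fully forms. The synthetic landmark trajectories, window length, prediction horizon, and simple ridge-regression predictor are all illustrative assumptions, not details of the Columbia team's actual model.

```python
# Sketch: anticipate a developing expression from its earliest frames.
# All data and dimensions here are assumed for illustration only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
N_LANDMARKS = 2 * 68   # assumed: x/y coordinates of 68 tracked facial landmarks
WINDOW = 5             # frames of "early" facial movement the model sees
HORIZON = 10           # frames ahead it tries to anticipate

def synthetic_expression(length=60):
    """A smooth random trajectory standing in for a recorded facial expression."""
    start = rng.normal(size=N_LANDMARKS)
    end = rng.normal(size=N_LANDMARKS)
    t = np.linspace(0, 1, length)[:, None]
    return (1 - t) * start + t * end + rng.normal(scale=0.01, size=(length, N_LANDMARKS))

# Build (recent window -> future frame) training pairs from many expressions.
X, y = [], []
for _ in range(200):
    traj = synthetic_expression()
    for t in range(WINDOW, len(traj) - HORIZON):
        X.append(traj[t - WINDOW:t].ravel())
        y.append(traj[t + HORIZON])
predictor = Ridge().fit(np.array(X), np.array(y))

# At run time: watch a few frames, then predict the expression before it peaks.
live = synthetic_expression()
anticipated = predictor.predict(live[0:WINDOW].ravel().reshape(1, -1))
print("anticipated landmark frame shape:", anticipated.shape)
```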

Emo is a human-like robotic head equipped with 26 actuators that enable a broad range of nuanced facial expressions. The head is covered with a soft silicone skin attached by a magnetic system, allowing for easy customization and quick maintenance. For more lifelike interactions, the researchers integrated a high-resolution camera within the pupil of each eye, enabling Emo to make the eye contact that is crucial for nonverbal communication.

Emo Was Trained to Perfect These Gestures With a Mirror

To train the robot to make facial expressions, the researchers placed Emo in front of a camera and let it make random movements. After a few hours, the robot had learned the relationship between its facial expressions and the motor commands that produce them, much the way humans practice facial expressions by looking in a mirror. This is what the team calls "self-modeling."
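The passage above describes the self-modeling loop: babble random motor commands, watch the resulting face, then learn which commands produce which appearance. Below is a minimal Python sketch of that idea under stated assumptions; the simulated linear "face", the landmark dimensions, and the small neural-network inverse model are stand-ins for illustration, not the researchers' implementation.

```python
# Sketch of self-modeling: random motor babbling, camera observation,
# then fitting an inverse model from facial appearance back to commands.
# The physical face is simulated by a fixed random linear map (assumption).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
N_ACTUATORS = 26          # Emo's reported actuator count
N_LANDMARKS = 2 * 68      # assumed: x/y coordinates of 68 facial landmarks

true_forward = rng.normal(size=(N_ACTUATORS, N_LANDMARKS))

def observe_landmarks(commands: np.ndarray) -> np.ndarray:
    """Simulated camera observation of the face produced by a command vector."""
    return commands @ true_forward + rng.normal(scale=0.01, size=N_LANDMARKS)

# 1. Motor babbling: a few thousand random expressions.
commands = rng.uniform(-1.0, 1.0, size=(5000, N_ACTUATORS))
landmarks = np.vstack([observe_landmarks(c) for c in commands])

# 2. Fit the inverse model: which commands produce a given facial appearance?
inverse_model = MLPRegressor(hidden_layer_sizes=(128,), max_iter=500)
inverse_model.fit(landmarks, commands)

# 3. Mimicry: given landmarks of a target expression, recover motor commands.
target = observe_landmarks(rng.uniform(-1.0, 1.0, size=N_ACTUATORS))
predicted_commands = inverse_model.predict(target.reshape(1, -1))
print("predicted motor commands:", predicted_commands.round(2)[:, :5], "...")
```

Once such an inverse model exists, mimicking a human expression reduces to mapping the observed human face onto the robot's own landmark space and querying the model for the matching commands.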

Here's More Information About the Project