Using Ears, Not Just Eyes, Improves Robot Perception
upstart writes in with an IRC submission:
People rarely use just one sense to understand the world, but robots usually rely only on vision and, increasingly, touch. Carnegie Mellon University researchers find that robot perception could improve markedly by adding another sense: hearing.
In what they say is the first large-scale study of the interactions between sound and robotic action, researchers at CMU's Robotics Institute found that sound could help a robot differentiate between objects, such as a metal screwdriver and a metal wrench. Hearing could also help robots determine what type of action caused a sound, and help them use sounds to predict the physical properties of new objects.
[...] To perform their study, the researchers created a large dataset, simultaneously recording video and audio of 60 common objects -- such as toy blocks, hand tools, shoes, apples and tennis balls -- as they slid or rolled around a tray and crashed into its sides. They have since released this dataset, cataloging 15,000 interactions, for use by other researchers.
The team captured these interactions using an experimental apparatus they called Tilt-Bot -- a square tray attached to the arm of a Sawyer robot. It was an efficient way to build a large dataset; they could place an object in the tray and let Sawyer spend a few hours moving the tray in random directions with varying levels of tilt as cameras and microphones recorded each action.
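The story doesn't describe the researchers' model, but the core idea of telling objects apart by their impact sounds can be sketched with a toy example: summarize each audio clip by its energy in a few frequency bands, then classify a new clip by its nearest match. Everything below (the synthetic "metal" and "wood" clips, the band features, the nearest-centroid rule) is an illustrative assumption, not the CMU method or dataset.

```python
import numpy as np

def spectral_features(clip, sr=16000, n_bands=8):
    """Normalized energy in n_bands log-spaced frequency bands of a mono clip."""
    spectrum = np.abs(np.fft.rfft(clip)) ** 2
    freqs = np.fft.rfftfreq(len(clip), d=1.0 / sr)
    edges = np.logspace(np.log10(50), np.log10(sr / 2), n_bands + 1)
    feats = np.array([spectrum[(freqs >= lo) & (freqs < hi)].mean()
                      for lo, hi in zip(edges[:-1], edges[1:])])
    return feats / feats.sum()  # normalize so overall loudness doesn't dominate

def impact(freq, sr=16000, dur=0.2):
    """Synthetic impact sound: a damped sine at the object's ringing frequency."""
    t = np.arange(int(sr * dur)) / sr
    return np.sin(2 * np.pi * freq * t) * np.exp(-30 * t)

# Toy training data: a metallic object rings higher than a wooden one.
train = {"metal": impact(3000), "wood": impact(400)}
centroids = {label: spectral_features(clip) for label, clip in train.items()}

# Classify a noisy new clip by nearest feature centroid.
rng = np.random.default_rng(0)
query = impact(2900) + 0.05 * rng.standard_normal(3200)
feats = spectral_features(query)
label = min(centroids, key=lambda k: np.linalg.norm(feats - centroids[k]))
print(label)  # → metal
```

A real system would learn features from thousands of recorded interactions (as the released 15,000-interaction dataset enables) rather than hand-picking frequency bands, but the pipeline shape is the same: audio in, compact features, object label out.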
YouTube video: Tilt-Bot in Action
Read more of this story at SoylentNews.