Enabling humanoid robot movement with imitation learning and mimicking of animal behaviors

Rish Joshi, Contributor: Rish is an entrepreneur and investor. Previously, he was a VC at Gradient Ventures (Google's AI fund), co-founded a fintech startup building an analytics platform for SEC filings, and worked on deep-learning research as a graduate student in computer science at MIT.

In the two decades since Honda released its ASIMO robot in 2000, humanoid robots have greatly improved at functions like grasping objects and using computer vision to detect and recognize their surroundings. Despite these improvements, their ability to walk, jump and perform other complex legged motions as fluidly as humans remains a challenge for roboticists.

In recent years, advances in robot learning and design have drawn on data and insights from animal behavior to enable legged robots to move in much more human-like ways.

Researchers from Google and UC Berkeley published work earlier this year that showed a robot learning how to walk by mimicking a dog's movements using a technique called imitation learning. Separate work showed a robot successfully learning to walk by itself through trial and error using deep reinforcement learning algorithms.
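For a sense of what "trial and error" means concretely, the sketch below is a rough, illustrative policy-gradient (REINFORCE) loop on a made-up one-dimensional balancing task. It is not the researchers' actual setup; the environment, reward and network sizes are invented for the example.

```python
# Illustrative only: a generic policy-gradient (REINFORCE) loop on a toy
# 1-D "stay upright" task. The environment, rewards and network sizes are
# made up for this sketch and are not from the cited papers.
import torch
import torch.nn as nn

class ToyBalanceEnv:
    """Agent nudges a tilt angle left or right; reward is highest near upright."""
    def reset(self):
        self.angle = torch.empty(1).uniform_(-1.0, 1.0)
        return self.angle.clone()

    def step(self, action):                          # action: 0 = push left, 1 = push right
        self.angle += 0.1 * (1 if action == 1 else -1)
        self.angle += 0.02 * torch.randn(1)          # random disturbance
        reward = float(1.0 - self.angle.abs())       # best when upright
        done = bool(self.angle.abs() > 1.5)          # fell over
        return self.angle.clone(), reward, done

policy = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(policy.parameters(), lr=1e-2)
env = ToyBalanceEnv()

for episode in range(200):
    obs, log_probs, rewards = env.reset(), [], []
    for _ in range(50):                              # trial...
        dist = torch.distributions.Categorical(logits=policy(obs))
        action = dist.sample()
        log_probs.append(dist.log_prob(action))
        obs, reward, done = env.step(action.item())
        rewards.append(reward)
        if done:
            break
    returns = torch.tensor(
        [sum(rewards[i:]) for i in range(len(rewards))])   # reward-to-go
    loss = -(torch.stack(log_probs) * returns).mean()      # ...and error
    opt.zero_grad()
    loss.backward()
    opt.step()
```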

Imitation learning in particular has been used in robotics for various tasks, such as OpenAI's work on teaching a robot to grasp objects by imitation, but its application to robot locomotion is new and encouraging. It lets a robot take input data generated by an expert demonstrating the actions to be learned and combine it with deep learning techniques to learn those movements more effectively.
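In its simplest form, this kind of imitation learning can be framed as behavioral cloning: a neural network is trained with ordinary supervised learning to map the robot's state to the action an expert took in that state. The sketch below is a hypothetical, minimal version in PyTorch; the random tensors stand in for real expert demonstrations (such as retargeted motion-capture clips), and the state and action dimensions are assumptions, not details from the cited work.

```python
# Minimal behavioral-cloning sketch: supervised learning that maps robot
# state -> expert action. Random tensors stand in for real expert
# demonstrations (e.g. retargeted dog motion capture); dimensions are made up.
import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 24, 12          # hypothetical joint states / joint targets

# Placeholder "expert" dataset: in practice these would come from
# motion-capture clips retargeted onto the robot's morphology.
expert_states = torch.randn(1000, STATE_DIM)
expert_actions = torch.randn(1000, ACTION_DIM)

policy = nn.Sequential(
    nn.Linear(STATE_DIM, 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, ACTION_DIM),
)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    pred = policy(expert_states)                 # imitate the expert...
    loss = loss_fn(pred, expert_actions)         # ...by matching its actions
    opt.zero_grad()
    loss.backward()
    opt.step()

# The trained policy can then propose actions for states it hasn't seen:
with torch.no_grad():
    action = policy(torch.randn(STATE_DIM))
```

In published motion-imitation work, a supervised objective like this is typically combined with reinforcement learning in physics simulation, so the policy learns to track the reference motion while respecting the robot's dynamics rather than copying it blindly.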

Much of the recent work using imitation learning and broader deep learning techniques has involved small-scale robots, and many challenges remain before the same capabilities can be applied to life-size machines. Even so, these advances open new pathways for improving robot locomotion.

The inspiration from animal behaviors has also extended to robot design, with companies such as Agility Robotics and Boston Dynamics incorporating force-modeling techniques and full-body sensors to help their robots more closely mimic how animals execute complex movements.
