Video Friday: Child Robot Affetto Learning New Facial Expressions
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
HRI 2020 - March 23-26, 2020 - Cambridge, U.K.
ICARSC 2020 - April 15-17, 2020 - Ponta Delgada, Azores
ICRA 2020 - May 31-June 4, 2020 - Paris, France
ICUAS 2020 - June 9-12, 2020 - Athens, Greece
CLAWAR 2020 - August 24-26, 2020 - Moscow, Russia

Let us know if you have suggestions for next week, and enjoy today's videos.
We'll have more on the DARPA Subterranean Challenge Urban Circuit next week, but here's a quick compilation from DARPA of some of the competition footage.
[ SubT ]
ABB set up a global competition in 2019 to assess how 20 leading AI technology start-ups could approach solutions for 26 real-world picking, packing, and sorting challenges. The aim was to understand whether AI is mature enough to fully unlock the potential of robotics and automation. ABB was also searching for a technology partner with which to co-develop robust AI solutions. Covariant won the challenge by successfully completing all 26 challenges; on February 25, ABB and Covariant announced a partnership to bring AI-enabled robotic solutions to market.
We wrote about Covariant and its AI-based robot picking system last month. The most interesting part of the video above is probably the apple picking, where the system has to deal with irregular, shiny, rolling objects. The robot has a hard time picking upside-down apples, and after several failures in a row, it nudges the last one to make it easier to pick up. Impressive! And here's one more video of real-time picking of mostly transparent water bottles:
[ Covariant ]
Osaka University's Affetto robot, which we've written about before, is looking somewhat more realistic than when we first wrote about it.
Those are some weird noises that it's making though, right? Affetto, as it turns out, also doesn't like getting poked in its (disembodied) tactile sensor:
They're working on a body for it, too:
[ Osaka University ]
University of Washington students reimagine today's libraries.
[ UW ]
Thanks Elcee!
Astrobee will be getting a hand up on the ISS, from Columbia's ROAM Lab.
I think this will be Astrobee's second hand, in addition to its perching arm. Maybe not designed for bimanual tasks, but still, pretty cool!
[ ROAM Lab ]
In this paper, we tackle the problem of pushing piles of small objects into a desired target set using visual feedback. Unlike conventional single-object manipulation pipelines, which estimate the state of the system parametrized by pose, the underlying physical state of this system is difficult to observe from images. Thus, we take the approach of reasoning directly in the space of images, and acquire the dynamics of visual measurements in order to synthesize a visual-feedback policy.
[ MIT ]
In this project we are exploring ways of interacting with terrain using hardware already present on exploration rovers - wheels! By using wheels for manipulation, we can expand the capabilities of space robots without adding hardware. Nonprehensile terrain manipulation can be used in many applications, such as removing soil to sample below the surface or making terrain easier to cross for another robot. Watch until the end to see MiniRHex and the rover working together!
Dundee Precious Metals reveals how Exyn's fully autonomous aerial drones are transforming their cavity monitoring systems with increased safety and maximum efficiency.
[ Exyn ]
Thanks Rachel!
Dragonfly is a NASA mission to explore the chemistry and habitability of Saturn's largest moon, Titan. The fourth mission in the New Frontiers line, Dragonfly will send an autonomously-operated rotorcraft to visit dozens of sites on Titan, investigating the moon's surface and shallow subsurface for organic molecules and possible biosignatures.
Dragonfly is scheduled to launch in 2026 and arrive at Titan in 2034.
[ NASA ]
Researchers at the Max Planck Institute for Intelligent Systems in Stuttgart, in cooperation with Tampere University in Finland, developed a gel-like robot inspired by sea slugs and snails that they are able to steer with light. Much like the soft body of these aquatic invertebrates, the bioinspired robot is able to deform easily inside water when exposed to this energy source.
Due to specifically aligned molecules of the liquid crystal gels - its building material - and illumination of specific parts of the robot, it is able to crawl, walk, jump, and swim inside water. The scientists see their research project as an inspiration for other roboticists who struggle to design untethered soft robots that are able to move freely in a fluidic environment.
Forests are a very challenging environment for drones, especially if you want to both avoid and map trees at the same time.
[ Kumar Lab ]
Some highlights from the Mohamed Bin Zayed International Robotics Challenge (MBZIRC) that took place in Abu Dhabi, UAE last week.
[ MBZIRC ]
I never get tired of hearing technical presentations from Skydio, and here's Ryan Kennedy giving a talk at the GRASP Lab.
The technology for intelligent and trustworthy navigation of autonomous UAVs has reached an inflection point to provide transformative gains in capability, efficiency, and safety to major industries. Drones are starting to save lives of first responders, automate dangerous infrastructure inspection, digitize the physical world with millimeter precision, and capture Hollywood quality video - all on affordable consumer hardware.
At Skydio, we have invested five years of R&D in the ability to handle difficult unknown scenarios in real-time based on visual sensing, and shipped two generations of fully autonomous drones. In this talk, I will discuss the close collaboration of geometry, learning, and modeling within our system, our experience putting robots into production, and the challenges still ahead.
[ Skydio ]
This week's CMU RI Seminar comes from Sarjoun Skaff at Bossa Nova Robotics: "Yes, That's a Robot in Your Grocery Store. Now what?"
Retail stores are becoming ground zero for indoor robotics. Fleets of different robots have to coexist with each other and with humans every day, navigating safely, coordinating missions, and interacting appropriately with people, all at large scale. For us roboticists, stores are giant labs where we're learning what doesn't work and iterating. If we get it right, it will serve as an example for other industries, and robots will finally become ubiquitous in our lives.
[ CMU RI ]