Video Friday: TurtleBot 4
Video Friday is your weekly selection of awesome robotics videos, collected by your friends at IEEE Spectrum robotics. We'll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):
Let us know if you have suggestions for next week, and enjoy today's videos.
We'll have more details on this next week, but there's a new TurtleBot, hooray!
Brought to you by iRobot (providing the base in the form of the new Create 3), Clearpath, and Open Robotics.
[ Clearpath ]
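Like every TurtleBot before it, the new one is built around ROS, and the Create 3 base follows the usual ROS 2 conventions for mobile robots. As a minimal sketch of what driving one looks like, here's a hypothetical rclpy node that publishes velocity commands on /cmd_vel, the standard Twist topic most ROS 2 bases (the Create 3 included) listen on; exact topic names on the final TurtleBot 4 may differ.

```python
# Minimal ROS 2 sketch: drive a TurtleBot-style base by publishing
# velocity commands. Assumes the standard /cmd_vel Twist convention;
# this is illustrative, not official TurtleBot 4 code.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist


class SlowSpin(Node):
    def __init__(self):
        super().__init__('slow_spin')
        self.pub = self.create_publisher(Twist, 'cmd_vel', 10)
        self.timer = self.create_timer(0.1, self.tick)  # publish at 10 Hz

    def tick(self):
        msg = Twist()
        msg.linear.x = 0.1   # m/s, gentle forward motion
        msg.angular.z = 0.3  # rad/s, slow left turn
        self.pub.publish(msg)


def main():
    rclpy.init()
    node = SlowSpin()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```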
Cognitive Pilot's autonomous tech is now being integrated into production Kirovets K-7M tractors, and they've got big plans: "The third phase of the project envisages a fully self-driving tractor control mode without the need for human involvement. It includes group autonomous operation with a 'leader', the movement of a group of self-driving tractors on non-public roads, the autonomous movement of a robo-tractor paired with a combine harvester not equipped with an autonomous control system, and the use of an expanded set of farm implements with automated control and functionality to monitor their condition during operation."
[ Cognitive Pilot ]
Thanks, Andrey!
Since the start of the year, Opteran has been working incredibly hard to deliver against our technology milestones, and we're delighted to share the first video of our technology in action. In the video you can see Hopper, our robot dog (named after Grace Hopper, a pioneer of computer programming), moving around a course using components of Opteran Natural Intelligence, [rather than] a trained deep learning neural net. Our small development kit (housing an FPGA), sitting on top of the robot dog, guides Hopper, using Opteran See to provide 360 degrees of stabilised vision and Opteran Sense to sense objects and avoid collisions.
[ Opteran ]
If you weren't paying any attention to the DARPA SubT Challenge and are now afraid to ask about it, here are two recap videos from DARPA.
[ DARPA SubT ]
A new control system, designed by researchers in MIT's Improbable AI Lab and demonstrated using MIT's robotic mini cheetah, enables four-legged robots to traverse uneven terrain in real time.
[ MIT ]
Using a mix of 3D-printed plastic and metal parts, a full-scale replica of NASA's Volatiles Investigating Polar Exploration Rover, or VIPER, was built inside a clean room at NASA's Johnson Space Center in Houston. The activity served as a dress rehearsal for the flight version, which is scheduled for assembly in the summer of 2022.
[ NASA ]
What if you could have 100x more information about your industrial sites? Agile mobile robots like Spot bring sensors to your assets in order to collect data and generate critical insights on asset health so you can optimize performance. Dynamic sensing unlocks flexible and reliable data capture for improved site awareness, safety, and efficiency.
[ Boston Dynamics ]
Fish in Washington are getting some help navigating through culverts under roads, thanks to a robot developed by University of Washington students Greg Joyce and Qishi Zhou. "HydroCUB is designed to operate from a distance through a 300-foot-long cable that supplies power to the rover and transmits video back to the operator. The goal is for the Washington State Department of Transportation, which proposed the idea, to use the tool to look for vegetation, cracks, debris, and other potential 'fish barriers' in culverts."
[ UW ]
Thanks, Sarah!
NASA's Perseverance Mars rover carries two microphones that are recording sounds on the Red Planet, including the Ingenuity helicopter and the rover itself at work. For the very first time, these audio recordings offer a new way to experience the planet. Earth and Mars have different atmospheres, which affects the way sound is heard. Justin Maki, a scientist at NASA's Jet Propulsion Laboratory, and Nina Lanza, a scientist at Los Alamos National Laboratory, explain some of the notable audio recorded on Mars in this video.
[ JPL ]
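The atmospheric difference is quantifiable: to first order, the speed of sound in a gas depends only on its composition and temperature, c = sqrt(γRT/M). Here's a rough back-of-the-envelope sketch of why Mars sounds different; the gas properties below are generic textbook values, not figures from the video.

```python
# Back-of-the-envelope: ideal-gas speed of sound, c = sqrt(gamma * R * T / M).
# Values are approximate textbook figures, not taken from NASA's recordings.
from math import sqrt

R = 8.314  # universal gas constant, J/(mol*K)

def speed_of_sound(gamma, molar_mass_kg, temp_k):
    """Ideal-gas speed of sound in m/s."""
    return sqrt(gamma * R * temp_k / molar_mass_kg)

# Earth: mostly N2/O2, gamma ~1.40, M ~0.0290 kg/mol, ~288 K at the surface
earth = speed_of_sound(1.40, 0.0290, 288)

# Mars: mostly CO2, gamma ~1.29, M ~0.0440 kg/mol, ~210 K on average
mars = speed_of_sound(1.29, 0.0440, 210)

print(f"Earth: ~{earth:.0f} m/s, Mars: ~{mars:.0f} m/s")
# Earth: ~340 m/s, Mars: ~226 m/s -- sound travels more slowly on Mars,
# and the thin CO2 atmosphere also attenuates high frequencies more.
```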
A new kind of fiber developed by researchers at MIT and in Sweden can be made into cloth that senses how much it is being stretched or compressed, and then provides immediate tactile feedback in the form of pressure or vibration. Such fabrics, the team suggests, could be used in garments that help train singers or athletes to better control their breathing, or that help patients recovering from disease or surgery regain their normal breathing patterns.
[ MIT ]
Partnering with Epitomical, Extend Robotics has developed a mobile manipulator and a perception system that let anyone operate the robot intuitively through a VR interface over a wireless network.
[ Extend Robotics ]
Here are a couple of videos from Matei Ciocarlie at the Columbia University ROAM lab talking about embodied intelligence for manipulation.
[ ROAM Lab ]
The AirLab at CMU has been hosting an excellent series on SLAM. You should subscribe to their YouTube channel, but here are a couple of their more recent talks.
Robots as Companions invites Sougwen Chung and Madeline Gannon, two artists and researchers whose practices not only involve various types of robots but actually include them as collaborators and companions, to join Maria Yablonina (Daniels Faculty) in conversation. Through their work, they challenge the notion of a robot as an obedient task execution device, questioning the ethos of robot arms as tools of industrial production and automation, and ask us to consider robots as equal participants in the creative process.
[ UofT ]
These two talks come from the IEEE RAS Seasonal School on Rehabilitation and Assistive Technologies based on Soft Robotics.
[ SofTech-Rehab ]