
With New Roomba j7, iRobot Wants to Understand Our Homes

by Evan Ackerman, from IEEE Spectrum

The power that computer vision has gained over the last decade or so has been astonishing. Thanks to machine learning techniques applied to large datasets of images and video, it's now much easier for robots to recognize (if not exactly understand) the world around them, and take intelligent (or at least significantly less unintelligent) actions based on what they see. This has empowered sophisticated autonomy in cars, but we haven't yet seen it applied to home robots, mostly because there aren't a lot of home robots around. Except, of course, robot vacuums.

Today, iRobot is announcing the j7, which the company calls its "most thoughtful robot vacuum." It earns that name because, in a first for Roombas, the j7 has a front-facing visible-light camera along with the hardware and software necessary to identify common floor-level obstacles and react to them intelligently. This enables some useful new capabilities for the j7 in the short term, but it's the long-term potential of a camera-equipped, in-home machine-learning platform that we find really intriguing. If, that is, iRobot can manage to make its robots smarter while keeping our data private.

Here's the new iRobot j7. Note that the j7+ is the version with the automatic dirt dock; when we're talking about the robot itself, it's just the j7.

Video: Roomba® j7 Robot Vacuum Product Overview (www.youtube.com)

Obviously, the big news here on the hardware side is the camera, and we're definitely going to talk about that, especially since it enables software features that are unique to the j7. But iRobot is also releasing a major (and free) software update for all Roombas, called Genius 3.0. A year ago, we spoke with iRobot about its shift from autonomy to human-robot collaboration in home robot interaction, and Genius 3.0 adds some useful features based on this philosophy, including:

  • Clean While I'm Away: With your permission, the iRobot app will use your phone's location services to start cleaning when you leave the house and pause cleaning when you return (sketched in code after this list).
  • Cleaning Time Estimates: Roombas with mapping capability will now estimate how long a job will take them.
  • Quiet Drive: If you ask a Roomba to clean a specific area not adjacent to its dock, it will turn off its vacuum motor on the way there and the way back so as not to bother you more than it has to. For what it's worth, this has been the default behavior for Neato robots for years.
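
To make the geofencing behavior concrete, here's a minimal sketch of how a phone-side "Clean While I'm Away" trigger could work. To be clear, none of these names (the robot object, its start_cleaning and pause_cleaning methods, the coordinates, the trigger radius) come from iRobot's actual app or API; they're assumptions for illustration only.

```python
import math

# Assumed home location and trigger radius; not iRobot's real parameters.
HOME_LAT, HOME_LON = 42.36, -71.06
GEOFENCE_RADIUS_M = 150

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance between two lat/lon points, in meters."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_location_update(lat, lon, robot):
    """Start cleaning when the phone leaves the geofence; pause on return.

    `robot` is a hypothetical handle with `state`, `start_cleaning()`,
    and `pause_cleaning()`; it stands in for whatever the app really uses.
    """
    away = distance_m(lat, lon, HOME_LAT, HOME_LON) > GEOFENCE_RADIUS_M
    if away and robot.state == "idle":
        robot.start_cleaning()
    elif not away and robot.state == "cleaning":
        robot.pause_cleaning()
```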

Broadly, this is part of iRobot's push to get people away from using the physical "Clean" button to just tackle every room at once, and to instead have a robot clean more frequently and in more targeted ways, like by vacuuming specific rooms at specific times that make sense within your schedule. This is a complicated thing to try to do, because every human is different, and that means that every home operates differently, leading to the kind of uncertainty that robots tend not to be great at.

"The operating system for the home already exists," iRobot CEO Colin Angle tells us. "It's completely organic, and humans live it every day." Angle is talking about the spoken and unspoken rules that you have in your home. Some of them might be obvious, like whether you wear shoes indoors. Some might be a little less obvious, like which doors tend to stay open and which ones are usually closed, or which lights are on or off and when. Some rules we're acutely aware of, and some are more like established habits that we don't want to change. "Robots, and technology in general, didn't have enough context to follow rules in the home," Angle says. "But that's no longer true, because we know where rooms are, we know what kind of day it is, and we know a lot about what's going on in the home. So, we should take this on, and start building technology that follows house rules."

The reason it's important for home robots to learn and follow rules like these is that otherwise they're annoying, and iRobot has data to back this up: "The most lethal thing to a Roomba is a human being annoyed by its noise," Angle tells us. In other words, the most common reason Roombas don't complete jobs is that a human cancels them partway through. iRobot, obviously, would prefer that its robots not annoy you, and Genius 3.0 is trying to make that happen by finding ways for cleaning to happen in a rule-respecting manner.

"Alignment of expectation is incredibly important-if the robot doesn't do what you expect, you're going to be upset, the robot's going to take the abuse, and we really want to protect the mental well-being of our robots." -Colin Angle

Of course, very few people want to actually program all of these fiddly little human-centric schedules into their Roombas, which is too bad, because that would be the easiest way to solve a very challenging problem: understanding what a human would like a robot to do at any given time. Thanks to mapping and app connectivity, Roombas have a much better idea of what's going on in the home than they used to, but humans are complicated, and our homes and lives are complicated, too. iRobot is expanding the ways in which it uses smart-home data to influence the operation of its robots. Geofencing to know whether you're home is one example, but it's easy to imagine others. For instance, if your Roomba is vacuuming and you get a phone call, it would be nice if the robot were clever enough to pause what it was doing until your call was done, right?
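
To illustrate what such a "house rules" layer might look like in practice, here's a minimal sketch of context-driven pause/resume logic. The rule names, context keys, and robot methods below are all invented for illustration; iRobot hasn't published how its rules are actually represented or evaluated.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class HouseRule:
    name: str
    applies: Callable[[dict], bool]  # predicate over a smart-home context snapshot

# Hypothetical rules matching the examples from the article.
RULES = [
    HouseRule("pause during phone calls",
              lambda ctx: ctx.get("phone_call_active", False)),
    HouseRule("pause while the kids play on the floor",
              lambda ctx: ctx.get("kids_playing_on_floor", False)),
]

def apply_house_rules(context: dict, robot) -> None:
    """Pause a running job while any rule holds; resume once none do.

    `robot` is a hypothetical handle with `state`, `pause_cleaning()`,
    and `resume_cleaning()` methods.
    """
    if any(rule.applies(context) for rule in RULES):
        if robot.state == "cleaning":
            robot.pause_cleaning()
    elif robot.state == "paused":
        robot.resume_cleaning()
```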

"It's absolutely all about these possibilities," Angle says. "It's about understanding more and more elements. How does your robot know if you're on the phone? What about if someone else is on the phone? Or if the kids are playing on the floor, maybe you don't want your robot to vacuum, but if they're playing but not on the floor, it's okay. Understanding the context of all of that and how it goes together is really where I think the differentiating features will be. But we're starting with what's most important and what will make the biggest change for users, and then we can customize from there."

"Having this idea of house rules, and starting to capture high level preferences as to what your smart home is, how it's supposed to behave, and enabling that with a continuously updating and transferable set of knowledge-we think this is a big, big deal." -Colin Angle

Unfortunately, the possibilities for customization rapidly get tricky from a privacy perspective. We'll get to the potential privacy issues with the j7's front-facing camera in a little bit, but as we think about ways in which robots could better understand us, it's all about data. The more data you give a home robot, the better it'll fit into your life, but that might involve some privacy compromises, like sharing your location data or giving a company access to information about your home, including, with the j7, floor-level imagery of wherever you want vacuumed.


The j7 is not iRobot's first Roomba with a camera. It's also not iRobot's first Roomba with a front-facing sensor. It is, however, iRobot's first Roomba with a front-facing visible-light camera, and that means a lot of things, most of them good.

The flagship feature of the j7 is that it can use its front-facing camera to recognize and react to specific objects in the home. This includes basic stuff like making maps and understanding what kind of room it's in based on the furniture it sees. It also handles more complicated things. For one, the j7 can identify and avoid headphones and power cords. It can also recognize shoes and socks (things most commonly found on floors), plus its own dock. And it can spot pet waste, because there's nothing more unpleasant than a Roomba shoving poo all over a floor that you were hoping to have cleaned.
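
As a rough illustration of what this kind of on-robot obstacle avoidance could look like, here's a minimal per-frame detection loop. The detector, navigator, label names, and confidence threshold are all assumptions; iRobot hasn't disclosed its actual pipeline.

```python
# The four object classes the article says iRobot is shipping detectors for.
AVOID_LABELS = {"shoe", "sock", "power_cord", "pet_waste"}
CONFIDENCE_THRESHOLD = 0.8  # assumed; the real threshold is unknown

def process_frame(frame, detector, navigator):
    """Run object detection on one camera frame and steer around known obstacles.

    `detector` and `navigator` are hypothetical stand-ins for the on-device
    inference engine and the local path planner.
    """
    for det in detector.run(frame):  # hypothetical on-device inference call
        if det.label in AVOID_LABELS and det.score >= CONFIDENCE_THRESHOLD:
            # Mark the obstacle in the local map so the planner routes around it
            # instead of driving through it.
            navigator.add_keep_out(det.bounding_box)
```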

Getting these object-detection algorithms to work involved a huge amount of training: iRobot has internally collected and labeled more than a million images from more than a thousand homes around the world. Including, of course, images of poo.

"This is one of those stupid, glorious things. I don't know how many hundreds of models of poo we created out of Play-Doh and paint, and everyone [at iRobot] with a dog was instructed to take pictures whenever their dog pooed. And we actually made synthetic models of poo to try to further grow our database." -Colin Angle

Angle says that iRobot plans to keep adding things the j7 can recognize; the company has more than 170 objects in the works right now, but just these four (shoes, socks, cords, and poo) are at the point where iRobot is confident enough to deploy the detectors on consumer robots. Cord detection in particular is impressive, especially when you consider how difficult it is to spot a pair of white Apple headphones on a white carpet, or a black power cord running across a carpet patterned with black squiggly lines. This, incidentally, is why the j7 has a front LED: improving cord detection.


So far, all of this is done entirely on-robot: the j7 performs object detection internally, as opposed to sending images to the cloud to be identified. But for more advanced behaviors, images do have to leave the robot, which is going to be a (hopefully small) privacy compromise. One advanced behavior is for the robot to send you a picture of an obstacle on the ground and ask whether you'd like to create a keep-out zone around it. If it's something temporary, like a piece of clothing you're going to pick up, you'd tell the robot to avoid it this time but vacuum there next time. If it's a power strip, you'd tell the robot to avoid it permanently. iRobot doesn't get to see the pictures the robot sends you as part of this process, but each image does have to travel from the robot through a server to your phone, and while it's end-to-end encrypted, that adds a bit of potential risk that Roombas didn't have before.

One way that iRobot is trying to mitigate this privacy risk is to run a separate on-robot human detector. The job of the human detector is to identify images with humans in them, and make sure they get immediately deleted without going anywhere. I asked whether this is simply a face detector, or whether it could also detect (say) someone's butt after they'd just stepped out of the shower, and I was assured that it could recognize and delete human forms as well.
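
Here's a minimal sketch of what that kind of on-robot privacy gate could look like. The human_detector and upload_queue objects are hypothetical stand-ins; the only thing taken from the article is the policy itself: images containing people are deleted on the robot and never transmitted.

```python
def maybe_queue_obstacle_image(image, human_detector, upload_queue) -> bool:
    """Gate every outbound obstacle image behind an on-robot human detector.

    `human_detector.contains_human()` and `upload_queue.put()` are invented
    names; the transport layer is assumed to apply the end-to-end encryption
    the article describes.
    """
    if human_detector.contains_human(image):  # faces and full bodies alike
        # Drop the image immediately; it is never persisted or transmitted.
        return False
    upload_queue.put(image)  # queued for the encrypted trip to the user's phone
    return True
```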

If you're less concerned about privacy and want to help iRobot make your Roomba (and every other Roomba) smarter, these obstacle queries that the robot sends you will also include the option to anonymously share the image with iRobot. This is explicitly opt-in, but iRobot is hoping that people will be willing to participate.

The camera and software are obviously what's most interesting here, but I suppose we can spare a few sentences for the j7's design. A beveled edge around the robot makes it a little better at not getting stuck under things, and the auto-emptying clean base has been rearranged (rotated 90 degrees, in fact) to make it easier to fit under things.

Interestingly, the j7 is not a new flagship Roomba for iRobot; that honor still belongs to the s9, which has a bigger motor and a structured-light 3D sensor at the front rather than a visible-light camera. Apparently, when the s9 was designed, iRobot didn't feel that cameras were quite good enough for what it wanted to do, especially with the s9's D-shape making precision navigation more difficult. But at this point, Angle says, the j7 is smarter and will do better than the s9 in more complex home environments. I asked him to elaborate a bit:

I believe that the primary sensor for a robot should be a vision system. That doesn't mean that stereo vision isn't cool too, and there might be some things where some 3D range sensing can be helpful as a crutch. But I would tell you that in the autonomous car industry, turn the dial forward enough and you won't have scanning lasers. You'll just have vision. I think [lidar] is going to be necessary for a while, just because the stakes of screwing up with an autonomous driving car are just so high. But I'm saying that the end state of an autonomous driving car is going to be all about vision. And based on the world that Roombas live in, I think the end state of a Roomba is going to be a hundred percent vision sooner than autonomous cars. There's a question of can you extract depth from monocular vision well enough, or do we need to use stereo or something else while we're figuring that out, because ultimately, we want to pick stuff up. We want to manipulate the environment. And having rich 3D models of the world is going to be really important.

IEEE Spectrum: Can you tell me more about picking stuff up and manipulating the environment?

Colin Angle: Nope! I would just say, it's really exciting to watch us get closer to the day where manipulation will make sense in the home, because we're starting to know where stuff is, which is kind of the precursor for manipulation to start making any sense. In my prognostication or prediction mode, I would say that we're certainly within 10 years of seeing the first consumer robots with some kind of manipulation.

IEEE Spectrum: Do you feel like home cleaning robots have already transitioned from being primarily differentiated by better hardware to being primarily differentiated by better software?

Colin Angle: I'm gonna say that we're there, but I don't know whether the consumer realizes that we're there. And so we're in this moment where it's becoming true, and yet it's not generally understood to be true. Software is rapidly growing in its importance and ultimately will become the primary decision point in what kind of robots consumers want.

Finally, I asked Angle about what the capability for collecting camera data in users' homes means long-term. The context here has parallels with autonomous cars: One of the things that enabled the success of autonomous cars was the collection and analysis of massive amounts of data, but we simply don't have ways of collecting in-home data at that scale. Arguably, the j7 is the first camera-equipped mobile robot that's likely to see distribution into homes on any kind of appreciable scale, which could potentially provide an enormous amount of value to a company like iRobot. But can we trust iRobot to handle that data responsibly? Here is what Angle has to say:

The word 'responsibly' is a super important word. A big difference between outside and inside is that the inside of a home is a very private place; it's your sanctuary. A good way for us to really screw this up is to overreach, so we're erring on the side of full disclosure and caution. We've pledged that we'll never sell your data, and we try to retain only the data that are useful and valuable to doing the job that we're doing.

We believe that as we unlock new things that we could do, if we only had the data, we can then generate that data with user permission fairly quickly, because we have one of the largest installed fleets of machine-learning-capable devices in the world. We're getting close to double-digit millions of Roombas sold per year, and that's pretty cool. So I think that there's a way to do this where we are trust-first, and if we can get permission to use data by offering a benefit, we could pretty rapidly grow our data set.


The iRobot j7 will be available in Europe and the United States within the next week or so for $650. The j7+, which includes the newly redesigned automatic dirt dock, will run you $850. And the Genius 3.0 software update should be available to all Roombas via the app starting today.
