Robot Vacuums That Collect Photos, Videos And Audio Of Users To Train AI Models Start Yelling Obscenities And Chasing Dogs
As far back as 2017, Techdirt was warning that robot vacuum cleaners represented a threat to privacy. In that instance, it concerned the possibility that iRobot, maker of the Roomba robot vacuum, might sell the data its device collected about the size and layout of a home. Five years later, it was becoming clear that a new danger was emerging as robot vacuums started to incorporate cameras. These are the kinds of images such a system could gather, as reported by MIT Technology Review in 2022:
The photos vary in type and in sensitivity. The most intimate image we saw was the series of video stills featuring the young woman on the toilet, her face blocked in the lead image but unobscured in the grainy scroll of shots below. In another image, a boy who appears to be eight or nine years old, and whose face is clearly visible, is sprawled on his stomach across a hallway floor. A triangular flop of hair spills across his forehead as he stares, with apparent amusement, at the object recording him from just below eye level.
At that point, the images were taken by a development version, not a consumer product. They were then sent to Scale AI, a startup that used contract workers to label data for companies like iRobot. But the MIT Technology Review article presciently noted:
Ultimately, though, this set of images represents something bigger than any one individual company's actions. They speak to the widespread, and growing, practice of sharing potentially sensitive data to train algorithms
Today, this kind of data is needed not to tweak the odd algorithm or two, as was the case a few years back, but to train powerful, large-scale AI systems. The current AI frenzy has led to a desperate race to gather and use as much training data as possible. A story on the Australian ABC News site shows that this includes the latest generation of robot vacuums:
Ecovacs robot vacuums, which have been found to suffer from critical cybersecurity flaws, are collecting photos, videos and voice recordings - taken inside customers' houses - to train the company's AI models.
The Chinese home robotics company, which sells a range of popular Deebot models in Australia, said its users are "willingly participating" in a product improvement program.
When users opt into this program through the Ecovacs smartphone app, they are not told what data will be collected, only that it will "help us strengthen the improvement of product functions and attached quality".
According to the ABC News article, users are instructed to click "above" in order to find out the details of what data is collected and how it is used, but, unhelpfully, there is no link available on the page. Ecovacs's general privacy policy allows for the blanket collection of user data for "research purposes", including:
- The 2D or 3D map of the user's house generated by the device
- Voice recordings from the device's microphone
- Photos or videos recorded by the device's camera
Ecovacs confirmed to ABC News that data from users who had opted into its product improvement program was being used for training its AI model:
"During this data collection, we anonymise user information at the machine level, ensuring that only the anonymised data is uploaded to our servers," the spokesperson said in a statement.
"We have implemented strict access management protocols for viewing and utilising this anonymised user data."
Well, that's what they all say. But supposedly anonymized data can often be de-anonymized, and "strict access management protocols" can be abused or circumvented. In fact, another article on the ABC News site shows how it was possible to gain remote access to an Ecovacs robot vacuum and watch live images from its camera. It was also possible to access all the unit's logs, Wi-Fi credentials and sensors, the report claimed. More recently, it seems that hackers have been taking advantage of these security weaknesses. According to ABC News again:
Robot vacuums in multiple US cities were hacked in the space of a few days, with the attacker physically controlling them and yelling obscenities through their onboard speakers.
As well as "spewing racial slurs", another Ecovacs Deebot under remote control chased its owner's dog around a Los Angeles home. In a statement to ABC News, Ecovacs said:
ECOVACS has always prioritised product and data security, as well as the protection of consumer privacy. We assure customers that our existing products offer a high level of security in daily life, and that consumers can confidently use ECOVACS products.
ABC News noted that researchers at the Australian Centre for Robotics have developed a "privacy-preserving" camera that scrambles input before it is used, while retaining enough information for the robot vacuum to operate. But the problem here isn't a lack of technical solutions; those often exist. It's the reluctance of companies to take the extra steps needed to protect the personal data mixed in with the rest of their training sets. Given today's pressure to bring out new products with built-in AI as quickly and as cheaply as possible, it seems unlikely that many companies will be willing to do that. As a result, we can probably expect more horror stories of personal or even highly intimate data being leaked by your "AI-enhanced" robot vacuum and other so-called "smart" devices.
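To give a rough sense of what "scrambling" camera input can mean in practice, here is a minimal, purely illustrative Python sketch. It is not the Australian Centre for Robotics' actual design; the function name, block size and per-device seed are assumptions for the example. The idea it illustrates: degrade each frame to coarse grayscale and shuffle fixed-size tiles with a device-specific key, so a human can no longer make out the scene, while coarse structure remains available to simple navigation logic.

```python
import numpy as np

def scramble_frame(frame: np.ndarray, block: int = 16, seed: int = 0) -> np.ndarray:
    """Illustrative sketch only: irreversibly degrade a camera frame on-device."""
    # frame: H x W x 3 uint8 RGB image from the robot's camera.
    # Reduce to coarse grayscale and quantise aggressively to strip fine detail.
    gray = (frame.mean(axis=2).astype(np.uint8) // 64) * 64
    h, w = gray.shape
    h, w = h - h % block, w - w % block          # crop to a whole number of tiles
    gray = gray[:h, :w]
    # Cut the image into block x block tiles.
    tiles = (gray.reshape(h // block, block, w // block, block)
                 .swapaxes(1, 2)
                 .reshape(-1, block, block))
    # Shuffle the tiles with a fixed per-device "key" (the seed), so the
    # spatial layout of the room never leaves the robot intact.
    rng = np.random.default_rng(seed)
    tiles = tiles[rng.permutation(len(tiles))]
    # Reassemble the shuffled tiles into an image of the original size.
    return (tiles.reshape(h // block, w // block, block, block)
                 .swapaxes(1, 2)
                 .reshape(h, w))

# Example: scramble a random test frame (a stand-in for a real camera capture).
frame = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
scrambled = scramble_frame(frame, seed=42)   # the seed would be a per-device secret
```

The design point the sketch tries to capture is that the destructive step runs on the device itself, before anything is stored or uploaded; if degraded frames are all that ever reach a company's servers or training pipeline, a later breach or access-control failure has far less to expose.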