
Machine learning could check if you’re social distancing properly at work

by Karen Hao, from MIT Technology Review

Andrew Ng's startup Landing AI has created a new workplace monitoring tool that issues an alert when anyone is less than the desired distance from a colleague.

Six feet apart: On Thursday, the startup released a blog post and demo video showing off a new social distancing detector. On the left is a feed of people walking around on the street. On the right, a bird's-eye diagram represents each one as a dot and turns them bright red when they move too close to someone else. The company says the tool is meant to be used in work settings like factory floors and was developed in response to requests from its customers (which include Foxconn). It also says the tool can easily be integrated into existing security camera systems, but that it is still exploring how to notify people when they break social distancing. One possible method is an alarm that sounds when workers pass too close to one another. A report could also be generated overnight to help managers rearrange the workspace, the company says.

Under the hood: The detector must first be calibrated to map the security footage onto real-world dimensions. A trained neural network then picks out the people in the video, and another algorithm computes the distances between them.
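
Landing AI hasn't published its implementation, but the pipeline it describes (calibrate once, detect people, measure pairwise distances) can be sketched in a few lines. The snippet below is a minimal illustration using OpenCV's homography utilities as one common way to do the calibration step; the calibration points, the six-foot threshold, and the hard-coded "detections" standing in for a neural-network person detector are all assumptions made for the sake of the example.

```python
# Minimal sketch of a social-distancing check, assuming a fixed camera and a
# one-time calibration. Landing AI has not released its code; the point
# coordinates, threshold, and detector stub below are illustrative only.
import itertools

import cv2
import numpy as np

# --- Calibration (done once per camera) ----------------------------------
# Four pixel locations of known floor points, and the same points measured
# on the factory floor in feet. These numbers are made up for illustration.
PIXEL_PTS = np.float32([[180, 160], [1100, 150], [1240, 700], [60, 690]])
FLOOR_PTS = np.float32([[0, 0], [40, 0], [40, 25], [0, 25]])  # feet

# Homography that maps image coordinates onto the ground plane.
H = cv2.getPerspectiveTransform(PIXEL_PTS, FLOOR_PTS)

MIN_DISTANCE_FT = 6.0  # the "six feet apart" rule


def to_floor_coords(foot_points_px: np.ndarray) -> np.ndarray:
    """Project each person's foot point (pixels) into floor coordinates (feet)."""
    pts = foot_points_px.reshape(-1, 1, 2).astype(np.float32)
    return cv2.perspectiveTransform(pts, H).reshape(-1, 2)


def too_close_pairs(foot_points_px: np.ndarray) -> list[tuple[int, int, float]]:
    """Return (i, j, distance_ft) for every pair closer than the threshold."""
    floor = to_floor_coords(foot_points_px)
    violations = []
    for i, j in itertools.combinations(range(len(floor)), 2):
        d = float(np.linalg.norm(floor[i] - floor[j]))
        if d < MIN_DISTANCE_FT:
            violations.append((i, j, d))
    return violations


if __name__ == "__main__":
    # In a real system these points would come from a person detector
    # (e.g. the bottom-center of each bounding box); here they are hard-coded.
    detections = np.array([[300, 420], [340, 430], [900, 500]])
    for i, j, d in too_close_pairs(detections):
        print(f"ALERT: persons {i} and {j} are {d:.1f} ft apart")
```

In a deployed system, the same floor-plane distance check could drive the bird's-eye visualization, a real-time alarm, or the overnight report the company describes.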

Workplace surveillance: The concept is not new. Earlier this month, Reuters reported that Amazon is also using similar software to monitor the distances between its warehouse staff. The tool joins a growing suite of technologies that companies are increasingly using to surveil their workers. There are now myriad cheap, off-the-shelf AI systems that firms can buy to watch every employee in a store or listen to every customer service representative on a call. Like Landing AI's detector, these systems flag warnings in real time when behaviors deviate from a certain standard. The coronavirus pandemic has only accelerated this trend.

Dicey territory: In its blog post, Landing AI emphasizes that the tool is meant to keep "employees and communities safe," and should be used "with transparency and only with informed consent." But the same technology can also be abused or used to normalize more harmful surveillance measures. When examining the growing use of workplace surveillance in its annual report last December, the AI Now research institute also pointed out that in most cases, workers have little power to contest such technologies. "The use of these systems," it wrote, "pools power and control in the hands of employers and harms mainly low-wage workers (who are disproportionately people of color)." Put another way, it makes an existing power imbalance even worse.
