
Error Message Exposes Vending Machine’s Use Of Facial Recognition Tech

by Tim Cushing from Techdirt on (#6JYG6)

Like most tech, facial recognition AI continues to become cheaper and easier to implement. Is it getting better? Well, that hardly seems to be a primary concern for those deploying it.

Adoption of this tech tends to focus on the law enforcement side of things. That's also where it seems to perform worst. The tech is much more unreliable when asked to identify minorities. That's problematic when it's deployed by the government, which has the power to deprive people of rights and personal freedom when given the go-ahead by tech that performs worst when identifying the very people our government already tends to oppress/over-police most frequently.

The private sector's use of this tech is often no better. While it has some utility for internal use - i.e., verifying the identities of employees seeking to access certain areas or information - the most common deployments are tied to law enforcement: the (hopeful) identification of suspected criminals. So, even most private sector use invokes the excesses of government power, while still relying on faulty tech that generates the most false positives when dealing with people of color.

Is there such a thing as an innocuous deployment of this tech? Sure. There's a chance that might happen. But it would involve telling people this information is being collected while making it clear what this information is being gathered for.

Facial recognition tech in a vending machine is unlikely to aid and abet a string of rights violations. But it's far from innocent. In fact, it tends to disturb people who might otherwise be supportive of government use of this tech.

College students in search of snacks are never going to assume their purchases are triggering facial recognition tech. Wes Davis' brief summary of a much deeper story for The Verge makes it immediately clear how regular people feel about unexpected facial recognition deployments.

"Why do the stupid M&M machines have facial recognition?"

A student at the University of Waterloo in Canada asked that in a post showing a vending machine error message that revealed a facial recognition app had failed.

Student publication mathNEWS found that the machine's maker, Invenda, advertises that it gathers "estimated ages and genders of every client." But don't worry, Invenda told Ars Technica the machines are "fully GDPR compliant."

The journalists at the University of Waterloo's "mathNEWS" paper dug a lot deeper into this story. The end result may be the welcome removal of surprisingly intrusive snack machines, but the details show that vending machine manufacturers are willing to deploy this tech without performing much due diligence, yet are far more reluctant to own up to it.

The first mystery that needed to be solved was identifying which company was specifically responsible for adding facial recognition tech to machines that have generated healthy profits for years without attempting to surreptitiously gather demographic data on their customers.

The error message that inadvertently informed students of the presence of this tech included the name of the vendor:

Invenda.Vending.FacialRecognitionApp.exe
Application Error

Invenda is not the first link in this chain. The machines were placed on the campus by third party vendor Plant Ops. That company claimed to have zero involvement beyond the delivery and placement of the vending machines that were owned and operated by an entirely different company.

That company was Adaria Vending Services. But this third party also does not manufacture or control the machines' operation or internal tech. The tech exposed by this error links back to the company named in the error message: Invenda. Not that Adaria's hands are completely clean, as the student newspaper points out:

Adaria does not make the machines; [journalist] firstie determined the machines' original manufacturer to be Invenda Group, an organization boasting intelligent vending machines with data collection capabilities. Some data collected is benign, including sales and UI performance metrics. But Adaria can also use these machines to collect further data, sending it to relevant parties including Mars, the manufacturer of M&M's. In particular, Invenda's sales brochures state the machines are capable of sending estimated ages and genders of every client.

So, two beneficiaries of the additional data, even though sales and UI performance metrics never necessitate the deployment of facial recognition tech. It's only the demographic data - the stuff Invenda and its clients want - that can't be gathered by anything other than cameras and tech that phone home with conjecture about age and gender as determined by yet another company's facial recognition tech.

According to the statement provided by Adaria, the machines (and the hidden tech) do not "take or store" photos of customers. Supposedly the tech acts like a motion sensor, doing nothing more than informing the machine that someone intends to make a purchase.

But a motion sensor is way different than a camera with facial recognition tech attached. While it might be useful to add something that can distinguish someone standing in front of the machine from someone just near it or passing by, there have been enough advancements in motion detection to accomplish this without the addition of facial recognition tech.
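To be clear about how low the bar is here: detecting that a person is standing in front of a machine takes nothing more than comparing one camera frame to the next. Here's a minimal sketch of that kind of presence detection, written in Python with OpenCV against a generic USB camera. It's purely illustrative and has nothing to do with Invenda's actual software:

import cv2

# Presence detection via plain frame differencing: no faces, no
# demographics, just "did the scene in front of the machine change?"
cap = cv2.VideoCapture(0)  # generic USB camera
ok, frame = cap.read()
prev = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
    delta = cv2.absdiff(prev, gray)  # pixels that changed since the last frame
    motion = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
    if motion.sum() > 500_000:  # placeholder threshold; would need tuning
        print("someone is in front of the machine -- wake it up")
    prev = gray

Nothing in that loop knows or cares who the customer is, which is rather the point.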

So, this excuse isn't all that credible, even if it may truthfully portray Adaria's relationship to its machines and its apparent data obligations to the manufacturer of the goods located in its vending machines.

Invenda's statement makes it clear Adaria either doesn't completely understand what's going on, or has been forbidden to discuss further details as part of its agreement with Invenda.

As the producer of the Invenda IoT solution, the Invenda smart vending machine, and its associated software, we formally warrant that the demographic detection software integrated into the smart vending machine operates entirely locally. It does not engage in storage, communication, or transmission of any imagery or personally identifiable information. The software conducts local processing of digital image maps derived from the USB optical sensor in real-time, without storing such data on permanent memory mediums or transmitting it over the Internet to the Cloud.

They go on to say:

It is imperative to note that the Invenda Software does not possess the capability to recognize any individual's identity or any other form of personal information.

If we take this at face value, the facial recognition tech generates a demographic guess, stores it locally, and discards the images used to make this determination. All well and good. But storing it locally doesn't make much difference overall, since it appears Invenda still harvests this data, even if doing so requires sending technicians out to the machines to collect it. It sounds like a GDPR workaround that allows Invenda to claim it's not collecting this data remotely or storing it anywhere other than the location where it's being collected.
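If the company's description is accurate, the data flow would look something like the sketch below: each detection produces a coarse demographic bucket, the bucket increments a local tally, the image itself is thrown away, and the aggregate counts sit on the machine until someone comes to collect them. This is a hypothetical reconstruction from the statements quoted above, not Invenda's code; estimate_demographics stands in for whatever on-device classifier the machines actually run.

import json
from collections import Counter

tally = Counter()  # the only thing persisted: aggregate counts, no images

def estimate_demographics(frame):
    # Hypothetical stand-in for the on-device classifier described above.
    # Would return a coarse bucket such as "male_25_34", never an identity.
    return "unknown"

def on_customer_detected(frame):
    tally[estimate_demographics(frame)] += 1
    del frame  # drop the only reference to the image; nothing stored or sent

def export_for_pickup(path="tally.json"):
    # Collected on-site (e.g., by a visiting technician) rather than
    # transmitted over the internet -- hence "fully GDPR compliant"
    with open(path, "w") as f:
        json.dump(dict(tally), f)

Whether aggregating on the machine and walking the numbers out the door is meaningfully different from uploading them is exactly the question the GDPR framing sidesteps.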

That still doesn't explain why Invenda now believes it's essential that its vending machines attempt to determine the demographics of customers. It also doesn't explain why everyone involved in this - from Invenda to Adaria to the contractor hired to place machines on campuses - has failed to clearly inform vending machine customers that this tech has been added to devices most people logically assume do nothing more than exchange goods for money.

As the student paper sums up succinctly:

No one needs M&M's cameras.

These companies got along without this tech for the entirety of their existence. M&M/Mars has managed to turn a steady profit for more than a century without needing to harvest (supposedly anonymized) demographic data via surreptitious deployments of tech many people rightfully do not trust.

The fact that these companies are doing it now might only suggest an ever-increasing thirst for data - something that's understandable as profit margins narrow and more competitors enter the market. That they couldn't be bothered to be upfront about it suggests the entities involved are well aware these deployments would not have been welcomed by their customers. So, they chose to sneak the tech in, hoping no one would find out until this particular Overton window had shifted.

But, they got caught - sold out by their own defective software and its far too transparent error message. And, now, they're losing customers. As the paper reports, the machines infected with this AI are being removed from campus. I guess everyone on the other side of this food chain had better hope the (supposedly) locally collected data was worth it. And now everyone, everywhere will be deploying more side-eye than money to vending machines.
