Online Facial Recognition Service Caught Cruising Through Graveyards To Fill Its Database

Not literally, of course. Let's get that out of the way. The company was not sending people out to dig up bodies to take photos to add to its facial recognition database. I mean, how would that even work? Not only that, but very few desiccated corpses utilize subscription-based facial recognition services. It's not a workable business model.
Neither is this, though. This really shouldn't be happening. Here's a far more realistic depiction of what happened, as reported by Lydia Morrish for Wired:
Ancestry.com isn't the only site that Scarlett checks regularly. In February 2022, the facial recognition search engine PimEyes surfaced non-consensual explicit photos of her at age 19, reigniting decades-old trauma. She attempted to get the pictures removed from the platform, which uses images scraped from the internet to create biometric "faceprints" of individuals. Since then, she's been monitoring the site to make sure the images don't return.
In January, she noticed that PimEyes was returning pictures of children that looked like they came from Ancestry.com URLs. As an experiment, she searched for a grayscale version of one of her own baby photos. It came up with a picture of her own mother, as an infant, in the arms of her grandparents, taken, she thought, from an old family photo that her mother had posted on Ancestry. Searching deeper, Scarlett found other images of her relatives, also apparently sourced from the site. They included a black-and-white photo of her great-great-great-grandmother from the 1800s, and a picture of Scarlett's own sister, who died at age 30 in 2018. The images seemed to come from her digital memorial on Ancestry and from Find a Grave, a cemetery directory owned by Ancestry.
Those would be the unfortunate findings of software engineer Cher Scarlett, who, as the Wired article points out, has had previous traumatic run-ins with the Poland-based PimEyes. (PimEyes has offered a pretty defensive response to that reporting on its blog.)
PimEyes is, at best, controversial. It provides a for-pay service that allows users to perform reverse image searches on photos of themselves to discover where else these photos may have been posted on the internet. PimEyes is basically Clearview, except for regular people (although cops use it too).
Unlike Clearview, PimEyes positions itself as a service that only allows users to search for images of themselves, rather than one that lets anyone upload a photo of a stranger and set PimEyes digging through its database of info scraped from publicly accessible websites. Obviously, internet sleuths seeking to identify people who participated in the January 6th insurrection weren't uploading photos of themselves. The potential for abuse (which has been previously realized) is always present. And PimEyes' landing page doesn't exactly deter people from assuming it's a reverse image search engine they can use on any photo they please.
This particular scraping is the sort of thing you don't expect, even from controversial services that make web scraping part of the business model. However, PimEyes says the scraping of Ancestry, which violated Ancestry's terms of service, was a mistake.
Giorgi Gobronidze, PimEyes' director, tells WIRED: "PimEyes only crawls websites who officially allow us to do so. It was ... very unpleasant news that our crawlers have somehow broken the rule." PimEyes is now blocking Ancestry's domain and indexes related to it are being erased, he says.
It never should have happened in the first place. This may have been a mistake, but mistakes like this matter when your company has already generated a lifetime of privacy and security concerns. Sure, people can opt out. But only the ones who are still living.