
The US military’s privacy problem in three charts

by Tate Ryan-Mosley
from MIT Technology Review

This article is from The Technocrat, MIT Technology Review's weekly tech policy newsletter about power, politics, and Silicon Valley. To receive it in your inbox every Friday, sign up here.

Highly personal and sensitive data about military members, such as home addresses, health and financial information, and the names of family members and friends, is easily accessible to anyone who wants to buy it. US-based data brokers sell it for as little as $0.12 per record.

That's the finding of a new report from Duke University researchers that shows how data brokers are selling this sort of information with minimal vetting to customers both domestically and overseas, creating major privacy and national security risks. I wrote about the report for a story this week.

The research was concerning to members of Congress, including Senators Elizabeth Warren and Ron Wyden, who commented in the piece. Then on November 7, Senator John Cornyn cited my story in a hearing on the harms of social media.

If you want to learn more about the study and what it means for US national security, take a read through my piece.

But I want to give you a more in-depth sense of just what kinds of data are for sale by these brokers, as well as their economic model, using charts. (All charts are based on data provided by the Duke researchers and available in the report.)

First, take a look at just how personal the information is that the researchers were able to purchase, from people's net worth to whether they have diabetes. The Duke team purchased a total of eight different data sets from three brokers (which they don't identify by name in the report). The chart below is sorted by data set, and you can see that some information, like emails and home addresses, is widely accessible through multiple providers. I, for one, was surprised to see how frequently information about service members' children and homeowner status came up.

(If you click on the chart, you'll be taken to an interactive version and see more information about which data was given to buyers with domestic and foreign email addresses.)

Another "concerning" finding, to use the lead researcher's word, was the brokers' willingness to sell this information to clients outside the US. The researchers, who were particularly interested in the national security risk created by the industry, set up one email address on a US-based domain and another on an Asia-based domain, sending inquiries from the latter via an IP address in Singapore. As I detail in the story, these brokers conducted minimal vetting regardless of where the inquiry came from, and almost all of them ultimately provided many of the same types of information no matter the geographic source of the request.

Here, I've broken out what types of data were available based on the origin of the request. It's worth noting that these results are just a reflection of the data the researchers purchased and do not provide a comprehensive view of what the industry sells and to whom. That is to say, just because the researchers did not get health data when inquiring from an Asia-based domain doesn't mean it's not possible to purchase this data through other providers. Additionally, the specific fields do vary across the different categories.

The part of the report that was perhaps most astonishing to me concerns the economic model of the brokers, which incentivizes clients to buy in bulk and greatly erodes people's privacy. The more data someone purchases, the cheaper each record gets. One broker sent the researchers pricing options that demonstrate how this model works, which you can see in the graph below.

The data the Duke researchers purchased ranged from $0.12 to $0.32 per record: mere pennies to violate the privacy of US military members.

As one of the report authors, Hayley Barton, told me, "In practice, it seems as though anyone with an email address, a bank account, and a few hundred dollars could acquire the same type of data that we did."

What else I'm reading
  • This story about the changing trust and safety industry from Wired's Vittoria Elliott was a great read! There have been a lot of changes to how Big Tech is approaching its own accountability mechanisms, and outsourcing seems to be the path of the future.
  • Here's a long, buckle-in read about Peter Thiel's politics and his attempts to live forever, from Barton Gellman at the Atlantic. It's a fascinating, and a tad excruciating, peek into the ego of one of tech's overlords.
  • New research from the startup Vectara found that chatbots like ChatGPT make things up at least 3% of the time. As my colleague Melissa has reported, it's important to remember that "predictive systems ... are generating the most likely words, given your question and everything they've been trained on," which clearly doesn't make them accurate.
What I learned this week

What do some of the most brilliant people think the hardest problems are today, and what do they think we should do about them? MIT Technology Review queried scientists, journalists, politicians, entrepreneurs, activists, and CEOs like Bill Gates, Lina Khan, and Fei-Fei Li, and published a sample of the best responses as part of our magazine issue on hard problems.

One of my favorite responses was from NYU professor and author Meredith Broussard, who said, "We should focus on problems of racism, sexism, and ableism, but we should not expect tech or technologists to deliver complete solutions. Computers are machines that do math, no more, no less."
