Why People Don’t Demand Data Privacy, Even As Governments & Corporations Collect More Personal Info
When the Trump administration gave Immigration and Customs Enforcement access to a massive database of information about Medicaid recipients in June 2025, privacy and medical justice advocates sounded the alarm. They warned that the move could trigger all kinds of public health and human rights harms.
But most people likely shrugged and moved on with their day. Why is that? It's not that people don't care. According to a 2023 Pew Research Center survey, 81% of American adults said they were concerned about how companies use their data, and 71% said they were concerned about how the government uses their data.
At the same time, though, 61% expressed skepticism that anything they do makes much difference. This is because people have come to expect that their data will be captured, shared and misused by state and corporate entities alike. For example, many people are now accustomed to instinctively hitting "accept" on terms of service agreements, privacy policies and cookie banners regardless of what the policies actually say.
At the same time, data breaches have become a regular occurrence, and private digital conversations exposing everything from infidelity to military attacks have become the stuff of public scrutiny. The cumulative effect is that people are loath to change their behaviors to better protect their data - not because they don't care, but because they've been conditioned to think that they can't make a difference.
As scholars of data, technology and culture, we find that when people are made to feel as if data collection and abuse are inevitable, they are more likely to accept it - even if it jeopardizes their safety or basic rights.
Where regulation falls short

Policy reforms could help to change this perception, but they haven't yet. In contrast to a growing number of countries that have comprehensive data protection or privacy laws, the United States offers only a patchwork of policies covering the issue.
At the federal level, the most comprehensive data privacy laws are about 50 years old. The Privacy Act of 1974, passed in the wake of the federal wiretapping revealed by the Watergate and Counterintelligence Program scandals, limited how federal agencies collected and shared data. At the time, government surveillance was unexpected and unpopular.
But it also left open a number of exceptions - including for law enforcement - and did not affect private companies. These gaps mean that data collected by private companies can end up in the hands of the government, and there is no good regulation protecting people from this loophole.
The Electronic Communications Privacy Act of 1986 extended protections against telephone wiretapping to include electronic communications, which included services such as email. But the law did not account for the possibility that most digital data would one day be stored on cloud servers.
Since 2018, 19 U.S. states have passed data privacy laws that limit companies' data collection activities and enshrine new privacy rights for individuals. However, many of these laws also include exceptions for law enforcement access.
These laws predominantly take a consent-based approach - think of the pesky banner beckoning you to "accept all cookies" - that encourages you to give up your personal information even when it's not necessary. These laws put the onus on individuals to protect their privacy, rather than simply barring companies from collecting certain kinds of information from their customers.
The privacy paradox

For years, studies have shown that people claim to care about privacy but do not take steps to actively protect it. Researchers call this the privacy paradox. It shows up when people use products that track them in invasive ways, or when they consent to data collection, even when they could opt out. The privacy paradox often elicits appeals to transparency: If only people knew that they had a choice, or how the data would be used, or how the technology works, they would opt out.
But this logic downplays the fact that options for limiting data collection are often intentionally designed to be convoluted, confusing and inconvenient, and they can leave users feeling discouraged about making these choices, as communication scholars Nora Draper and Joseph Turow have shown. This suggests that the discrepancy between users' opinions on data privacy and their actions is hardly a contradiction at all. When people are conditioned to feel helpless, nudging them into different decisions isn't likely to be as effective as tackling what makes them feel helpless in the first place.
Resisting data disaffection

The experience of feeling helpless in the face of data collection is a condition we call data disaffection. Disaffection is not the same as apathy. It is not a lack of feeling but rather an unfeeling - an intentional numbness. People manifest this numbness to sustain themselves in the face of seemingly inevitable datafication, the process of turning human behavior into data by monitoring and measuring it.
It is similar to how people choose to avoid the news, disengage from politics or ignore the effects of climate change. They turn away because data collection makes them feel overwhelmed and anxious - not because they don't care.
Taking data disaffection into consideration, digital privacy is a cultural issue - not an individual responsibility - and one that cannot be addressed with personal choice and consent. To be clear, comprehensive data privacy law and changing behavior are both important. But storytelling can also play a powerful role in shaping how people think and feel about the world around them.
We believe that a change in popular narratives about privacy could go a long way toward changing people's behavior around their data. Talk of "the end of privacy" helps create the world the phrase describes. Philosopher of language J.L. Austin called those sorts of expressions performative utterances. This kind of language confirms that data collection, surveillance and abuse are inevitable, so that people feel like they have no choice.
Cultural institutions have a role to play here, too. Narratives reinforcing the idea of data collection as being inevitable come not only from tech companies' PR machines but also mass media and entertainment, including journalists. The regular cadence of stories about the federal government accessing personal data, with no mention of recourse or justice, contributes to the sense of helplessness.
Alternatively, it's possible to tell stories that highlight the alarming growth of digital surveillance and frame data governance practices as controversial and political rather than innocuous and technocratic. The way stories are told affects people's capacity to act on the information that the stories convey. It shapes people's expectations and demands of the world around them.
The ICE-Medicaid data-sharing agreement is hardly the last threat to data privacy. But the way people talk and feel about it can make it easier - or more difficult - to ignore data abuses the next time around.
Rohan Grover is Assistant Professor of AI and Media at American University and Josh Widera is a Ph.D. Candidate in Communication at USC Annenberg School for Communication and Journalism. This article is republished from The Conversation under a Creative Commons license. Read the original article.