Can You Protect Privacy If There's No Real Enforcement Mechanism?
Privacy laws can have a lot of moving pieces, from notices and disclosures, opt-in and opt-out consent requirements, to privacy defaults and user controls. Over the past few years, there has been significant progress on these issues because privacy advocates, consumer groups, industry voices, and even lawmakers have been willing to dive into definitional weeds, put options on the table, and find middle ground. But this sort of thoughtful debate has not happened when it comes to how privacy laws should be enforced and what should happen when companies screw up, families are hurt, and individuals' privacy is invaded.
Instead, when it comes to discussing private rights of action and agency enforcement, rigid red lines have been drawn. Consumer groups and privacy advocates say let individuals sue in court -- and call it a day. Business interests, when they talk about "strong enforcement," often mean letting an underfunded Federal Trade Commission and equally taxed state Attorneys General handle everything. Unfortunately, this binary, absolutist dispute over policing privacy rights threatens to sink any progress on privacy legislation.
It happened in Washington state, which failed to enact a comprehensive privacy framework in March because of a single sentence that could have let some consumers sue to enforce their rights under the state's general Consumer Protection Act. Private rights of action have stymied state privacy task forces, and the issue is consuming efforts by the Uniform Law Commission to craft a model privacy bill. This is but a microcosm of what we've seen at the federal level, where lawmakers are "at loggerheads" over private rights of action.
This impasse is ridiculous. Advocacy groups share some blame here, but industry voices have failed to put any creativity into charting an alternative path forward. Company after company and trade association after trade association have come out in favor of privacy rules, but the response to any concern about how to ensure those rules are followed has been crickets. Few seem to have given much thought to what enforcement could look like beyond driving a Brinks truck full of money up to the FTC. That is not good enough. If industry is serious about working toward clear privacy rules, business interests have two obligations: (1) they should offer up some new ideas to boost enforcement and address legitimate concerns about regulatory limitations and capture; and (2) they need to explain why private rights of action should be a non-starter in areas where businesses already are misbehaving.
First, while we can acknowledge the good work that the FTC (and state Attorneys General) has done, we should also concede that agencies cannot address every privacy problem and have competing consumer protection priorities. Commentators laud the FTC's privacy work but have not explained how an FTC with more resources would do anything other than more of what it's already doing. These are the outstanding considerations animating efforts to create an entirely new federal privacy agency (and that's on top of a proposal in California to set up its own entirely new "Privacy Protection Agency"). Improving the FTC's privacy posture will require more than additional money and personnel.
Part of this will be creating mechanisms that ensure individuals can get redress. One idea would be to require the FTC to help facilitate complaint resolutions. The Consumer Financial Protection Bureau already does this to some extent with respect to financial products and services. The CFPB welcomes consumer complaints -- and then works with financial companies to get consumers a direct response about problems. These complaints also help the CFPB identify problems and prioritize work, and the CFPB then publishes (privacy-friendly) complaint data. This stands in contrast to the FTC's Consumer Sentinel Network, which is a black box to the public.
Indeed, the FTC's complaint system is opaque even to complainants themselves. The black box nature of the FTC is, fairly or not, a constant criticism by privacy advocates. A group of advocates began the Trump administration by calling for more transparency from the Commission about how it handles complaints and responds to public input. I can speak to this issue, having submitted my own complaint to the FTC about the privacy and security practices of VPNs in 2017. Months later, the FTC put out a brief blog post on the issue, which I took to be the end of the matter on their end. Some sort of dual-track informal and formal complaint process like the Federal Communications Commission's could be one way to ensure the FTC better communicates with outsiders raising privacy concerns.
These are mostly tweaks to FTC process, however, and while they address some specific complaints about privacy enforcement, they don't address concerns that regulators have been missing -- or avoiding -- some of the biggest privacy problems we face. This is where the rigid opposition to private rights of action, and the failure to acknowledge this larger concern, is so frustrating.
Sensitive data types present a good example. Unrestrained collection and use of biometrics and geolocation data have become two of the biggest privacy fights of the moment. There has been a shocking lack of transparency or corporate accountability around how companies collect and use this information. Their use could be the key to combating the ongoing pandemic; their misuse, a tool for discrimination, embarrassment, and surveillance. If ever there were data practices in need of more oversight, these would be it.
Yet the rapid creep of facial recognition gives us a real-world test case for how agency enforcement can be lacking. While companies have been calling for discussions about responsible deployment of facial recognition even as they pitch this technology to every school, hospital, and retailer in the world, Clearview AI just up and ignored existing FTC guidance and state law. Washington state has an existing biometric privacy law, which the state Attorney General admitted has never been the basis of an enforcement action. To my knowledge, the Texas Attorney General also has never brought a case under that state's law. Meanwhile, the Illinois Biometric Information Privacy Act (BIPA) may be the one legal tool that can be used to go after companies like Clearview.
BIPA's private right of action has been a recurring thorn in the sides of major social media companies and theme parks rolling out biometric technologies, but no one has really cogently argued that companies aren't flagrantly violating the law. Let's not forget that facial recognition settings were an underappreciated part of the FTC's most recent settlement with Facebook, too. However, no one can actually discuss how to tweak or modernize BIPA because industry groups have had a single-minded focus on stripping the law of all its private enforcement components.
Industry has acted in lockstep to insist it is unfair for companies to be subject to limitless liability by the omnipresent plaintiffs' bar for every minor or technical violation of the law. And that's the rub!
There is no rule that says a private right of action must encompass the entirety of a privacy law. One of the compromises that led to the California Consumer Privacy Act was the inclusion of a private right of action for certain unreasonable data breaches. Lawmakers can take heed and go provision by provision, specifying exactly what sorts of activities could be subject to private litigation, what the costs of the litigation might be, and what remedies can ultimately be obtained.
The U.S. Chamber of Commerce has been at the forefront of insisting that private rights of action are poor tools for addressing privacy issues, because they can "undermine appropriate agency enforcement" and hamper the ability of expert regulators "to shape and balance policy and protections." But what's the objection, then, in areas where that's not true?
The sharing and selling of geolocation information has become especially pernicious, letting companies infer sensitive health conditions and facilitating stalking. Can any industry voice argue that companies have been well-behaved when it comes to how they use location information? The FTC clearly stated in 2012 that precise geolocation data was sensitive information warranting extra protections. Flash forward to 2018 and 2019, where The New York Times was engaged in annual exposés on the wild west of apps and services buying and selling "anonymous" location data. Meanwhile, the Communications Act requires carriers to protect geolocation data, and yet in February of this year the FCC fined all four major wireless carriers a combined $200 million for sharing their subscribers' geolocation data with bounty hunters and stalkers.
Businesses do not need regulatory clarity when it comes to location data -- companies need to be put in a penalty box for an extended timeout. Giving individuals the ability to seek private injunctive relief seems hardly objectionable given this track record. Permitting class actions for intentional violations of individuals' geolocation privacy should be on the table, as well.
There should be more to discuss than a universe where trial attorneys sue every company for every privacy violation or a world where lawmakers hand the FTC a blank check. Unfortunately, no one has yet put forward a vision for what the optimum level of privacy enforcement should be. Privacy researchers, advocates, and vulnerable communities have forcefully said the status quo is not sufficient. If industry claims it understands the importance of protecting privacy but just needs more clarity about what the rules are, companies should begin by putting forward some plans for how they will help individuals, families, and communities when they fall short.
Joseph Jerome, CIPP/US, is a privacy and cybersecurity attorney based in Washington, D.C. He currently is the Director of Multistate Policy for Common Sense Media.