
UK dials up the spin on data reform, claiming ‘simplified’ rules will drive ‘responsible’ data sharing

by Natasha Lomas, from Crunch Hype

The U.K. government has announced a consultation on plans to shake up the national data protection regime, as it looks at how to diverge from European Union rules following Brexit.

It's also a year since the U.K. published a national data strategy in which it said it wanted pandemic levels of data sharing to become Britain's new normal.

The Department for Digital, Culture, Media and Sport (DCMS) has today trailed an incoming reform of the Information Commissioner's Office - saying it wants to broaden the ICO's remit to "champion sectors and businesses that are using personal data in new, innovative and responsible ways to benefit people's lives"; and promising "simplified" rules to encourage the use of data for research which "benefits people's lives", such as in the field of healthcare.

It also wants a new structure for the regulator - including the creation of an independent board and chief executive for the ICO, to mirror the governance structures of other regulators such as the Competition and Markets Authority, Financial Conduct Authority and Ofcom.

Additionally, it said the data reform consultation will consider how the new regime can help mitigate the risks around algorithmic bias - something the EU is already moving to legislate on, setting out a risk-based proposal for regulating applications of AI back in April.

Which means the U.K. risks being left lagging if it's only going to concern itself with a narrow focus on "bias mitigation", rather than considering the wider sweep of how AI is intersecting with and influencing its citizens' lives.

In a press release announcing the consultation, DCMS highlights an artificial intelligence partnership involving Moorfields Eye Hospital and the University College London Institute of Ophthalmology, which kicked off back in 2016, as an example of the kinds of beneficial data sharing it wants to encourage. Last year the researchers reported that their AI had been able to predict the development of wet age-related macular degeneration more accurately than clinicians.

The partnership also involved (Google-owned) DeepMind and now Google Health - although the government's PR doesn't make mention of the tech giant's involvement. It's an interesting omission, given that DeepMind's name is also attached to a notorious U.K. patient data-sharing scandal, which saw another London-based NHS Trust (the Royal Free) sanctioned by the ICO, in 2017, for improperly sharing patient data with the Google-owned company during the development phase of a clinician support app (which Google is now in the process of discontinuing).

DCMS may be keen to avoid spelling out that its goal for the data reforms - aka to "remove unnecessary barriers to responsible data use" - could end up making it easier for commercial entities like Google to get their hands on U.K. citizens' medical records.

The sizeable public backlash over the most recent government attempt to requisition NHS users' medical records - for vaguely defined "research" purposes (aka the "General Practice Data for Planning and Research", or GPDPR, scheme) - suggests that a government-enabled big-health-data-free-for-all might not be so popular with U.K. voters.

"The government's data reforms will provide clarity around the rules for the use of personal data for research purposes, laying the groundwork for more scientific and medical breakthroughs," is how DCMS' PR skirts the sensitive health data sharing topic.

After years of inaction against adtech, UK's ICO calls for browser-level controls to fix "cookie fatigue"

Elsewhere there's talk of "reinforc[ing] the responsibility of businesses to keep personal information safe, while empowering them to grow and innovate" - so that sounds like a yes to data security, but what about individual privacy and control over what happens to your information?

The government seems to be saying that will depend on other aims - principally economic interests attached to the U.K.'s ability to conduct data-driven research or secure trade deals with other countries that don't have the same (current) high U.K. standards of data protection.

There are some purely populist flourishes here too - with DCMS couching its ambition for a data regime "based on common sense, not box ticking" - and flagging up plans to beef up penalties for nuisance calls and text messages. Because, sure, who doesn't like the sound of a crackdown on spam?

Except spam text messages and nuisance calls are a pretty quaint concern to zero in on in an era of apps and data-driven, democracy-disrupting mass surveillance - which was something the outgoing information commissioner raised as a major issue of concern during her tenure at the ICO.

The same populist anti-spam messaging has already been deployed by ministers to attack the need to obtain internet users' consent for dropping tracking cookies - which the digital minister Oliver Dowden recently suggested he wants to do away with - for all but "high risk" purposes.

Having a system of rights wrapping people's data that gives them a say over (and a stake in) how it can be used is being reframed in the government's messaging as irresponsible or even unpatriotic - with DCMS pushing the notion that such rights stand in the way of more important economic or highly generalized "social" goals.

Not that it has presented any evidence for that - nor any evidence that the U.K.'s current data protection regime got in the way of (the very ample) data sharing during COVID-19... Meanwhile, negative uses of people's information are being condensed in DCMS' messaging to the narrowest possible definition - spam that's visible to an individual - never mind how that person got targeted with the nuisance calls/spam texts in the first place.

UK's COVID-19 health data contracts with Google and Palantir finally emerge

The government is taking its customary "cake and eat it" approach to spinning its reform plan - claiming it will both "protect" people's data while also trumpeting the importance of making it really easy for citizens' information to be handed off to anyone who wants it, so long as they can claim they're doing some kind of "innovation" - all while larding its PR with canned quotes dubbing the plan "bold" and "ambitious".

So while DCMS' announcement says the reform will "maintain" the U.K.'s (currently) world-leading data protection standards, it directly rows back - saying the new regime will (merely) "build on" a few broad-brush "key elements" of the current rules (specifically, it says it will keep "principles around data processing, people's data rights and mechanisms for supervision and enforcement").

Clearly the devil will be in the detail of the proposals which are due to be published tomorrow morning. So expect more analysis to debunk the spin soon.

But in one specific trailed change DCMS says it wants to move away from a "one-size-fits-all" approach to data protection compliance - and allow organisations to "demonstrate compliance in ways more appropriate to their circumstances, while still protecting citizens' personal data to a high standard".

That implies that smaller data-mining operations - DCMS' PR uses the example of a hairdresser's, but plenty of startups employ fewer staff than the average barber's shop - may expect to get a pass to ignore those "high standards" in the future.

Which suggests the U.K.'s "high standards" may, under Dowden's watch, end up resembling more of a Swiss cheese...

Data protection is "a how to, not a don't do"...

The man who is likely to become the U.K.'s next information commissioner, New Zealand's privacy commissioner John Edwards, was taking questions from a parliamentary committee earlier today, as MPs considered whether to support his appointment to the role.

If he's confirmed in the job, Edwards will be responsible for implementing whatever new data regime the government cooks up.

Under questioning, he rejected the notion that the U.K.'s current data protection regime presents a barrier to data sharing - arguing that laws like GDPR should instead be seen as a "how to" and an "enabler" for innovation.

"I would take issue with the dichotomy that you presented [about privacy vs data-sharing]," he told the committee chair. "I don't believe that policymakers and businesses and governments are faced with a choice of share or keep faith with data protection. Data protection laws and privacy laws would not be necessary if it wasn't necessary to share information. These are two sides of the same coin.

"The UK DPA [Data Protection Act] and UK GDPR - they are a 'how to', not a 'don't do'. And I think the UK and many jurisdictions have really finally learned that lesson through the COVID-19 crisis. It has been absolutely necessary to have good quality information available, minute by minute. And to move across different organizations where it needs to go, without friction. And there are times when data protection laws and privacy laws introduce friction and I think that what you've seen in the UK is that when it needs to things can happen quickly."

He also suggested that plenty of economic gains could be achieved for the U.K. with some minor tweaks to current rules, rather than a more radical reboot being necessary. (Though clearly setting the rules won't be up to him; his job will be enforcing whatever new regime is decided.)

"If we can, in the administration of a law which at the moment looks very much like the UK GDPR, that gives great latitude for different regulatory approaches - if I can turn that dial just a couple of points that can make the difference of billions of pounds to the UK economy and thousands of jobs so we don't need to be throwing out the statute book and starting again - there is plenty of scope to be making improvements under the current regime," he told MPs. "Let alone when we start with a fresh sheet of paper if that's what the government chooses to do."

TechCrunch asked another Edwards (no relation) - Newcastle University's Lilian Edwards, professor of law, innovation and society - for her thoughts on the government's direction of travel, as signalled by DCMS' pre-proposal-publication spin, and she expressed similar concerns about the logic driving the government to argue it needs to rip up the existing standards.

"The entire scheme of data protection is to balance fundamental rights with the free flow of data. Economic concerns have never been ignored, and the current scheme, which we've had in essence since 1998, has struck a good balance. The great things we did with data during COVID-19 were done completely legally - and with no great difficulty under the existing rules - so that isn't a reason to change them," she told us.

She also took issue with the plan to reshape the ICO as "a quango whose primary job is to 'drive economic growth'" - pointing out that DCMS' PR fails to include any mention of privacy or fundamental rights, and arguing that creating an entirely new regulator "isn't likely to do much for the 'public trust' that's seen as declining in almost every poll."

She also suggested the government is glossing over the real economic damage that would hit the U.K. if the EU decides its "reformed" standards are no longer essentially equivalent to the bloc's. "[It's] hard to see much concern for adequacy here; which will, for sure, be reviewed, to our detriment - prejudicing 43% of our trade for a few low value trade deals and some hopeful sell offs of NHS data (again, likely to take a wrecking ball to trust judging by the GPDPR scandal)."

She described the goal of regulating algorithmic bias as "applaudable" - but also flagged the risk of the U.K. falling behind other jurisdictions which are taking a broader look at how to regulate artificial intelligence.

Per DCMS' press release, the government seems to intend for an existing advisory body, the Centre for Data Ethics and Innovation (CDEI), to have a key role in supporting its policymaking in this area - saying that the body will focus on "enabling trustworthy use of data and AI in the real-world". However, it has still not appointed a new CDEI chair to replace Roger Taylor - with only an interim chair appointment (and some new advisors) announced today.

"The world has moved on since CDEI's work in this area," argued Edwards. "We realise now that regulating the harmful effects of AI has to be considered in the round with other regulatory tools, not just data protection. The proposed EU AI Regulation is not without flaw but goes far further than data protection in mandating better quality training sets, and more transparent systems to be built from scratch. If the UK is serious about regulating it has to look at the global models being floated but right now it looks like its main concerns are insular, short-sighted and populist."

Patient data privacy advocacy group MedConfidential, which has frequently locked horns with the government over its approach to data protection, also queried DCMS' continued attachment to the CDEI for shaping policymaking in such a crucial area - pointing to last year's biased algorithm exam grading scandal, which happened under Taylor's watch.

(NB: Taylor was also the Ofqual chair, and his resignation from that post in December cited a "difficult summer", even as his departure from the CDEI leaves an awkward hole now...)

"The culture and leadership of CDEI led to the A-Levels algorithm, why should anyone in government have any confidence in what they say next?" said MedConfidential's Sam Smith.
