Apple’s Encryption Plan Is Good, But Still Leaves Some Questions Unanswered

Recently, Apple announced some quality-of-life updates for services and devices used by millions. The company opted to give its users more privacy and security by offering them the option to fully encrypt data stored in its cloud service. For years, iCloud accounts have been the end-around for encrypted devices, allowing law enforcement (and malicious hackers) to access content and communications that are inaccessible through the device that created them.
Apple's announcement made millions of users happy and irritated a single federal law enforcement agency: the FBI. The FBI has, through its directors, consistently advocated for an impossibility: compromised encryption that only law enforcement can utilize. The problem with encryption is that it either works or it doesn't. Punching law enforcement-specific holes in it makes it easier for everyone to undermine this protection.
So, the FBI delivered a statement decrying this decision by Apple, claiming it would negatively affect every law enforcement agency. Despite this claim, the FBI was the only law enforcement agency to issue a public statement on iCloud encryption - one issued by an agency that's well into its fifth year of refusing to correct its own misinformation about the negative impact encryption is allegedly having on law enforcement.
Stanford research scholar Riana Pfefferkorn has more to say about Apple's recently announced changes. While most of the news is good, there are still some concerns Apple's statements and iOS alterations don't address.
First, there's the unquestionably good news about Apple's permanent abandonment of plans to engage in client-side content scanning "for the children."
Apple confirmed that it is conclusively abandoning the controversial plan it had announced last summer to client-side scan photos as they're being uploaded to iCloud to look for child sex abuse material (CSAM). That plan was put on pause after massive pushback from civil society (including yours truly). Observers commented at the time that this client-side scanning feature, where the scanning would happen before an image hit the cloud, only made sense if Apple was planning to E2EE iCloud (since scanning wouldn't be possible in an E2EE cloud) - and here we are.
Which brings us to the E2EE added to iCloud. It's not on by default. It's an option users need to affirmatively implement. This seems like some sort of internal compromise by Apple, which announced plans to encrypt iCloud content nearly a half-decade ago, but (temporarily) abandoned those plans in order to appease the progressively-more-livid FBI, which had just gotten its compelled decryption dreams dashed by a federal court.
The FBI has proven unworthy of appeasement. It has spent much of the past half-decade lying about the extent of the problem encryption poses to investigators and insinuating that a solution could be had if only the people smart enough to make it happen would stop playing dumb.
There are still plenty of iCloud accounts that won't be encrypted. First and foremost, there will be all the accounts left unprotected by Apple customers who are either unaware of this option or decide they've got nothing to hide, etc. Then there are those specifically carved out of this security measure by Apple - something that appears to be another concession to one particular law enforcement agency.
What's more, not everyone gets to choose it: managed Apple IDs and child accounts aren't eligible for ADP [Advanced Data Protection], according to Apple's documentation.
That decision by Apple is a little strange. And it's potentially going to be a legal issue for the company. As Pfefferkorn points out, the UK and California have enacted laws that increase default protections for children's data. By making this option unavailable to its minor users, Apple may find itself in breach of the law. The UK's Age Appropriate Design Code, California's similarly titled law, and the (horrendous) Kids Online Safety Act all contain mandates that could make this decision by Apple a criminal offense.
All three require online services to set minors' accounts to the most privacy-protective settings by default. Because it's E2EE, ADP is inarguably a higher privacy setting than Apple's standard data protection for iCloud. That seems to tee up a potential conflict with these requirements, insofar as (1) ADP is opt-in, not on by default, and (2) child accounts are not eligible for ADP at all.
This creates unresolved problems under UK and California law. Somewhat hilariously, the conflict created by the Kids Online Safety Act could actually result in encryption mandates that cover accounts created and run by minors.
In bill text that I believe just came out yesterday, available here, section 4(a)(3) says,
"DEFAULT SAFEGUARD SETTINGS FOR MINORS.-A covered platform shall provide that, in the case of a user that the platform knows or should know is a minor, the default setting for any safeguard ... shall be the option available on the platform that provides the most protective level of control that is offered by the platform over privacy and safety for that user."
(The bill defines "covered platform" to mean "a commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.")
There's no "compelling reason/best interest" carve-out here. It therefore seems to me that Senator Blumenthal's bill would require Apple to not just make ADP available for child accounts, but opt them all into it, so that all children's iClouds are E2EE by default.
This is going to generate some friction between the FBI (which wants encryption to be compromised) and federal legislators (who would have to add an encryption mandate targeting Apple). The friction will be wonderful to enjoy secondhand, given that both parties claim they're doing everything they can "for the children."
Pfefferkorn says this is not the end of the questions raised by Apple's ADP plans. Since E2EE of iCloud content isn't on by default, it could mean Apple is still eyeing plans to scan content for CSAM, although most likely not on the client side. A large percentage of Apple users are unlikely to opt into encryption, so Apple can continue to scan content "for the children" while truthfully claiming users (other than children, at the moment) can opt out of this scanning any time by opting into its new E2EE offering.
Perhaps most questionable of all (although not really a question, per se) is this Apple claim, highlighted in Pfefferkorn's excellent post:
In announcing ADP, Apple claimed that E2EE iCloud will eventually be offered in China, too, as part of ADP's gradual global rollout.
Sure, pal. Call me when that happens. Let's move on.
This seems, at best, highly unlikely. Giving Chinese users access to E2EE will likely just see Apple blocked from selling phones in the country that makes so many of them. It also would mean fully abandoning China as a manufacturing base since the government isn't going to give its blessing to Apple manufacturing plants when it knows the end result is Chinese citizens locking the government out of access to locally stored (yes, you read that right) data and communications.
The timetable alone for reducing Apple's dependence on Chinese manufacturing would likely push E2EE implementation in China back for several years. Apple's dependence on Chinese citizens as customers pushes this back even further, putting it right on the border of "decades away" and "never."
Apple's recent security changes are still a net good, at least for users who can and will take advantage of them. But just because the change is positive doesn't mean Apple shouldn't remain under scrutiny. Every action has reactions, and it appears, at least at this point, that acting on behalf of most of its customers may again put Apple at odds with governments all over the world.