Apple says researchers can vet its child safety features. But it’s suing a startup that does just that.
When Apple announced new technology that will check its US iCloud service for known child sexual abuse material, it was met with fierce criticism over worries that the feature could be abused for broad government surveillance. Faced with public resistance, Apple insisted that its technology can be held accountable.
"Security researchers are constantly able to introspect what's happening in Apple's [phone] software," Apple vice president Craig Federighi said in an interview with the Wall Street Journal. "So if any changes were made that were to expand the scope of this in some way, in a way that we had committed to not doing, there's verifiability: they can spot that that's happening."
Apple is suing a company that makes software to let security researchers do exactly that.
In 2019, Apple filed a lawsuit against Corellium, which lets security researchers cheaply and easily test mobile devices by emulating their software rather than requiring them to access the physical devices. The software, which also emulates Android devices, can be used to find security problems in those devices so they can be fixed.
In the lawsuit, Apple argued that Corellium violated its copyrights, enabled the sale of software exploits used for hacking, and shouldn't exist. The startup countered by saying that its use of Apple's code was a classic protected case of fair use. The judge has largely sided with Corellium so far. Part of the two-year case was settled just last week, days after news of the company's CSAM technology became public.
On Monday, Corellium announced a $15,000 grant for a program it is specifically promoting as a way to look at iPhones under a microscope and hold Apple accountable. On Tuesday, Apple filed an appeal continuing the lawsuit.
In an interview with MIT Technology Review, Corellium's chief operating officer, Matt Tait, said that Federighi's comments do not match reality.
"That's a very cheap thing for Apple to say," he says. "There is a lot of heavy lifting happening in that statement."
"iOS is designed in a way that's actually very difficult for people to do inspection of system services."
Matt Tait, Corellium
He is not the only one disputing Apple's position.
"Apple is exaggerating a researcher's ability to examine the system as a whole," says David Thiel, chief technology officer at Stanford's Internet Observatory. Thiel, the author of a book called iOS Application Security, tweeted that the company spends heavily to prevent the very research it now claims is possible.
"It requires a convoluted system of high-value exploits, dubiously sourced binaries, and outdated devices," he wrote. "Apple has spent vast sums specifically to prevent this and make such research difficult."
Surveillance accountability
If you want to see exactly how Apple's complex new tech works, you can't simply look inside the operating system on the iPhone you just bought at the store. The company's "walled garden" approach to security has helped solve some fundamental problems, but it also means the phone is designed to keep visitors out, whether they're wanted or not.
(Android phones, meanwhile, are fundamentally different. While iPhones are famously locked down, all you need to do to unlock an Android is plug in a USB device, install developer tools, and gain top-level root access.)
Apple's approach leaves researchers locked in a never-ending battle with the company to gain the level of insight they require.
There are a few possible ways Apple and security researchers could verify that no government is weaponizing the company's new child safety features, however.
Apple could hand over the code for review, though this is not something it has said it will do. Researchers can also try to reverse-engineer the feature in a "static" manner (that is, without executing the actual programs in a live environment).
Realistically, however, neither of those methods allows you to look at the code running live on an up-to-date iPhone to see how it actually works in the wild. Instead, they still rely on trust: not merely that Apple is being open and honest, but also that it has written the code without any significant errors or oversights.
Another possibility would be to grant access to the system to members of Apple's Security Research Device Program in order to verify the company's statements. But that group, which is made up of researchers from outside Apple, is bound by so many rules on what its members can say or do that, Thiel argues, it doesn't necessarily solve the problem of trust.
"Apple has spent a lot of money trying to prevent people from being able to jail-break phones."
David Thiel, Stanford Internet Observatory
That really leaves only two options. First, hackers can jail-break old iPhones using a zero-day vulnerability. That's difficult and expensive, and it can be shut down with a security patch.
"Apple has spent a lot of money trying to prevent people from being able to jail-break phones," Thiel explains. "They've specifically hired people from the jail-breaking community to make jail-breaking more difficult."
Or a researcher can use a virtual iPhone that can turn Apple's security features off. In practice, that means Corellium.
There are also limits to what any security researcher will be able to observe, but if Apple scans anything beyond photos being shared to iCloud, a researcher might be able to spot that.
However, if anything other than child abuse material makes it into the databases, that would be invisible to researchers. To address that concern, Apple says it will require that two separate child protection organizations in distinct jurisdictions have the same abuse image in their own databases before a hash is included. But it offered few details about how that would work, who would run the databases, which jurisdictions would be involved, and what the ultimate sources of the databases would be.
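As a rough illustration of that safeguard (a conceptual sketch only, not Apple's implementation; the function and hash names here are hypothetical), the idea is that an image hash only becomes eligible for matching if it appears independently in both organizations' databases:

```python
# Conceptual sketch of the cross-jurisdiction safeguard Apple has described:
# a hash only enters the matching set if two independent child-safety
# organizations, in different jurisdictions, both list it.
# All names and values here are hypothetical illustrations.

def eligible_hashes(org_a_db: set[str], org_b_db: set[str]) -> set[str]:
    """Return only the hashes vetted independently by both organizations."""
    return org_a_db & org_b_db  # set intersection

# A hash submitted by only one organization never makes it in.
org_a_db = {"hashA", "hashB", "hashC"}
org_b_db = {"hashB", "hashC", "hashD"}
print(eligible_hashes(org_a_db, org_b_db))  # {'hashB', 'hashC'}
```

The open questions the article raises, who runs the databases and what ultimately feeds them, sit outside anything a sketch like this can capture.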
Real difficulties
Thiel points out that the problem Apple is trying to solve is real.
"It's not a theoretical concern," he says of child sexual abuse material. "It's not something that people bring up just as an excuse to implement surveillance. It is an actual problem that is widespread and needs addressing. The solution is not like getting rid of these kinds of mechanisms. It's making them as impermeable as possible to future abuse."
But, says Corellium's Tait, Apple is trying to be simultaneously locked down and transparent.
"Apple is trying to have their cake and eat it too," says Tait, a former information security specialist for the British intelligence service GCHQ.
"With their left hand, they make jail-breaking difficult and sue companies like Corellium to prevent them from existing. Now with their right hand, they say, 'Oh, we built this really complicated system and it turns out that some people don't trust that Apple has done it honestly, but it's okay because any security researcher can go ahead and prove it to themselves.'"
"I'm sitting here thinking, what do you mean that you can just do this? You've engineered your system so that they can't. The only reason that people are able to do this kind of thing is despite you, not thanks to you."
Apple did not respond to a request for comment.