Still Not 'Going Dark': Device Encryption Still Contains Plenty Of Exploitable Flaws
Law enforcement -- especially at the federal level -- has spent a great deal of time complaining about an oddity known only to the FBI and DOJ as "warrant-proof" encryption. Device users and customers just call this "encryption" and realize this protects them against criminals and malicious hackers. The federal government, however, sees device encryption as a Big Tech slap in the face. And so they complain. Endlessly. And disingenuously.
First off, law enforcement has access to a wide variety of tech solutions. It also has access to plenty of communications and other data stored in the cloud or by third parties that encryption can't protect. And it has the users themselves, who can often be persuaded to allow officers to search their devices without a warrant.
Then there's the protection being handed out to phone users. It's got its own problems, as Matthew Green points out:
Authorities don't need to break phone encryption in most cases, because modern phone encryption sort of sucks.
More specifically, even the gold standard for encryption (Apple's) still leaves some stuff effectively unencrypted. Once unlocked after a period of rest (say, first thing in the morning), the phone enters an "AFU" (after first unlock) state in which crypto keys are held in the phone's memory. They stay there until evicted, and most everyday phone use never evicts them. Worse, keys are evicted selectively rather than all at once, leaving several sets resident in memory where cops (and criminals!) using phone-cracking tech can still access them.
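To make the distinction concrete, here's a minimal Swift sketch using Apple's documented Data Protection API. The protection class an app assigns to a file determines when that file's decryption key gets evicted from memory; the function and file name below are illustrative, not from the report.

```swift
import Foundation

// A minimal sketch of iOS Data Protection classes. The class assigned to a
// file decides when its decryption key is evicted from memory.
// (Function and file name are hypothetical examples.)
func saveNote(_ text: String) throws {
    let url = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("note.txt")

    // Strongest class: the key is evicted every time the device locks,
    // so the file is unreadable on a seized, locked phone.
    try Data(text.utf8).write(to: url, options: .completeFileProtection)

    // The weaker AFU class described above: the key is derived at first
    // unlock and stays resident until the phone powers down, leaving the
    // file readable to tools that exploit a locked-but-powered-on phone.
    try FileManager.default.setAttributes(
        [.protectionKey: FileProtectionType.completeUntilFirstUserAuthentication],
        ofItemAtPath: url.path
    )
}
```

Files in that second, AFU class are exactly the ones a forensic tool can reach on a phone that's been seized while locked but still powered on.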
A report [PDF] put together by Matthew Green, Maximilian Zinkus, and Tushar Jois highlights the exploitable flaws in the device encryption efforts of Apple, Google, and other device manufacturers. And there's not a lot of darkness going on, despite law enforcement's protestations.
This reaction is best exemplified by the FBI's "Going Dark" initiative, which seeks to increase law enforcement's access to encrypted data via legislative and policy initiatives. These concerns have also motivated law enforcement agencies, in collaboration with industry partners, to invest in developing and acquiring technical means for bypassing smartphone security features. This dynamic broke into the public consciousness during the 2016 "Apple v. FBI" controversy, in which Apple contested an FBI demand to bypass technical security measures. However, a vigorous debate over these issues continues to this day. Since 2015 and in the US alone, hundreds of thousands of forensic searches of mobile devices have been executed by over 2,000 law enforcement agencies, in all 50 states and the District of Columbia, which have purchased tools implementing such bypass measures.
The research here shows there's no need for legislative mandates or court orders to access most of the contents of suspects' iPhones. There's plenty to be had just by exploiting the shortcomings of Apple's built-in encryption.
[W]e observed that a surprising amount of sensitive data maintained by built-in applications is protected using a weak "available after first unlock" (AFU) protection class, which does not evict decryption keys from memory when the phone is locked. The impact is that the vast majority of sensitive user data from Apple's built-in applications can be accessed from a phone that is captured and logically exploited while it is in a powered-on (but locked) state.
This isn't theoretical. This has actually happened.
[W]e found circumstantial evidence in both the DHS procedures and investigative documents that law enforcement now routinely exploits the availability of decryption keys to capture large amounts of sensitive data from locked phones. Documents acquired by Upturn, a privacy advocate organization, support these conclusions, documenting law enforcement records of passcode recovery against both powered-off and simply locked iPhones of all generations.
Utilizing Apple's iCloud storage greatly increases the risk that a device's contents can be accessed. Using iCloud to sync messages results in the decryption key being uploaded to Apple's servers, which means law enforcement, Apple, and malicious hackers all have potential access. Device-specific file encryption keys also make their way to Apple via other iCloud services.
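Apps can at least keep their own files out of that pipeline. Here's a hedged Swift sketch using the documented isExcludedFromBackup resource value (the function and path are hypothetical) to flag a file so it never enters iCloud or local backups:

```swift
import Foundation

// A minimal sketch: opting a file out of iCloud/device backups so its
// contents -- and the keys protecting them -- never reach Apple's servers.
// (Function name and path are hypothetical.)
func excludeFromBackup(atPath path: String) throws {
    var url = URL(fileURLWithPath: path)
    var values = URLResourceValues()
    values.isExcludedFromBackup = true  // documented URLResourceValues flag
    try url.setResourceValues(values)
}
```

Of course, that flag only covers data an app writes itself. Built-in services like iMessage's iCloud sync are outside any app's control, which is exactly the report's concern.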
Over on the Android side, it's a bigger mess. Multiple providers and manufacturers each run their own update services, and each phases out support on its own schedule, leaving users confused about which devices still receive software updates and the latest encryption tech. Cheaper devices sometimes skip these niceties entirely, making low-cost options the most vulnerable to exploitation. And while the DOJ and FBI may spend the most time complaining about Apple, Apple commands only about 15% of the smartphone market, meaning most devices law enforcement seizes aren't protected by its supposedly "impenetrable" encryption.
Google's cloud services offer almost no protection for Android users. App developers must opt in to certain security measures, and in most cases data backed up to Google's cloud is protected only by encryption keys Google holds, not by keys held by the user who uploaded it. Encryption isn't much of a barrier, and neither is the legal system: a great deal of third-party data -- like the comprehensive data sets maintained by Google -- can be accessed with nothing more than a subpoena.
The rest of the report digs deep into the strengths and limitations of the encryption offered to phone users. But the conclusion remains unchanged: law enforcement has multiple ways to access the contents of encrypted devices. And some of these solutions scale pretty easily; they're not cheap, but they're definitely affordable. While there will always be those who "got away," law enforcement isn't being hindered much by encryption that provides security to all phone users, whether or not they're suspected of criminal activity.