Tim Cook, CEO of Apple Inc., has written an open letter to customers explaining the company's controversial stance on the FBI's request to hack into the iPhone of one of the deceased perpetrators of the San Bernardino terrorist attack. To date, Apple are not complying with the request.

As Tim explained, hacking* into this phone could set a dangerous precedent for consumer privacy and security. However, public opinion and analysis of Apple's stance are heavily influenced by the fact that this is a terrorist's iPhone that could hold valuable information, so surely, the thinking goes, exceptions must be made.

I’m conflicted on those two points: privacy and security vs. crime and justice.

There is a moral case for hacking into this phone, but at the same time, the very act of doing so could cause widespread changes to the way encryption and privacy are handled by the companies we entrust with our data.

The challenge, as Tim explains it, is that if backdoors are built into Apple's products and services, those backdoors could be exploited by people not authorised to use them: not law enforcement, but hackers. A backdoor is effectively a weakness in the security system.

A second issue, which Tim doesn't raise, is that even if Apple were to create a backdoor into its iPhones, nothing stops a user from storing their data with additional encryption tools that have no backdoor. So even if the FBI could get around a person's iPhone passcode, the user may have further encrypted their data inside apps that remain impenetrable.
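To sketch that point: an app can derive its encryption key from a separate passphrase the user types in, so bypassing the device passcode reveals nothing. A minimal, hypothetical illustration in Python using standard PBKDF2 key derivation (the function name and iteration count are my own choices, not anything Apple ships):

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 256-bit key from a user passphrase with PBKDF2-HMAC-SHA256.

    The key exists only while the passphrase is supplied; it is never stored
    on the device, so unlocking the phone itself does not expose the data.
    """
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

# A fresh random salt is stored alongside the ciphertext; it is not secret.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32  # 256-bit key, ready for use with a cipher such as AES
```

The high iteration count is the point: each passphrase guess costs real CPU time, so even with full access to the stored (salted) data, brute-forcing a strong passphrase is impractical.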

It's ironic that, a few years ago, law enforcement asked industry to strengthen encryption to stop crime and international/state-sponsored hacking, and now it's asking for it to be weakened.

Encryption needs to be strong and impenetrable. It's what stops someone from draining your bank account or accessing your private medical records, and what lets you back up your private data and files to online cloud services without worrying. If backdoors are constructed and encryption weakened, then not only might law enforcement be able to access your data (fine), but criminals could use the same weakness to further their goals.

*The iPhone in question is an iPhone 5C, which lacks the hardware Secure Enclave, so Apple could feasibly construct new firmware (the so-called FBI iOS) and allow the FBI to try unlimited passcodes without causing the data to wipe. According to industry sources, this would not be possible with the iPhone 5S and later.
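To put that footnote in numbers: a 4-digit passcode has only 10,000 combinations, and iOS normally enforces escalating delays plus an optional wipe after ten failed attempts. Remove those protections, as the requested firmware would, and brute force becomes trivial. A back-of-the-envelope sketch (the attempt rate is an illustrative assumption, not a measured figure):

```python
# Assumed rate at which passcodes could be tried once delays and the
# auto-wipe are disabled. This figure is an illustration only.
ATTEMPTS_PER_SEC = 12.5

def worst_case_seconds(digits: int) -> float:
    """Seconds needed to exhaust every numeric passcode of the given length."""
    return 10 ** digits / ATTEMPTS_PER_SEC

print(f"4-digit: {worst_case_seconds(4) / 60:.0f} minutes")  # 10,000 codes
print(f"6-digit: {worst_case_seconds(6) / 3600:.1f} hours")  # 1,000,000 codes
```

Whatever rate you assume, the conclusion holds: without attempt limits, a short numeric passcode falls in minutes to hours, which is exactly why the wipe-and-delay protections matter.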


By Matthew Skipsey, Technical Director