When Apple launched its iPhone X in September, a new 3D depth-sensing camera was one of the biggest selling points. On stage at its event in Apple Park, Apple software chief Craig Federighi demonstrated how it could be used to mimic a user’s facial expressions with cartoon emoji, such as robots or monkeys.
“If you were wondering what humanity would do when given the most advanced facial tracking technology available, now you have your answer,” Mr Federighi joked.
Yet the same TrueDepth camera system also underpins a new security system called FaceID. iPhone X owners are able to unlock their phone, log into apps and buy things using Apple Pay by just looking into the device’s camera.
It is an example of how Apple tries to balance security with usability — to make it as easy as possible for its hundreds of millions of customers to protect themselves and the vast array of personal information that they entrust to their iPhones.
Apple says that FaceID is more secure than the fingerprint-based system that it replaces, because the probability that a random person could look at an iPhone and unlock it is one in a million, compared with one in 50,000 for TouchID's fingerprint sensor.
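The gap between those two error rates can be made concrete with a little arithmetic. A minimal sketch, using only the one-in-a-million and one-in-50,000 figures that Apple has published:

```python
# Apple's published false-accept odds: a random person has a
# 1-in-1,000,000 chance of matching FaceID, versus 1-in-50,000
# of matching the TouchID fingerprint sensor.
FACE_ID_ODDS = 1_000_000
TOUCH_ID_ODDS = 50_000

# How many times less likely a random FaceID match is.
improvement = FACE_ID_ODDS // TOUCH_ID_ODDS
print(f"A random FaceID match is {improvement}x less likely than TouchID")
```

In other words, by Apple's own figures a random false match is 20 times less likely with FaceID than with TouchID.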
The real secret to both FaceID and TouchID, however, is that they are easier to use than manually typing a numerical passcode to unlock the phone. More customers, therefore, are likely to leave this vital security protection turned on. Even the most rock-solid security system is useless if customers turn it off because it is too complicated.
“FaceID makes using a longer, more complex passcode far more practical because you don’t need to enter it as frequently,” Apple said in a white paper on the system’s security published soon after the iPhone X launch.
Apple’s security team, led by Ivan Krstić, has won increasing respect from researchers in the field over the past few years. Typically, as the volume and variety of a company’s devices on the market increase, security deteriorates. With Apple, even after more than 1.2bn iPhones sold over 10 years, its security has kept improving.
iPhones and iPads “are legitimately the most secure phones and tablets out there”, says Rich Mogull, chief executive of Securosis, an independent security research and advisory firm. “I don’t know if I can put a timeline on when Apple’s culture changed, but it did,” he says. “They take security and privacy very seriously now and they are getting a little better with every release of hardware and software.”
One key ingredient is the Secure Enclave, an encrypted “coprocessor” in the iPhone’s A-series chips that was first introduced with the iPhone 5s in 2013.
This was the “underpinning for a significant step forward in their security model”, says Pepijn Bruienne, research and development engineer at Duo Security. “They can embed the security architecture at the silicon layer.”
As Apple has built up the power of its processors, more complex tasks — such as recognising a face from TrueDepth’s 30,000 infrared dots — can be done without having to send any data to the cloud, which might present new security or privacy risks. “FaceID data doesn’t leave your device and is never backed up to iCloud or anywhere else,” Apple’s security paper says.
Strict controls in Apple’s App Store also help keep the system secure. As well as requiring every new app submitted to the store to be reviewed by Apple’s staff before consumers are allowed to download it, the iOS operating system is much more restrictive than Google’s rival, Android, in what apps are able to do.
“The app can’t just go on your phone and start requesting access to your location or contacts” without the user granting their permission, says Andrew Blaich, a researcher at mobile security specialist Lookout. There are also restrictions on reading text messages, overlaying ads and running in the background. “Apple have insulated themselves from a lot of the common attacks that we see on the Android platform day to day,” he says.
As a result, in the fourth quarter of 2016 and first quarter of 2017, 47 in 1,000 Android enterprise devices protected by Lookout encountered app-based threats, compared with only 1 in 1,000 iOS devices.
In August 2016, however, Apple’s pristine security record was put to the test. Lookout, working with the University of Toronto’s Citizen Lab, identified the first “zero-day” vulnerability in iOS (one previously unknown to the vendor) to be found in the wild. The exploit, a program designed to take advantage of such loopholes, known as Pegasus, was discovered when a human rights activist in the United Arab Emirates received a text message containing a suspicious link which, if clicked, would have given attackers the ability to intercept his calls and messages.
Though the discovery was a blemish on the iPhone’s record for security, Apple was quick to respond. Within 10 days, it had issued an update to iOS that patched the problem. Here again, Apple’s end-to-end control of its iPhones was vital.
“Because they don’t have to go through multiple levels of hardware manufacturers and carriers to get approval, which can take days, weeks or months, they can push out the patches quickly to a wide variety of devices,” Mr Blaich says. “Getting a similar update for Android out to all their devices would have taken years.”
Many in the security industry believe that other zero-day exploits may exist for iOS but are in the hands of government agencies, rather than being sold commercially on underground hacker marketplaces like many such vulnerabilities.
Apple’s hardline stance on protecting users’ privacy has been a source of conflict with governments, most notably in the case of the San Bernardino mass shooting, when Apple refused to hack the gunman’s iPhone at the behest of the FBI. Ultimately the FBI paid a third party to unlock the phone without Apple’s help, showing that the company’s encryption can be bypassed, albeit on an iPhone 5c, which did not have the new Secure Enclave.
“Some day that zero-day may not be hidden in a government agency and will be used for some mass exploitation,” says Mr Mogull. “But right now they have had a really good track record. Ten years of iPhone and no major malware? That’s unheard of.”