Growing impatient with the restrained debate among the Washington policy wonks, the bearded Silicon Valley engineer stood up and took aim at the man on the stage.

Admiral Mike Rogers, the director of the National Security Agency, had just made his case for a new legal framework to allow the government to monitor data surging through US computer networks. Rising to challenge him was Alex Stamos, then the top security engineer at Yahoo, who denounced the idea that tech companies should build “back doors” into their systems to give governments access to information.

“If we’re going to build defects, back doors or golden master keys for the US government, do you believe — we have about 1.3bn users around the world — we should do [the same] for the Chinese government, the Russian government, the Saudi Arabian government, the Israeli government, the French government?” Mr Stamos asked.

Mr Rogers initially tried to laugh off the attack. But then he struck back against the tech industry’s claims of responsibility to protect users from the prying eyes of government.

“This simplistic characterisation of one side is good and one side is bad is a terrible place for us to be as a nation,” Mr Rogers said. “We have got to come to grips with some really hard, fundamental questions.”

The face-off in February was a dramatic demonstration of the impasse between the US government and America’s most innovative industry. The battle is over encryption — software that uses secret keys to protect credit card details, private emails and corporate secrets from cybercriminals. While tech companies are embracing encryption, Mr Rogers and top Obama administration officials are seeking the right to use secret keys to track terrorists and other criminals.

Before Edward Snowden leaked details of the NSA’s mass surveillance tactics two years ago, only a few technology services used the tough type of encryption that scrambles information so even the tech and communications companies cannot read it: Apple’s iMessage and FaceTime, along with Microsoft’s Skype. But after the revelations, the industry faced a backlash from consumers who felt the tech groups were complicit in allowing their data to be monitored. Now tougher encryption is fast becoming a standard — perhaps offering consumers more privacy protections, but also potentially posing challenges for law enforcement.

WhatsApp, the messaging app owned by Facebook, turned on strong encryption software for Android users late last year. Google and Yahoo are working on a project to bring a similar level of security to their email services by the end of the year. As a result, their users — the companies have a combined 2bn accounts — will at least have the option to encrypt their communications end-to-end.

Yet this kind of encryption can also render invisible information that government officials claim is vital to law enforcement and national security. In the words of James Comey, director of the US Federal Bureau of Investigation, the rise in encryption has meant that huge swaths of the internet have “gone dark”, making it harder to track terrorists and other criminals. In Europe — where the post-Snowden backlash against US tech was the loudest — fears of teenagers using encrypted messaging to communicate with fighters from the Islamic State of Iraq and the Levant (Isis) have led to the threat of new laws to try to stop the tide of encryption.

Until recently, encryption was prohibitively expensive to use on a large scale. Algorithms scramble information into unreadable form, then unscramble it for authorised readers, and both steps consume significant processing power.
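The scramble-and-unscramble round trip can be illustrated with a deliberately simplified sketch — a toy XOR cipher, not a production algorithm such as AES, and not secure for reuse — showing that the same secret key both locks and unlocks the data:

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Combine each byte of data with the matching key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"credit card: 4111 1111 1111 1111"
key = secrets.token_bytes(len(message))   # secret key, as long as the message

ciphertext = xor_bytes(message, key)      # scramble: unreadable without the key
plaintext = xor_bytes(ciphertext, key)    # unscramble with the same key

assert plaintext == message
assert ciphertext != message
```

Real systems repeat operations like this many times over with carefully designed algorithms, which is where the processing cost comes from.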

The relentless fall in the cost of computing was already starting to change that. But tech companies, eager to repair their reputations with customers, have embraced the technology. Google has led the drive, having been humiliated by the revelation that its own government — along with that of the UK — had hacked into its internal network and tapped a trove of unencrypted data. The news highlighted a blind spot in the defences of a company that prided itself on leading the industry in security. Google quickly extended encryption to information it moves wholesale between its own data centres — and launched a campaign to persuade others to use encryption as a default.

Sorry, can’t help you

But if the greater use of technology like this has made illicit government surveillance and criminal hacking more difficult, it is the spread of so-called “strong”, or end-to-end, encryption that has really alarmed law enforcement agencies. These systems take the technology a step further and make it impossible even for the companies that process or carry the data to unscramble it — therefore preventing governments from demanding they hand over information, even with a court order.
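The end-to-end property rests on the two endpoints agreeing a key between themselves. A minimal sketch of Diffie-Hellman key agreement — with toy parameters, not production-sized groups — shows why the company carrying the messages never holds the secret:

```python
import secrets

# Illustrative small prime and generator; real systems use 2048-bit+
# groups or elliptic curves.
p = 2**127 - 1   # a Mersenne prime
g = 3

# Each endpoint picks a private key and publishes only g^key mod p.
alice_priv = secrets.randbelow(p - 2) + 1
bob_priv = secrets.randbelow(p - 2) + 1
alice_pub = pow(g, alice_priv, p)
bob_pub = pow(g, bob_priv, p)

# The messaging service relays only alice_pub and bob_pub.
# Each side combines its own private key with the other's public value:
alice_shared = pow(bob_pub, alice_priv, p)
bob_shared = pow(alice_pub, bob_priv, p)

assert alice_shared == bob_shared   # same secret, never transmitted
```

Because the shared secret is computed independently at each end and never sent over the wire, a court order served on the carrier yields nothing it can decrypt.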

IBM, Cisco and others that sell IT and communications systems to companies and governments were also hit hard by the Snowden leaks. Among other things, the revelations fuelled a nationalistic backlash in China, as government buyers turned to local suppliers, and gave the US companies good reason to come up with stronger guarantees about data security.

In response, IBM this year opened its mainframe computer technology to allow customers to use their own encryption algorithms — in effect, giving them full control over their information and making it impossible for IBM, or any outside government, to read it.

Big Blue has gone further, licensing its server chip technology to Chinese manufacturers in a way that gives them control over encryption, says Martin Schroeter, IBM’s chief financial officer — a measure he says was intended to reinforce trust and confidence in the US company’s technology.

Other data services used by businesses have taken a similar stance. Dropbox, one of the most widely used “cloud” data storage services, has opened its platform to let users bring their own encryption. Tech companies argue that such tactics guarantee both the security of digital data and user privacy in the face of both government over-reach and increasingly sophisticated cyber criminals.

As an executive at one big US tech company explains: “We don’t own our customers’ data — we don’t want to be in a situation where we have to hand it over [to government] just because we happen to be processing it.”

Tech executives hope that drawing a clear line will also address doubts about some of their other practices. The Snowden scandal raised new questions about companies such as Google, which rely on collecting and analysing large amounts of data about their users to sell advertising.

A White House report into Snowden shone the spotlight on the “big data” practices of tech companies like Google. In Silicon Valley, it was seen as a blatant attempt by Washington to deflect attention from its illicit surveillance by dragging the industry through the mud. Offering users deeper encryption is one way to respond to these disputes.

Return of the ‘crypto wars’

The resulting tide of encryption has been vehemently criticised by governments and law enforcement agencies across the US and western Europe.

Mr Comey, the FBI director, has made a series of speeches criticising Apple and Google for going too far with encryption. Most recently, he told Congress that it was critical for his agency to be able to access communications to combat Isis, which is increasingly dependent on the internet.

In Europe, the turnround from shock at US mass surveillance to demands for more rights for European governments to read online communications has been swift and stark. French intelligence services won sweeping powers in a bill, passed in May, that legalised phone tapping and email interception. David Cameron, UK prime minister, has proposed a complete ban on strong encryption to “ensure that terrorists do not have a safe space in which to communicate”.

Many in the cyber security industry, however, claim that an outright ban would be like trying to put the encryption genie back in the bottle. It would be impossible to guarantee that technologies created to give government access to encrypted systems would never fall into the hands of hackers, they argue.

“There’s a lot of threats out there, there are risks to people’s safety,” says Scott Renfro, from Facebook’s security team. “But at the same time, there’s no way to weaken encryption and make it available only to certain parties.”

Whit Diffie, a 71-year-old security pioneer and co-inventor of the basic approach used in most modern encryption systems, says systems work best when they are “as simple as possible”. This means that it is counter-productive to try to build in the kind of special access governments are demanding.

“If you compromise the basic tools, you are particularly likely to make exploitation by foreign governments more feasible,” he says.

The fight over encryption has echoes of the 1990s, when the US government pushed for the adoption of a silicon chip it could decrypt remotely. This was before the US launched the “war on terror”, but officials were already arguing for more power to fight terrorism and other crimes such as kidnapping.

The Clipper chip, designed for voice communications, used an encryption algorithm invented by the NSA. The key would be put in escrow until the government gained legal authority to listen to a conversation. But the chip, announced in 1993, was defunct three years later after a backlash from anti-surveillance campaigners and a lack of adoption from manufacturers.
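The escrow arrangement split each device key between two separate agencies, both of whose shares were needed to reconstruct it under legal authorisation. A simplified sketch of that split — illustrative only, not the actual Clipper internals — uses the property that XOR-ing two shares recovers the original:

```python
import secrets

device_key = secrets.token_bytes(16)   # the chip's secret key

# Split the key into two shares held by separate escrow agents.
# Either share alone reveals nothing about the key.
share_a = secrets.token_bytes(16)
share_b = bytes(k ^ s for k, s in zip(device_key, share_a))

# With legal authority, the two shares are combined to recover the key.
recovered = bytes(a ^ b for a, b in zip(share_a, share_b))
assert recovered == device_key
```

It was precisely this recoverability — a key that someone other than the user could reassemble — that campaigners objected to.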

The “crypto war” of the 1990s was only a shadow of today’s battle, Mr Diffie says. “Cryptography is extremely important to the whole digital economy. It has developed dramatically since the last time we fought this battle, a little less than 20 years ago.” Encryption is now much harder for governments to resist because it is “in very, very broad use”.

The cyber security threat is also much greater, with nation states including China and Russia pouring resources into cyber espionage and well-funded organised criminal networks honing their skills.

But many in the tech world argue governments may actually have more access to information than they did before encryption started to become standard. “I don’t think they are magically unable to solve crime any more,” says Bruce Schneier, a cryptographer. “We’re living in a world where there are dozens of investigative, forensic techniques that have improved in the last years — DNA, fingerprints, location.”

No compromise in sight

But with both sides staking out strong positions, it is hard to see where a compromise will be found. Like many of the methods used to fight back against a wave of cyber crime, attempts to deploy encryption risk getting mired in complex political and legal debates even as hackers power on, learning new tricks.

Google is set to provoke a new showdown between law enforcement and the tech industry this year with the release of its own end-to-end encryption system for use with its Chrome web browser. People close to the project have played down its potential impact, claiming that only a small number of users are likely to want to use it.

Many of the company’s other services rely on the ability to monitor a user’s actions so it can provide relevant information or advertising — an ironic reminder of just how slippery the issue of encrypting data has become. When it meets their needs, even tech companies, like governments, can see the necessity of setting some limits to the use of unbreakable encryption everywhere.

Mr Stamos, who now works at Facebook, blamed the crisis over encryption on law enforcement reaching for what looked like an easy option — without thinking about the consequences for the security of users or the future of the tech industry. “What are the knock-on effects of asking for things like back doors on the competitiveness of the [US] tech industry, for other people using our tech, and other countries asking for back doors themselves?”

Copyright The Financial Times Limited 2017. All rights reserved.