At the end of last month, Apple published an open letter to its customers protesting a US court order that could force the company to give the FBI back-door entry to individual iPhones.
The case has brought the debate about government access to personal data and the protection of civil liberties to the fore once again. It has also made society and industry look more closely at the mechanics of data encryption and ask what makes the technology effective.
At its most basic, encryption provides a layer of protection for data at every stage of its journey from sender to recipient. Anyone who tries to intercept or access the data without permission is left with a screen full of unintelligible gobbledygook. But encryption is only strong if there are no weak links in the chain, and Apple argues that the FBI's court order demanding a back door into its operating system would force the company to create just such a weak link in its encryption.
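The principle can be illustrated with a toy cipher. The sketch below uses a one-time pad in Python purely for illustration (real systems such as iOS use far more sophisticated ciphers like AES, and the message and key here are invented for the example): without the key, the intercepted ciphertext is noise; with a copy of the key, anyone, authorised or not, recovers the plaintext.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR each plaintext byte with the corresponding key byte: a
    # one-time pad, secure only while the key is random, secret,
    # and never reused.
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"account number 12345678"          # hypothetical personal data
key = secrets.token_bytes(len(message))       # known only to sender and recipient

ciphertext = encrypt(message, key)
print(ciphertext)                 # gobbledygook to any interceptor
print(decrypt(ciphertext, key))   # original message restored

# A "back door" amounts to a second copy of this key held elsewhere.
# Whoever obtains that copy, investigator or hacker, decrypts the
# data just as easily as the intended recipient.
```

The weak link is not the mathematics but the extra key: once it exists, protecting every copy of it becomes part of the system's security, which is exactly the chain Apple says it is being asked to break.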
This would undoubtedly speed up investigations of high-profile crimes, but it would come at a high cost to millions of law-abiding iPhone users. The newly created back door would very quickly become a target for hackers and cybercriminals; we know from experience that there are plenty of cyber-geeks out there who would be keen to find a key of their own just for the hell of it. But there is also a whole wealth of very unsavoury people with very dark motives waiting to kick in any back door that Apple creates. Unfortunately, despite Apple's and the FBI's best intentions and efforts, both sets of people would undoubtedly manage to crack the code in the end, and personal data stored on iPhones, such as bank accounts, health records and even details of frequently visited locations, could be up for grabs.
It’s an opinion shared by California Congresswoman Zoe Lofgren: “Once you have holes in encryption, the rule is not a question of if, but when those holes will be exploited and everything you thought was protected will be revealed.”
Apple contends that the best way to keep the hackers out and keep people’s iPhone data safe is to refrain from building a back door. The company’s stance is that, when it comes to encryption, it’s a case of all or nothing; you can’t ‘partially encrypt’ data and hope for 100 per cent data security. In fact, Tim Cook, Apple’s CEO, is on record as saying: “…there is no such thing as a back door for good guys only.”
And, according to Reuters, White House cybersecurity co-ordinator Michael Daniel has also acknowledged that he knows of no one in the security community who thinks a back door wouldn't compromise encryption.
Debates such as this over IT encryption, protection of personal data and defence of national security are not new. There was a similar conversation in the Clinton era over what was known as the 'Clipper Chip', a microcircuit that could encrypt data but also give the government access to the keys needed to unlock it again.
The chip faced a public backlash and was never adopted, setting an important precedent for encrypted communications in the US. A similar discussion is taking place back home in the UK right now too: David Cameron's government is pushing for the Investigatory Powers Bill (dubbed the Snoopers' Charter) to be made law by the end of 2016, after attempts to introduce a similar bill failed in 2013 under the previous Coalition.
Measures in this bill, heard in the House of Commons at the start of this month, would allow police to break into electronic devices to investigate or prevent "serious crime" and "death or injury, or damage to a person's physical or mental health." So-called "equipment interference" could potentially cover remotely hacking into phones or computers, or bypassing security on seized equipment.
Back in the USA, the debate between Apple and the FBI continues. At the start of the month, a New York judge ruled that the US government couldn't use the All Writs Act to make Apple create the back door demanded by the FBI. And, the following day, the debate made its way to the House Judiciary Committee, where James Comey, the FBI's director, freely admitted that it was the most difficult issue he'd ever had to deal with, an acknowledgement that the case is far more complex to resolve than newspaper headlines would have us believe.
It’s not overstating the point to say that this debate could also have a significant impact on the way in which the US carries out global business. Indeed, the issue continues to raise concerns around the world about how secure corporate data is while it is being stored by US companies (either in situ in the US or by US companies operating abroad) or on devices developed by US companies. It’s likely to continue to feed into the wider debate about data privacy raging between the US and the EU, and to make it more difficult for the EU’s Working Groups currently finalising the Privacy Shield protocol to agree on exactly how much protection will be afforded to EU citizens’ data when it is transferred in and out of the US.
We’ll have to wait to see how the debate between Apple and the FBI plays out and the wider ramifications it has for the security of individual iPhones, as well as the ability of government agencies to access personal data. If Apple is forced to create the back door that the FBI wants, then one chilling consequence is that criminals will most likely switch to other, more secure methods of talking to each other, using apps developed outside the US that offer encryption mechanisms even more secure than Apple’s.
Perhaps, in this light, the FBI needs to pursue an alternative policy and get smart enough to beat Apple’s encryption technology itself (and in secret), so that the personal data of law-abiding citizens can remain secure while the FBI works diligently to track down those who would commit serious crimes and do harm on a large scale.
Michael Hack, Senior Vice President EMEA Operations at Ipswitch
Image Credit: Shutterstock / Andrey Bayda