Ever since Snowden’s leaking of NSA documents raised public awareness of encryption and government breaches of privacy, everyone has been scrambling to make their devices secure. Apple has been a leading voice in improving encryption, and its own encryption is top notch.
Let me say at the outset that the entire Apple v the FBI case was bound to happen sooner or later, and I would be extremely surprised if Tim Cook had not already prepared himself for it. But it is ultimately such hard yet necessary decisions that have shaped Apple and made it an admirable company in more ways than one. And right now, Apple is risking quite a lot to stand up for privacy and encryption, and it is doing the right thing.

The FBI wants Apple to build a backdoor into iOS, for the iPhone owned by a San Bernardino attacker. (Photograph by Thom.)
The FBI and the San Bernardino case
While condemning the attack, Mr Cook has explained how Apple has already provided considerable assistance to the FBI. What the bureau wants now is access to the iPhone 5C used by one of the attackers, to figure out who they were connected with. The problem, of course, is that iPhones have a security feature that erases all data after ten unsuccessful login attempts, which means the FBI cannot access the device unless it knows the passcode.
James Comey, the director of the FBI, has been unusually vocal against Apple, claiming at one point, as Wired puts it, that the company risks creating an environment in which the United States is “no longer a country governed by the rule of law.” The problem with this argument, as with so many arguments about technology, is that security and privacy really are all-or-nothing propositions. Think of guns: guns were invented, the security forces armed themselves, but so did terrorists. Create a backdoor and everyone can enter (more on this shortly).

Security is one of the most important aspects of an Apple device. (Photograph by William Iven.)
What the FBI wants is access to data, and to get it the bureau is demanding that Apple create a backdoor into iOS. As Mr Cook puts it, the government could then “demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without (users’) knowledge”, which means all this mined data risks becoming accessible to absolutely anybody.
Brute force hacking
To unlock the iPhone, the FBI additionally needs a hardware key which Apple does not keep. Post–iOS 8 (and, really, post-Snowden), one of the major security improvements Apple has made is getting rid of its own access key. In other words, iOS has been built without a master key held by Apple, thereby preventing anyone, including the company, the government, hackers and terrorists, from accessing iPhones.
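To make the idea concrete, here is a minimal sketch, in Python, of what a passcode-entangled key scheme looks like in principle. It is an illustration of the general technique only, not Apple’s actual implementation; the device secret, the key-derivation parameters and the function names are all my own assumptions.

```python
import hashlib
import os

# Hypothetical stand-in for a secret fused into the device at manufacture;
# it never leaves the hardware and the manufacturer keeps no copy of it.
DEVICE_UNIQUE_SECRET = os.urandom(32)

def derive_data_key(passcode: str) -> bytes:
    """Entangle the user's passcode with the device-unique secret.
    Nobody who lacks the passcode, the manufacturer included, can
    recompute the key that actually encrypts the data."""
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UNIQUE_SECRET,   # used as the salt, tying the key to this device
        100_000,                # deliberately slow, so every guess is expensive
    )

if __name__ == "__main__":
    key = derive_data_key("123456")
    print(key.hex())
```

Because the data key only ever exists as the output of a computation involving the passcode, there is simply nothing for Apple to hand over.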
The ‘key’ to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known… the encryption can be defeated by anyone with that knowledge.
As the Johns Hopkins cryptography expert Matthew Green points out, it may take the FBI ten years to hack into the iPhone by automatically trying millions of possible combinations on fast computers, a method known as a brute-force attack.
What stands in the way of a brute-force attack is the iPhone’s auto-erase feature, which wipes the device after ten unsuccessful login attempts; the FBI therefore wants Apple to somehow remove this feature and give it access to the phone. And this is exactly what Mr Cook wrote about in a letter to all Apple customers, shortly after a judge ordered the company to create a backdoor into iOS for the FBI.
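Here is a toy model, in Python, of the kind of lockout policy described above: a counter of failed attempts that destroys the data after the tenth. It is purely illustrative and bears no relation to how iOS actually implements the feature.

```python
class PasscodeLock:
    """Toy model of the lockout described above: after MAX_ATTEMPTS
    consecutive failures the data is wiped, so every later guess is useless.
    Purely illustrative; not how iOS implements the feature."""

    MAX_ATTEMPTS = 10  # the auto-erase threshold mentioned above

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            return False  # the data is gone; nothing left to unlock
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._wiped = True  # auto-erase: destroy the data after ten failures
        return False


# A brute-force attacker burns through the attempt budget almost immediately:
lock = PasscodeLock("739204")
for guess in range(1_000_000):
    if lock.try_unlock(f"{guess:06d}"):
        print("cracked")
        break
else:
    print("auto-erase triggered; brute force failed")
```

With the cap in place, an attacker gets ten guesses out of a million possible six-digit codes before the data disappears, which is exactly why the FBI wants the cap removed.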
End-to-end encryption, for example, relies on keys generated on your phone and on the receiver’s, so that only the intended recipient can decode and read the data you send. This is done so that nobody can access the data as it travels from your phone to another’s, because only you and the other person hold your unique keys. Now, if a backdoor were created, or, say, if a copy of that key were also held by Apple (as was the case pre–iOS 8), then they, and through them anyone, could access the data you sent. Such a scenario would give rise to an environment not unlike the cold war, with data as the nuclear weapon, except that we have millions of times more data now and risk an eternal nuclear winter. Says Mr Cook in his letter, “… the ‘key’ to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known… the encryption can be defeated by anyone with that knowledge.”
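As a rough sketch of the principle (not of iMessage’s actual protocol), this is what end-to-end encryption looks like using the PyNaCl library; the parties, the key exchange and the message are invented for illustration.

```python
# A rough sketch of end-to-end encryption using the PyNaCl library
# (pip install pynacl). Illustrative only; not iMessage's actual protocol.
from nacl.public import Box, PrivateKey

# Each party generates a key pair on their own device;
# the private halves never leave those devices.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Only the public halves are exchanged (for example, via the service provider).
alice_box = Box(alice_private, bob_private.public_key)
bob_box = Box(bob_private, alice_private.public_key)

ciphertext = alice_box.encrypt(b"meet at noon")

# Whoever relays the ciphertext cannot read it; only Bob's private key can.
print(bob_box.decrypt(ciphertext))  # b'meet at noon'
```

The moment a third copy of a private key exists somewhere, anyone who reaches that copy can decrypt everything, which is the whole of Mr Cook’s point above.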
Why backdoors are dangerous
The biggest problem with backdoors is that anybody can enter through them. Today it is the FBI, tomorrow it could be a hacker, and the day after, a terrorist remotely detonating a bomb through your phone. Would you want that?
A chain is only as strong as its weakest link. Apple has worked for decades to build an extremely secure operating system, strengthening the chain, and what the FBI is now asking, by fiat, is for Apple to add a weak link to that chain on purpose. This would undermine the security of the entire operating system and render it all but useless.

Google CEO Sundar Pichai on Apple v the FBI
Think of building a house. You plan it, construct it, place your safe inside and lock it tight with a six-digit combination. Would you then let the mason drill a tunnel from his house straight into your safe because he helped build it? It makes no sense at all. Yet this is exactly what the FBI wants, except in a digital environment. We never fight cases like this in the physical world because we are more used to it. When was the last time you fought a case asking a government not to drill secret tunnels into your house? Would you agree to have one running from your house to the local police department?
The problem here is that most people are less accustomed to the digital world than to the physical one, and fail to realise that building a backdoor into your phone and your computer is really no different from building an all-access tunnel from your home to the government. Ill-intentioned hackers and terrorists do not even need access to the opening of the tunnel; they can simply dig until they hit it at some point and enter. This analogy to the physical world, in my opinion, is what makes the idea understandable to people who are not as well versed in the digital world as some of us are. It is for this reason that backdoors are, from any point of view, a horrible proposition.
A much bigger debate
The case of Apple v the FBI is really a single instance of a much larger, deeper debate: the one between personal security and government intrusion in the name of security. The line has to be drawn somewhere, and if that makes things hard for the government, you can rest assured it makes things hard for terrorists too.

Image courtesy: Flickr/thetaxhaven.
Nobody here, not Apple, not Wired and especially not me, sympathises with terrorists: Mr Cook opens his letter by addressing the unfortunate San Bernardino attacks, emphasising this very point. Nor does Sundar Pichai, CEO of Google, who spoke out against “requiring companies to enable hacking of customer devices”. I, for one, would like to see Google and Microsoft stand up in support of Apple, because if Apple bends now, the others will have to as well, sooner or later.
The US government’s response, right now, is that it needs access to only this one iPhone, but that, unfortunately, is not how the digital world works. If you can gain access to one iPhone, you can gain access to them all; to make a key for one, fortunately or unfortunately, is to make a key for all of them. That would only embolden hackers and usher in a new era of cyber crime at unprecedented levels that even governments would not be able to handle.
Further, this is not only a question of creep from one tech company to another, but also from one government to another: should the US government gain access, there is no reason other governments will hold back, and before we know it data will be spilling everywhere. Further still, on the lighter side, some people are wondering whether they should laud Apple’s stand or criticise the FBI for not having the smarts to hack into an iPhone on its own. And then there is Snowden, making cheeky but extremely valid points against the feds:
The @FBI is creating a world where citizens rely on #Apple to defend their rights, rather than the other way around. https://t.co/vdjB6CuB7k
— Edward Snowden (@Snowden) February 17, 2016
At the end of the day, Apple could have simply brushed this off, saying that iOS was designed to be impenetrable (which it undoubtedly is) and that asking the people who made it secure to find loopholes in it is not the best way to go about anything. Yet the company chose to stand up and refuse to create a backdoor altogether, which is what is most commendable. The FBI, in asking Apple to break into iPhones, is asking the company to go against itself, its employees, its policies and, most of all, its customers, not to mention its shareholders.
Is national security even the point? If the systems that flag people whose behaviours and beliefs are deemed suspicious were at all effective, their e-mails and other communications should have pointed to this person long ago. If anything, governments around the world should start strengthening such systems instead of constantly touting privacy breaches as the answer to everything. The FBI, by asking Apple to create backdoors into iPhones, is setting the stage for a problematic future for itself, because, at the end of the day, the world is infinitely safer with a billion people walking around with secure phones than with open-access devices, especially when you consider what a network of hackers could supposedly do to bring a country to its knees before it realises what is happening.
Governments need to re-think their security strategies from the ground up, not restructure existing laws from the physical world and hope they work just as well in binary. They never will.