December 2, 2015, San Bernardino, California, USA
Two attackers burst into a holiday party of the local government, where one of them actually worked, and in a shooting spree kill 14 people and injure a further 22. In the manhunt that follows, both attackers are shot and killed by the police.
During the investigation a passcode-protected iPhone 5C is recovered. The phone belongs to the San Bernardino local authorities and may or may not hold vital information.
This could have been just another headline from any US news outlet on any given day in any given year, and nobody would have given it much thought several months later, certainly not the entire tech sector.
So why is this case any different, apart from the label “terrorism” attached to it?
In a modern-day investigation this by itself should not be very interesting at all. The authorities (the FBI in this case) ask Apple to assist with gaining access to the associated AppleID, obtain the information synced and/or backed up to the iCloud servers (if available), and that’s that. Except that, in this case, the AppleID password had been changed, the phone could no longer back up or sync to the Apple servers, and the old password was unknown to the investigators as well. So no luck there, it seems.
Couple this with the fact that the attackers had quite literally destroyed their private phones, which, so the FBI would have us believe, was very damaging to their investigation and to the protection of US citizens against terrorism. According to at least one source I read, the phone had not even backed itself up to the iCloud servers for almost two months prior to the incident, which probably means the automatic backup feature was disabled on the phone itself; but this has only limited impact on what happened next.
Apple: you are ordered to write malware!
The FBI could not gain access to the phone, so they had a problem they thought they could not solve.
Their problem? If you guess the unlock code of an Apple device incorrectly more than the default 10 times, the device will, if the option is enabled, wipe itself.
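The mechanics of such a protection are simple enough to sketch. Below is a minimal, purely illustrative model (not Apple’s actual code; the class and all names are my own invention) of a lock that erases its data after too many wrong guesses:

```python
MAX_ATTEMPTS = 10  # iOS default when the "Erase Data" option is enabled


class PasscodeLock:
    """Toy model of a self-wiping passcode lock (illustrative only)."""

    def __init__(self, passcode, wipe_enabled=True):
        self._passcode = passcode
        self.wipe_enabled = wipe_enabled
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess):
        if self.wiped:
            return False  # the keys are gone; nothing left to unlock
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.wipe_enabled and self.failed_attempts >= MAX_ATTEMPTS:
            self.wiped = True  # erase the encryption keys for good
        return False


lock = PasscodeLock("1234")
for guess in ["%04d" % i for i in range(10)]:  # ten wrong guesses
    lock.try_unlock(guess)
print(lock.wiped)  # True: the data is now unrecoverable
```

Note that even the correct passcode no longer helps once the wipe has triggered, which is exactly what made blind guessing so risky for the FBI.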
Then why not ask (or rather order, with a court ruling) the maker of the phone and its security protections to write one special piece of their operating system, so you can automate the brute-force passcode-guessing procedure? And to keep the public, and possibly the tech industry, quiet, you tell them that “it’s only for this phone” in the hope that they believe it too?
Now let’s see what the FBI actually asked for. They requested a special version of the iOS operating system that would allow an unlimited number of passcode guesses without the phone possibly wiping itself after too many failed attempts.
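To see why removing that limit matters so much: a 4-digit numeric passcode, the common default at the time, has only 10,000 possible combinations, so once the wipe (and the delays between attempts) are gone, enumerating them all is trivial. A short sketch:

```python
from itertools import product


def all_passcodes(length=4, digits="0123456789"):
    """Yield every possible numeric passcode of the given length."""
    for combo in product(digits, repeat=length):
        yield "".join(combo)


codes = list(all_passcodes())
print(len(codes))           # 10000: a machine can try these in minutes
print(codes[0], codes[-1])  # 0000 9999
```

With electronic submission of guesses and no penalty for failure, even a 6-digit passcode (a million combinations) would not hold out for long.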
Such a piece of software would obviously be usable on any device running that operating system. Any software mechanism added to make it work on one specific device only could be removed, or changed to make it work on other devices as well.
Apple could, for instance, have included the hardware identifier of the specific iPhone 5C in the software, such that it simply wouldn’t run if installed on another device. However, if you can do that for one device, you can do it for another device too, right?
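A hypothetical sketch makes the point concrete. This is not Apple’s actual code, and the identifiers are made up; it only shows how thin such a “one device only” lock really is:

```python
# Hypothetical sketch: a build that refuses to run unless the device's
# hardware identifier matches one baked-in value.
ALLOWED_DEVICE_ID = "DEVICE-EXAMPLE-001"  # made-up identifier


def firmware_may_run(device_id, allowed_id=ALLOWED_DEVICE_ID):
    """Return True only on the one device this build was 'locked' to."""
    return device_id == allowed_id


print(firmware_may_run("DEVICE-EXAMPLE-001"))  # True: the intended phone
print(firmware_may_run("DEVICE-EXAMPLE-002"))  # False: any other device
```

The entire “lock” lives in a single comparison; anyone able to modify the code simply swaps the constant, or removes the check altogether, to retarget the build at any other device.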
The problem with all of this is that if Apple or the FBI can gain unlimited access to any iOS device this way, anybody who gets hold of the same code or software can potentially do exactly the same. Besides that, if Apple had created the requested malware, a large range of non-US governments would probably have requested a copy for their own use, let alone the effect if (I mean when) the code leaked onto the internet, available for all to use.
This would effectively have made any iOS device the practically unsecured default smartphone or tablet on the market.
Can’t governments keep anything like this secret?
Let’s see: for your personal protection against terrorism, the Transportation Security Administration (TSA) requires that luggage travelling to, from and inside the USA either be left unlocked or be secured with a specially designed TSA-approved lock that only the TSA, and any other government that obtains the master keys from the TSA, can open and inspect without having to destroy it in the process.
Unfortunately for us travellers, the designs of these master keys are widely available on the internet and can even be printed with a 3D printer. So, to answer the question above: no, it seems they cannot.
Now some of you may think that I am putting personal privacy above the ability of a government to protect its citizens against criminals and terrorists. And to some extent you may be right.
The problem is that any security protection in our modern-day life will make it harder, though certainly not impossible, for law enforcement agencies to do their job. But if they obtained the keys to the kingdom of every person using a certain technology, they would probably misuse that power for more than was originally envisioned.
Think about this for a minute: if you could obtain the key to a wealth of information, so that you would only have to search and combine some of it to do your job in several hours instead of days, wouldn’t you want to have that key yesterday rather than not at all?
And that’s precisely what the FBI was trying to obtain by asking Apple to write a malware version of their iOS operating system.
Just before this interesting story would go to court again on March 22nd, the FBI pulled back and said they no longer needed Apple to write the malware version, as they had miraculously found another way into the device. What that method is they wouldn’t tell, which means we will probably know before the year is out anyway.
The fact that a private company needs to stand up and protect us against an over-greedy government that wants to grab all the data it can in the name of protecting its citizens is very disturbing. In my opinion, governments should protect not only our safety but our privacy as well. True, the balance between these tasks may be a fine line, but destroying one in favour of the other is certainly not the answer we are looking for.
As for the San Bernardino local authorities: in the end it was their phone and their property, which they lent to their employee for work use. Could they have done something?
The answer to that is, unfortunately for them, yes, they could have. As it is their device, and most likely their information stored on that phone, they could have taken their own measures to be able to access the phone in case that was required later.
By using mobile device management (MDM) software, it is possible for the company that owns a device to access it later without the help of the employee who previously used it.
There may be several reasons to want this ability, including obtaining relevant data after an employee has left the company, or when an employee has lost his or her passcode and needs to recover important data stored on the device.
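As a concrete illustration: Apple’s MDM protocol includes a ClearPasscode command that an enrolled device will accept from its management server. The sketch below only shows the rough shape of such a command payload, built with Python’s standard plistlib; the UUID and token here are placeholders I made up, and a real UnlockToken is escrowed when the device enrolls in MDM:

```python
import plistlib

# Rough shape of an MDM "ClearPasscode" command (all values are placeholders).
command = {
    "CommandUUID": "00000000-EXAMPLE-UUID",  # made-up identifier
    "Command": {
        "RequestType": "ClearPasscode",
        "UnlockToken": b"placeholder-token",  # real token comes from enrollment
    },
}

payload = plistlib.dumps(command)          # serialise to an XML property list
roundtrip = plistlib.loads(payload)        # ...and read it back
print(roundtrip["Command"]["RequestType"])  # ClearPasscode
```

The essential point is that the unlock capability is set up in advance, with the owner’s knowledge, for the owner’s own devices, which is a very different thing from a government ordering a backdoor into everyone’s.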
That such a mechanism, had it been in place, would also have helped law enforcement is obviously true, but that is not the main reason a company would want a way to access its data without the user’s help, or even in case of forgotten credentials.