Every week on Criminal Minds, computer whiz Penelope Garcia hits a few keys and in seconds provides any information the Behavioral Analysis Unit needs to find and put away the criminals. The good guys use technology to get the bad guys. Any criminal worth his salt knows to use a burner phone and to get rid of it immediately after using it. Think of all the incriminating evidence the police or FBI could pull from his cell phone about his actions and potentially about the actions of other criminals.
Then along comes the case of the San Bernardino shooter Syed Farook. The FBI has his cell phone because he and his wife and co-conspirator, Tashfeen Malik, died in a shootout with police on December 2, 2015, after killing fourteen people and wounding twenty-two others at a Christmas party at a center for the disabled, in the deadliest mass shooting in the country since Sandy Hook. There is no doubt that the pair were terrorists. So call in the FBI's real-life version of Penelope Garcia. What could be simpler?
The small matter of a cell phone's security code has become the center of a raging legal battle. Farook's phone, like most, has a four-digit security code that prevents anyone besides the owner from accessing the data stored on it. In the world of James Bond, a program would simply try all ten thousand possible combinations of digits until it hit the right one, but the phone is programmed to erase its data after ten unsuccessful attempts. The logical next step was to ask Apple to break the code on the phone. That's when the legal battle began. Apple CEO Tim Cook continues to state his company's resolve to fight a court order delivered in February that would force Apple to develop a custom version of its operating system that removes the passcode protections on the seized iPhone.
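The arithmetic behind the standoff can be made concrete. The following is a minimal, purely illustrative sketch (not Apple's actual implementation, and the passcode shown is invented): with no attempt limit, exhausting all 10,000 four-digit codes is trivial for a computer, but a ten-attempt lockout stops the same search almost before it begins.

```python
# Hypothetical illustration: brute-forcing a four-digit passcode
# with and without an attempt limit. SECRET is an invented value.

SECRET = "7294"          # hypothetical passcode, for demonstration only
MAX_ATTEMPTS = 10        # the device disables itself after this many failures

def brute_force(check, max_attempts):
    """Try every four-digit code in order, stopping at the attempt limit.

    Returns the code that passed the check, or None if the limit
    was reached first (i.e., the device locked the attacker out).
    """
    for attempt, guess in enumerate(f"{n:04d}" for n in range(10_000)):
        if attempt >= max_attempts:
            return None  # locked out before finding the code
        if check(guess):
            return guess
    return None

# With no effective limit, the full 10,000-code space is searchable:
print(brute_force(lambda g: g == SECRET, max_attempts=10_000))  # 7294

# With a ten-attempt limit, the search fails almost immediately:
print(brute_force(lambda g: g == SECRET, max_attempts=MAX_ATTEMPTS))  # None
```

This is why the court order targeted the lockout itself: removing the ten-attempt limit turns an infeasible guessing problem into one a computer solves in moments.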
This case provides practice in applying argument theory to real life. The claim being made by each side is clear. Each is a claim of policy. One side argues that Apple should develop the means of breaking the security code. The other argues that the company should not. Consider the support being offered on each side of the argument. Apple executives have called the new OS a “government OS” and argue that the court order violates Apple’s First and Fifth Amendment rights. They fear setting a dangerous precedent: if they create a tool to break into Farook’s phone, is the security of every other phone in jeopardy? On the other side of the battle is the Department of Justice’s legitimate concern for national security. Officials have even admitted that there is probably no useful information on the phone. Farook and Malik destroyed other phones before their crime, and the phone in question was a company phone that Farook used on the job. All but the most recent data on it had already been recovered from its iCloud backup. Government officials also worry about precedent: if Apple will not help in this case, what would happen if a phone someday held vital information that could prevent a major terrorist attack?
The argument appeals to some of our most basic fears: threats to our liberty and threats to our feeling of security. Which side of the argument you find more convincing depends on how much of a threat to the privacy of innocent Americans you believe is posed by developing a means of breaking a phone’s security code and, on the other hand, how much of a threat to national security is posed by Apple’s refusal to develop a “government OS.” The battle is far from over. One reason both sides are taking it so seriously is that any legal ruling on the issue could have far-reaching effects.
[Photo Source: Incase on Flickr]