Column: FBI’s security request to Apple could set bad precedent for privacy

In this April 30, 2015, file photo, Apple CEO Tim Cook responds to a question during a news conference at IBM Watson headquarters, in New York. Cook said his company will resist a federal magistrate's order to hack its own users in connection with the investigation of the San Bernardino, Calif., shootings. In a statement posted early Wednesday, Feb. 17, 2016, on the company's website, Cook argued that such a move would undermine encryption by creating a backdoor that could potentially be used on other future devices. (AP Photo/Richard Drew, File)

In the past couple of days, you may have seen a letter from Apple sporadically appearing on your various social media feeds, particularly from your more government-wary friends.

If you haven’t read the letter, it explains that the FBI sought Apple’s help in unlocking the encrypted iPhone of Syed Rizwan Farook, one of the two shooters involved in the San Bernardino attack last December.

Apple stated that it has thus far complied “with the valid subpoenas and search warrants” associated with the case, but the FBI asked the tech giant to create a modified, less secure version of the iOS 9 software (for Farook’s iPhone 5c) so investigators could bypass the phone’s passcode of four to six digits.

Right now, what the FBI has to do to attempt access to the phone is a maneuver called “brute-forcing,” in which someone enters passcode guesses one after another until one is right. However, with Apple’s software as it is, one is only allowed 10 attempts before being completely locked out. There are also escalating time delays in place, such as the hour-long delay between one’s ninth and tenth tries.
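The attempt counter and escalating delays described above can be sketched in a few lines of Python. This is purely illustrative: the class and the exact delay schedule are my own invention, loosely modeled on the behavior the column describes (short delays that escalate to an hour, then a hard stop after 10 failures), not Apple's actual implementation.

```python
# Illustrative sketch of a passcode guard with escalating delays and a
# 10-attempt lockout. The DELAYS schedule (in seconds, keyed by attempt
# number) is hypothetical, chosen to match the column's description of
# an hour-long wait before the final tries.
DELAYS = {1: 0, 2: 0, 3: 0, 4: 0, 5: 60, 6: 300, 7: 900, 8: 900, 9: 3600, 10: 3600}


class LockedOut(Exception):
    """Raised once the attempt limit has been exhausted."""


class PasscodeGuard:
    def __init__(self, passcode, max_attempts=10):
        self._passcode = passcode
        self.max_attempts = max_attempts
        self.failures = 0

    def delay_before_next_try(self):
        # Seconds the device would force an attacker to wait before the
        # next attempt (defaults to an hour once the table runs out).
        return DELAYS.get(self.failures + 1, 3600)

    def try_passcode(self, guess):
        if self.failures >= self.max_attempts:
            raise LockedOut("attempt limit reached; device locked")
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        return False
```

Under this sketch, a brute-force attacker burns through the delay table quickly and then hits the hard lockout on the eleventh wrong guess.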

Apple says it could take up to five and a half years to try every six-character alphanumeric combination, and that’s a lot of time to waste on a highly time-sensitive issue like a terrorist attack. Fortunately, it is believed Farook used a four-digit passcode, which leaves only about 10,000 possible combinations. Apple’s compliance would undoubtedly save considerable time on the matter.
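The arithmetic behind those two figures is simple to check. The sketch below assumes roughly 80 milliseconds per guess, the per-attempt cost Apple has cited for its passcode key derivation; the function name and the 80 ms constant are my framing, not anything from the column.

```python
# Back-of-the-envelope brute-force timing, assuming ~80 ms per guess
# (the per-attempt key-derivation cost Apple has cited; treat it as an
# assumption here).
ATTEMPT_COST_S = 0.08

def brute_force_seconds(alphabet_size, length, cost_s=ATTEMPT_COST_S):
    """Worst-case time to try every passcode of the given shape."""
    return (alphabet_size ** length) * cost_s

four_digit = brute_force_seconds(10, 4)   # numeric PIN: 10,000 combinations
six_alnum = brute_force_seconds(36, 6)    # lowercase letters + digits

print(f"4-digit PIN:  {four_digit / 60:.1f} minutes")                  # 13.3 minutes
print(f"6-char alnum: {six_alnum / (365.25 * 24 * 3600):.1f} years")   # 5.5 years
```

A four-digit PIN falls in minutes; the six-character alphanumeric space lines up with Apple’s five-and-a-half-year estimate, which is exactly why the passcode’s length matters so much here.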

The FBI reasons that removing this layer of password security would make its job significantly easier, and both the bureau and the White House maintain that the weakened Apple software would be “narrowly tailored” to the San Bernardino incident.

Apple, on the other hand, believes the notion that the government would use the modified software only once is “simply not true.” I find myself agreeing: if you were the FBI, and you finally had a tool that let you pull information (which could potentially thwart terror attacks) off a popular, commonly used phone, would you really use it only once? That seems hard to believe, considering all we know about the NSA post-Edward Snowden.

Adding to suspicions about the allegedly isolated request, there are already ways the FBI could bypass the 10-entry lockout system, such as turning off the phone after every try so it doesn’t “realize” it’s being hacked. This method is still time-consuming and would require special equipment to conduct efficiently, but it is an option, and one that may suggest a broader FBI interest in the weakened software.

Tech industry figures like John McAfee (who, incidentally, is also seeking the Libertarian Party’s presidential nomination) are thoroughly concerned; McAfee has said that “virtually every industry specialist [agrees] that back doors will be a bigger boon to hackers and to our nation's enemies than publishing our nuclear codes and giving the keys to all of our military weapons to the Russians and the Chinese.” Then again, this man also believes he could “socially engineer” the password out of Farook like most hackers would, the fact that Farook and his wife are dead notwithstanding.

Above all, Apple’s surrender of a deliberately insecure iOS would be a slippery slope for a government perpetually hungry for intelligence on terrorism. With such a tool in hand, the FBI would have unprecedented access to iPhones, becoming an agency with the ability to capriciously obtain data from anybody.

I’m not one to go shouting “police state” often, but this is a dangerous harbinger of impingement on our right to privacy, and I’m glad Apple has stood its ground so far. A court battle with the government, one that may well reach the Supreme Court, will tell in the near future.


Stephen Friedland is a staff columnist for The Daily Campus opinion section. He can be reached via email at stephen.friedland@uconn.edu.