Sunday, February 21, 2016

To Lock or Unlock, That is the Question

When I learned that China felt the need to purge Microsoft’s operating systems from its approved software options, I was hardly puzzled. These complicated systems are rife, in Chinese eyes, with myriad little openings called “back doors”: purposely designed, hidden elements embedded in the millions upon millions of lines of code needed to make the systems work, which allow a knowledgeable outsider to hack into any document or creation built on that operating system (or on any other software carrying such a deliberate vulnerability). The doors may stand open, be timed to open later, or simply wait for the right input before granting the outsider full access. And trust me, intelligence agencies understand these little openings all too well.
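For readers who don’t live in source code, it may help to see how mundane a back door can look. The following is a tiny, entirely hypothetical Python sketch (every name in it is invented for illustration and drawn from no real product) showing a hidden bypass tucked into an otherwise ordinary login check:

import hmac

# Toy stand-in for a legitimate credential store (all values are hypothetical).
_USER_SECRETS = {"alice": "correct horse battery staple"}

# The hidden element: a hard-coded bypass value known only to whoever planted it.
_VENDOR_BYPASS = "0penSesame-maintenance-2016"

def login(username: str, password: str) -> bool:
    """Return True if the caller should be granted access."""
    # The back door: anyone presenting the magic value gets in as any user.
    if hmac.compare_digest(password, _VENDOR_BYPASS):
        return True
    # The normal path that every ordinary user (and casual auditor) sees.
    expected = _USER_SECRETS.get(username)
    return expected is not None and hmac.compare_digest(password, expected)

print(login("alice", "wrong guess"))                   # False: the front door holds
print(login("alice", "0penSesame-maintenance-2016"))   # True: the back door opens

Buried among millions of lines, a dozen lines like these are nearly impossible for an outsider to spot, which is exactly why governments that don’t control the code worry about them.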
On the other hand, with so many criminal cartels and terrorist threats, police and intelligence agencies are demanding that tech software always remain subject to access by the proper authorities. Tech companies, knowing that such intentional vulnerability would threaten their customers’ privacy, are fighting back. They claim that succumbing to police pressure to break into their own technology is un-American, making us more of a police state than most realize. Perhaps there is also a fear that saddling American tech companies with back doors creates a competitive disadvantage against foreign manufacturers who are not so limited.
What’s grabbing the headlines in the tech world these days is the FBI’s request, now an order from a federal magistrate, for Apple to help crack what has to date been uncrackable data on the iPhone of one of the San Bernardino shooters. Apple CEO Tim Cook has vowed to fight the order, all the way up to the Supreme Court if necessary. He has even written an open letter to iPhone customers explaining his position and fighting what he calls a dangerous precedent. You can find the full text of that letter here: http://www.apple.com/customer-letter/
Among other things, Cook writes: “Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.
“In today’s digital world, the ‘key’ to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
“The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
“The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.
“We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.”

The government claims that the lives of its citizens justify compelling Apple to cooperate in accessing that data. Apple counters that the phone was designed to be unhackable, so no existing “back door” exists, and that it is unwilling to try to create one, or even to attempt the “one shot hack” of this particular device that the court order requires.
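Cook’s “key” and “master key” analogies map directly onto how encryption tools actually behave. Here is a minimal sketch using the widely available Python cryptography package (purely for illustration; it has nothing to do with Apple’s own implementation): whoever holds the key, rightful owner or not, reads everything that key protects.

from cryptography.fernet import Fernet  # third-party library: pip install cryptography

key = Fernet.generate_key()        # the "piece of information that unlocks the data"
secret = Fernet(key).encrypt(b"contents of one phone")

# The rightful holder of the key can read the data...
print(Fernet(key).decrypt(secret))

# ...and so can anyone else who obtains that same key, for this message and for
# every other message ever protected with it. The math does not care who is
# holding the key, which is the heart of the "master key" objection.
leaked_key = key
print(Fernet(leaked_key).decrypt(secret))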
What’s more, Apple appears to be accusing the FBI of… er… stupidity that has made Apple’s intervention necessary. “The FBI has admitted that a reset of the San Bernardino shooter's iCloud passcode was done with the agency's consent in the days following the terror attack, which left 14 people dead… Apple asserts that, had the passcode not been reset, the company would have been able to initiate a backup of the phone's data to its associated iCloud account in order to retrieve its contents. However, with the passcode on the phone no longer matching the one on iCloud, the only remaining option is the decryption of the phone itself.” The Wrap, February 21st.
This has become a political hot button, with powerful forces on both sides of the issue. Police authorities cry: how can the lives of human beings become subservient to some abstract notion of privacy and encryption, particularly when the relevant authorities must jump through judicial hoops (the ‘safeguard’) just to be allowed access? Those on the other side point out how often and how easily those judicial hoops have been circumvented by government officials unwilling to make the effort… and that secretive surveillance courts are contrary to everything we hold dear. Thank you, Edward Snowden (who obviously backs Apple’s position). Opening this door is simply begging for governmental (and private-industry) abuse, they assert.
“Within hours of the publication of Cook’s letter, it became clear that many in the public supported Apple. As Business Insider reports, a group called Fight for the Future organized a small rally outside the flagship Apple Store in San Francisco, in support of the company. Though the rally included just 30 people, the group plans to hold additional gatherings nationwide in the coming weeks, including in New York City, Boston, and Minneapolis.
“The first high-profile tech leader to voice support for Apple was Google CEO Sundar Pichai, who posted a series of tweets in which he called Cook’s letter ‘important’ and said the court ruling ‘could be a troubling precedent’ if carried out.
“Pichai’s tweets were followed by a Facebook post from WhatsApp CEO Jan Koum. ‘I have always admired Tim Cook for his stance on privacy and Apple's efforts to protect user data and couldn't agree more with everything said in their Customer Letter today,’ Koum wrote. ‘We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake.’
“Mozilla executive director Mark Surman also put out a press statement expressing his support. ‘It sets a dangerous precedent that threatens consumers’ security going forward. Companies should be encouraged to aggressively strengthen the security of their products, rather than undermine that security,’ he wrote.” FastCompany.com, February 18th.
The bigger picture has to be the huge question of how much of what it means to be an American we are willing to give up in order to fight terrorism and cartel crime, now at levels never before seen in American history. How much more like them must we become in order to survive… and exactly what do we become as a result? Not an easy question, but one we will be asked for the foreseeable future. And if we yield on these issues, do we then move on to reject the other conventions that have defined us for years, like the Geneva Conventions? Where do we draw the line?
I’m Peter Dekom, and we must struggle and debate these issues fiercely, knowing that there are no easy answers.