Is iPhone Fingerprint Security Secure At All? (Op-Ed)

Kevin O'Brien, currently working with the cloud security vendor CloudLock Inc. as an enterprise security architect, has been part of the security community for more than a decade. O'Brien contributed this article to Tom's Guide's Expert Voices: Op-Ed & Insights.

With iOS 7 released this week, the security community has been reflecting on Apple's new fingerprint authentication mechanism, which ships as part of the iPhone 5s' hardware, and weighing the consequences that ubiquitous biometrics will bring for both data integrity and personal privacy.

Like many in the industry, I watched Apple present the technology on a darkened stage earlier this month. As an outsize digit appeared on the screen behind him, Phil Schiller, senior vice president of worldwide marketing at Apple, made the pitch for scanning each user's finger as a login mechanism. Apple's core message was not complicated: More than half of the mobile users it surveyed admitted to not using any type of lock code on their devices, citing inconvenience as the reason for allowing their data to go unsecured should they lose direct physical control of their phones. By integrating a scanner into the primary button on the device, Cupertino's designers planned to enhance mobile security for those users, leveraging what Schiller described as "a key we have with us everywhere we go."

There is, however, reason to pause and consider exactly what that means. Two of the most important questions raised by the keynote address remain unanswered, even as this data is already being collected and used: First, can Apple's claim that the biometric data is stored only indirectly, in secure storage on the new phones, be trusted? And second, are the company's motivations for implementing the technology as noble as it claims? On both counts, I have my doubts.

The fallacy of physical security

Some years ago, I was sitting in a coffee shop in Kendall Square, in Cambridge, Mass., looking at a hardware security token that was supposedly unhackable. Across the table, a hacker from the sadly long-gone @stake research and development team was taking a group of us through his findings on how easy it was to gain access to the high-value encrypted data on the device.

USB security keys were new in 2000, and one of the early ideas for their application was to store cryptographic and sensitive data on them, allowing the information to be removed from a computer when not in use and effectively air-gapping a person's or organization's most important information from the Internet. In this case, the device manufacturer had built in a number of tamper-proofing features, such as coating the chip in a special epoxy to prevent hackers from identifying it, as well as a robust set of software controls that supposedly made it impossible for an unprivileged attacker to read the cryptographic data stored on the chip.

A hobby knife and heat gun bypassed the tamper proofing; a quick dip in some chemicals from a high-school chemistry set removed the epoxy coating without causing any damage to the hardware beneath it. The software defenses — a hashed and obfuscated copy of the administrative credentials used to protect the critical data — were similarly trivial to bypass. In a few minutes, with a combination of hardware and software available for less than $20, the entire file system was decrypted and exported.
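
To make concrete why that kind of software defense fails, consider a minimal Python sketch of an offline dictionary attack. The hash algorithm, stored value and wordlist here are invented for illustration and are not taken from the actual token; the point is only that once an attacker extracts an unsalted credential hash from a device, recovering the original secret is mechanical:

```python
import hashlib

# Hypothetical reconstruction: an unsalted MD5 hash of the admin
# password, as extracted from the token's storage. Both the algorithm
# and the value are invented for illustration.
stored_hash = hashlib.md5(b"letmein").hexdigest()

# A tiny stand-in for a real wordlist; real attacks use millions of entries.
wordlist = ["password", "admin123", "letmein", "qwerty"]

def crack(target_hash, candidates):
    """Offline dictionary attack: hash each candidate and compare."""
    for word in candidates:
        if hashlib.md5(word.encode()).hexdigest() == target_hash:
            return word
    return None

print(crack(stored_hash, wordlist))  # -> letmein
```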


The lesson from that late-summer afternoon has stuck with me since, across different domains in the security industry, and it still informs my response to countless promises from hardware vendors and experts touting some new "uncrackable" device: Given physical access, no security will suffice against a dedicated and skilled attacker.

This is a weakness in the iPhone's defense strategy that Apple cannot easily dismiss. In promising that users' fingerprint data will be secure because it is stored as an encrypted hash on the processor, Apple is exposing itself to a tremendous number of potential weaknesses, from poor implementations to weak entropy pools to simple encryption bypass attacks.
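
To illustrate just one of those classes of weakness, consider the weak-entropy problem: if the value protecting stored data is derived from a low-entropy secret, the strength of the hash or cipher is irrelevant. The following Python sketch is purely illustrative, assuming a hypothetical four-digit unlock code; it is not Apple's actual design, which has not been published:

```python
import hashlib
import os

# Illustration of the weak-entropy problem, not Apple's (unpublished)
# design: a strong hash cannot protect data whose secret is guessable.
salt = os.urandom(16)
pin = "4821"  # hypothetical four-digit unlock secret
stored = hashlib.sha256(salt + pin.encode()).hexdigest()

# An attacker with physical access recovers `salt` and `stored`, then
# walks the entire 10,000-value search space in milliseconds.
for guess in (f"{i:04d}" for i in range(10_000)):
    if hashlib.sha256(salt + guess.encode()).hexdigest() == stored:
        print("recovered secret:", guess)
        break
```

However many rounds of hashing sit on top, the attacker's work is bounded by the 10,000 possible inputs, not by the strength of SHA-256.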

While pinpointing where the cracks in this type of defense will emerge requires additional time, research and attention, Apple's logic rests on a claim that has never been true: that physical possession of a device can be kept from yielding its secrets. Especially with the hardware readily available for purchase, it is highly likely that any and all hardware and software vulnerabilities will be found and exploited in short order.

Of course, the sly hacker may never need to crack open the case or find a flaw in the phone's A7 chip; this supposedly secure data may very well fall to simple social engineering. In July, Juniper Networks published a report noting a 614 percent increase since 2012 in the number of malware apps on the mobile market, largely, but not exclusively, targeting Android devices. As the value of the data on Apple's devices increases, an app may simply find a way to exploit the hardware, or even convince a user to hand the data over openly. One can easily imagine a maliciously coded video game, just legitimate enough to slip past Apple's App Store review, that in practice reads fingerprints directly and shuffles them off to parts unknown.

Cui bono, Cupertino?

The second, and perhaps more interesting, line of questioning is whether Apple's case for fingerprint authentication is what it appears to be.

Discussions of security and privacy have changed over the past few months. While this sort of question once would have entailed a theoretical or academic investigation into reversing cryptographic hashes or exploiting insecure chip design, it must now be viewed through the lenses of both political will and corporate credibility.

The most recent revelations, last month, from Edward Snowden's ongoing disclosure of U.S. National Security Agency (NSA) files described how the agency collects data across a vast range of media, primarily through privileged access to telecommunication networks. Many supposedly secure mechanisms, it appears, have been intentionally compromised, presumably under order from the United States' most secretive intelligence agency. Chatter among the security literati has suggested that even the most powerful consumer cryptographic hardware has been outright bypassed by NSA order. Apple may claim that it is not in the same position as a company like Intel, but at the risk of encamping with the tinfoil-hat crowd, couldn't the company be lying, forced to do so under penalty of law? Worse, can anyone really be certain that Apple hasn't been organizationally compromised, its device security undermined unbeknownst even to its own internal design teams?

One must be cautious about seeing threats in every shadow. However, extending an argument about potential risk to its extremes can highlight fundamental questions about what is at stake. We know now, beyond any doubt, that both incredibly well-funded government agencies and a myriad of hackers, each with their own agendas and ideologies, are laying siege to any and all online data. To paraphrase from pop culture, if you put it out there, they will come.

So why is this data there to begin with? Is the supposed benefit — more secure cellphones that can't be accessed easily by common thieves — worth the cost? There is an old maxim in security, attributed to security technologist Bruce Schneier himself, that seems apropos here:

"The only secure computer in the world is unplugged, encased in concrete and buried underground — and even that one might be vulnerable."

Apple claims that its use of fingerprints will be secure: users' biometric data will be stored only in hashed form, in a secured chip. It sounds good, but each person gets only a single set of fingerprints in his or her lifetime. No one knows what the future holds for fingerprints as authentication tokens, but spending them on the security of relatively low-value devices seems ill-advised. No matter how ardently the device manufacturer argues for the unassailable defenses in place, this information is simply less secure by merit of being present on consumer-grade iPhones, subject to so many different attacks, weaknesses and manipulations, than if it were absent.
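
That asymmetry can be put in code. In the sketch below (Python, with invented class and function names; no real API is implied), a password credential has a recovery path when compromised, while a biometric one does not:

```python
import secrets

def generate_new_secret():
    """Issue a fresh random secret: the recovery path passwords have."""
    return secrets.token_urlsafe(16)

class PasswordCredential:
    def __init__(self, secret):
        self.secret = secret

    def rotate(self):
        # Compromised? Replace the secret and move on.
        self.secret = generate_new_secret()

class FingerprintCredential:
    def __init__(self, template):
        self.template = template

    def rotate(self):
        # No recovery path: a person has one set of fingerprints, for life.
        raise RuntimeError("biometric identifiers cannot be reissued")

PasswordCredential(generate_new_secret()).rotate()  # fine
try:
    FingerprintCredential(b"\x00" * 32).rotate()
except RuntimeError as err:
    print(err)  # -> biometric identifiers cannot be reissued
```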

If the concern were that users were not using a passphrase — if Apple had decided to become the champion of iPhone security — why not simply enforce a login code on all devices, rather than using deeply personal biometrics to access a comparatively vulnerable environment?

Who really benefits from this innovation?

This story was provided by Tom's Guide. The views expressed are those of the author and do not necessarily reflect the views of the publisher.