Posts Tagged 'encryption'

Biometrics are not the answer for authentication

I’ve pointed out before that biometrics are not a good way to avoid the obvious and growing problems with password-based authentication.

Many biometrics suffer from being easy to spoof: pictures of someone’s iris, appropriately embedded in a background, can fool iris readers; a sheet of clingfilm can often cause a fingerprint reader to ‘see’ the last real fingerprint used on it; and so on.

But there’s a more pervasive problem with biometrics. The fact that a biometric is something you are is, on the one hand, a positive because you don’t have to remember anything, and wherever you go, there you are.

But, on the other hand, a biometric cannot be changed, and this turns out to be a huge problem.

Suppose you go to authenticate using a biometric. The device that captures your biometric must convert it to something digital, and then compare that digital value to a previously recorded value associated with you.
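The comparison step is worth pausing on, because it is fuzzy rather than exact: two scans of the same eye or finger never digitize identically, so systems typically accept a captured template that is *close enough* to the enrolled one. A minimal sketch, using Hamming distance over a bit-string template in the style of iris codes (the bit-length and threshold here are illustrative, not taken from any real system):

```python
# Sketch of fuzzy biometric matching: compare a freshly captured
# template against the enrolled one by counting differing bits.

def hamming_distance(a: bytes, b: bytes) -> int:
    """Count differing bits between two equal-length templates."""
    return sum(bin(x ^ y).count("1") for x, y in zip(a, b))

def matches(captured: bytes, enrolled: bytes, threshold: float = 0.3) -> bool:
    """Accept if the fraction of differing bits is below the threshold."""
    total_bits = len(enrolled) * 8
    return hamming_distance(captured, enrolled) / total_bits < threshold

enrolled = bytes([0b10110010] * 32)                   # stored at enrolment
captured = bytes([0b10110010] * 31 + [0b10110011])    # fresh scan, 1 bit off

print(matches(captured, enrolled))  # True: small distance, same person
```

Note that this inexactness is itself part of the problem below: the raw template has to exist somewhere in comparable form, because you cannot simply hash it the way you would a password.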

There are two problems:

  1. For a while, the device has your biometric data as plaintext. It may be encrypted very close to the place where it is captured, but there is a gap, and the unencrypted version can potentially be grabbed in the gap. There is always a temptation/pressure to use low-power sensors for capture, and they may not be able to handle the encryption.
  2. The previously recorded values must be kept somewhere. If this location can be hacked, then the encrypted versions of the biometric can be copied. These encrypted versions can then be used for replay attacks.
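The replay risk in point 2 is easy to see in miniature. If the verifier compares a submitted (encrypted) template byte-for-byte against a stored value, then stealing the stored value is as good as stealing the biometric itself: the attacker never needs the plaintext. A hedged sketch, with a toy XOR stand-in for the real cipher and invented names throughout:

```python
# Sketch of a replay attack on stored encrypted biometric templates.
# The XOR "encryption" is a stand-in; real systems would use a proper
# cipher, but the replay logic is the same if ciphertexts are compared.

SYSTEM_KEY = bytes([0x5A] * 16)  # one key shared across all readers

def encrypt(template: bytes) -> bytes:
    """Toy stand-in for the real cipher."""
    return bytes(t ^ k for t, k in zip(template, SYSTEM_KEY))

stored_db = {"alice": encrypt(b"alice-iris-code!")}   # enrolment record

def verify(user: str, submitted_ciphertext: bytes) -> bool:
    """Verifier compares ciphertexts directly -- the weak point."""
    return stored_db[user] == submitted_ciphertext

# Legitimate use: a reader captures the biometric and encrypts it.
print(verify("alice", encrypt(b"alice-iris-code!")))  # True

# Attack: the database leaks; the ciphertext is replayed as-is.
leaked = stored_db["alice"]
print(verify("alice", leaked))  # True -- no plaintext biometric needed
```

The usual defences (challenge–response, per-session nonces, liveness checks) push the problem around rather than eliminating it, which is where the next paragraph picks up.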

Of course, there are defences. But, for example, if e-passports are to be used to enter multiple countries, then they must use the same repertoire of encryption techniques so that passports from multiple countries can be read by the same system. So it’s not enough to argue that having each system use its own mapping from biometric plaintext to ciphertext will prevent these issues.

And if one person’s encrypted biometric is stolen, there’s no practical way to update the systems that rely on it (since they must continue to use the same mapping so that everyone else’s biometrics will still work). More importantly, there’s no way to issue a fresh identity for the person whose data was stolen (“Go and have plastic surgery so that we can restore your use of facial recognition”).


Come back King Canute, all is forgiven

You will remember that King Canute held a demonstration in which he showed his courtiers that he did not have the power to hold back the tide.

Senior officials in Washington desperately need courtiers who will show them, with equal force, that encryption has the same sort of property. If it’s done right, encrypted material can’t be decrypted by fiat. And any backdoor to the encryption process can’t be made available only to the good guys.

The current story about Apple and the encrypted phone used by one of the San Bernardino terrorists is not helping to make this issue any clearer to government, largely because the media coverage is so muddled that nobody could be blamed for missing the point.

The basic facts seem to be these: the phone is encrypted, the FBI have been trying to get into it for some time, and there’s no way for anyone, Apple included, to burn through the encryption without the password. This is all as it was designed to be.

The FBI is now asking Apple to alter the access control software so that, for example, the ten-try limit on password guesses is disabled. Apple is refusing on two grounds. First, this amounts to the government compelling them to construct something, a form of conscription that is illegal (the FBI could presumably contract with Apple to build the required software, but Apple clearly has no appetite for this).
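The ten-try limit matters because it is the only thing standing between a short numeric passcode and exhaustive guessing. A back-of-the-envelope sketch, assuming each guess costs on the order of 80 ms in the phone’s key-derivation hardware (a figure used here purely for illustration):

```python
# Rough arithmetic: with the guess limit removed, how long does it
# take to exhaust a short numeric passcode? The per-guess cost is an
# assumed illustrative figure, not a measured one.

SECONDS_PER_GUESS = 0.08  # assumed cost of one key-derivation attempt

for digits in (4, 6):
    keyspace = 10 ** digits
    worst_case_s = keyspace * SECONDS_PER_GUESS
    print(f"{digits}-digit passcode: {keyspace:,} guesses, "
          f"worst case ~{worst_case_s / 3600:.1f} h")
```

On these assumptions a 4-digit passcode falls in well under an hour, which is exactly why disabling the limit is the thing being asked for.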

Second, Apple argues that the existence proof of such a construct would make it impossible for them to resist the same request from other governments, where the intent might be less benign. This is an interesting argument. On the one hand, if they can build it now, they can build it then, and nobody’s claiming that the required construct is impossible. On the other hand, there’s no question that being able to do something in the abstract is psychologically quite different from having done it.

But it does seem as if Apple is using its refusal as a marketing tool for its high-mindedness and pro-privacy stance. Public opinion might have an effect if only the public could work out what the issues are — but the media have such a tenuous grasp that every story I saw today guaranteed greater levels of confusion.