Thursday, February 26, 2009

Biometrics and Privacy

Since this has become such a big issue in California for privacy advocates in recent weeks (see my past posts on the subject), I'll continue to post expert opinions on the pros and cons of biometrics. This will be particularly important to understand, given that the Legislature will be debating whether to include facial and thumbprint scans in all California licenses in the coming months.

There's no real consensus yet among privacy advocates as to whether ANY additional biometrics in licenses are acceptable, or whether, if implemented with complete transparency and ironclad privacy protections, some middle ground could be reached. My sense is that facial scans are simply unwarranted and intrusive. Period.

In fact, just last week a group of Vietnamese researchers showed that facial scans might not be such a good idea, cracking the system by multiple means. The simplest was to hold a photo of the legitimate user up to the webcam, spoofing it into thinking the user was present. Given the ready availability of photos on sites like MySpace and Facebook, this seems to be an easy route to access.

The researchers also showed that they could mount a brute force attack, generating multiple random fake faces until one eventually gained access. As Professor Duc states in his paper on the hack, "The mechanisms used by those three vendors haven't met the security requirements needed by an authentication system, and they cannot wholly protect their users from being tampered."
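To see why such a brute force attack is plausible against a lax matcher, here is a toy simulation. Everything in it is invented for illustration: the "face" is a random feature vector, the similarity function is a crude inverse-distance score, and the threshold is made up. It is not a model of any of the three vendors' actual systems; it only shows that if the acceptance threshold is loose, random candidates get accepted surprisingly often.

```python
import random

random.seed(1)
DIM = 3           # hypothetical feature-vector size
THRESHOLD = 0.6   # hypothetical (lax) acceptance threshold
TRIALS = 10_000

# the enrolled user's "face", as a random feature vector
target = [random.random() for _ in range(DIM)]

def similarity(a, b):
    # crude inverse-distance score standing in for a real face matcher
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (1.0 + dist)

# feed the matcher random candidate "faces" and count acceptances
hits = sum(
    similarity(target, [random.random() for _ in range(DIM)]) >= THRESHOLD
    for _ in range(TRIALS)
)
print(f"{hits} of {TRIALS} random candidates accepted")
```

With a tighter threshold (or a higher-dimensional matcher), the acceptance count drops sharply, which is exactly the trade-off the researchers say these vendors got wrong.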

With all that said, here's what Toby Stevens from Computer Weekly had to say:

I'm personally not too concerned about the application of biometric technologies in appropriate situations. What worries me are the processes and broader IT systems that depend on those technologies. Biometrics occasionally throw up false acceptances or false rejections. The problem is that the systems and officials that depend on those biometrics, and the databases of personal information to which they are linked, place too much dependence on them and then make ridiculous decisions as a result. The attitude of "there's a biometric involved so it must be correct" is very dangerous indeed - ask the people who have suffered wrongful arrest, rendition and torture as a result of stupid decisions made on the back of biometric system errors (more on this in a forthcoming blog article).
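The false acceptances and false rejections Stevens mentions are usually quantified as two error rates that trade off against each other via a threshold. Here is a minimal sketch of that arithmetic; the similarity scores and the threshold are invented for illustration, not taken from any real system:

```python
# Hypothetical match scores (0.0 = no match, 1.0 = perfect match).
genuine_scores = [0.91, 0.88, 0.62, 0.95, 0.79]   # same person re-scanned
impostor_scores = [0.15, 0.42, 0.71, 0.08, 0.33]  # different people

THRESHOLD = 0.70  # accept the claim if score >= threshold

# a genuine user scoring below threshold is a false rejection
false_rejections = sum(s < THRESHOLD for s in genuine_scores)
# an impostor scoring at or above threshold is a false acceptance
false_acceptances = sum(s >= THRESHOLD for s in impostor_scores)

frr = false_rejections / len(genuine_scores)    # false rejection rate
far = false_acceptances / len(impostor_scores)  # false acceptance rate

print(f"FRR = {frr:.0%}, FAR = {far:.0%}")  # → FRR = 20%, FAR = 20%
```

Raising the threshold lowers the false acceptance rate but raises the false rejection rate, and vice versa. Neither rate is ever zero in practice, which is why "there's a biometric involved so it must be correct" is such a dangerous attitude.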


The problem is that all too often the organisations implementing biometric systems have failed to be transparent about the purpose or operation of the system, and this has reinforced mistrust of the technologies. School implementations are once again an example, since local authorities have often refused to discuss details of their fingerprinting approaches, or even to seek valid consent to that use of personal information, believing it to be covered by statutory processing permissions.


Should we gather biometric templates or biometric images? The most complex and expensive part of a biometric scheme is enrolment of the data subjects into the system. Algorithms and technologies are developing quickly, and to protect the investment it is tempting to capture images (a high-quality scan of the biometric, e.g. a digital photo or high-quality voice recording) so that templates (mathematical products derived from that image, which can be used to confirm a biometric but cannot be used to recover the original image) can be regenerated when needed. And indeed, nine times out of ten organisations go for the image option, believing it will future-proof their investment. But templates have fewer privacy implications than images: a stolen image can (in theory) be used to assist in attacks on the user's identity, whilst a stolen template is of far less use. Moreover, once a biometric image has been stolen and used for fraud, it can't be revoked - you can't change your fingerprints!
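The image-versus-template distinction can be made concrete with a toy sketch. Everything here is a stand-in: the "scan" is a list of fake pixel values, the feature extraction is just block averaging, and the template is a hash of quantised features. Real biometric templates use fuzzy matching rather than exact hashes, but the privacy property Stevens describes is the same: the stored template can confirm a fresh scan, yet it cannot be reversed into the original image.

```python
import hashlib

def extract_features(image_pixels):
    # pretend feature vector: coarse averages over blocks of pixels
    block = 4
    return [
        sum(image_pixels[i:i + block]) // block
        for i in range(0, len(image_pixels), block)
    ]

def make_template(image_pixels):
    # quantise the features, then hash them; the hash is one-way, so
    # the stored template cannot be turned back into the image
    features = extract_features(image_pixels)
    quantised = bytes(f // 16 for f in features)
    return hashlib.sha256(quantised).hexdigest()

enrolled_image = [10, 12, 11, 9, 200, 198, 205, 199]  # fake "scan"
stored_template = make_template(enrolled_image)

# a later scan of the same finger: slightly noisy pixel values, but the
# quantised features land in the same bins, so the templates match
later_scan = [11, 11, 12, 10, 201, 197, 204, 200]
print(make_template(later_scan) == stored_template)  # → True
```

If only `stored_template` leaks, the attacker holds a one-way digest; if `enrolled_image` leaks, they hold the biometric itself, and as Stevens notes, you can't change your fingerprints.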

Not surprisingly, the answers to our key questions can be derived quickly, easily and with a minimum of cost. Every biometric application should have a Privacy Impact Assessment (PIA) as part of its business case, completed before any procurement or development commences. The PIA should consider whether biometric technologies are a proportionate and acceptable solution to the problem in hand; whether the application should seek to identify or authenticate the users; and, if so, whether it is really necessary to capture an image at the time of enrolment, or whether a template alone will deliver the necessary functions.

In many ways it was precisely the lack of this kind of analysis, rigorous study and ironclad safeguards demonstrated by the DMV last month that led our privacy coalition to take such aggressive action. Surely this issue, and technology, can be dealt with in a more transparent and meticulous way in the coming months.
