
A Massachusetts man found himself trying to prove his identity this spring after a facial recognition system pegged his driver’s license as a fake. The problem: He wasn’t using a fake license. He merely looked like another driver.

John H. Gass of Needham, Mass., got a letter in the mail informing him that his license had been revoked, according to a report in the Boston Globe. The system is designed to track down fake IDs by flagging people who look similar to other motorists in the database. But Gass’ license was legitimate; he just happened to share facial features with another of Massachusetts’ 4.5 million drivers. Gass won a hearing and proved his identity within a couple of weeks, allowing him to drive again.

The saga illustrates the key problem with using facial recognition technology for law enforcement. False positives are inevitable, since no system is perfect, but what happens when a false positive affects someone’s driving record, criminal history or other sensitive information?

The system that made such a mess of Gass’ case examines each driver’s license photo stored in the state’s database, mapping facial data points on every image and comparing them against the rest. The software flags licenses with similar-looking photographs, and DMV officials then check the drivers’ information to sort it all out, the Globe explains.
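The Globe’s description matches the standard pipeline for one-to-many face matching: each photo is reduced to a numeric feature vector (the “facial data points”), vectors are compared pairwise, and pairs above a similarity threshold are flagged for human review. Here is a minimal sketch of that comparison step, assuming the feature vectors have already been extracted; the vendor’s actual feature extraction, similarity measure, and threshold are not public, so the names and numbers below are illustrative only:

```python
import numpy as np

def flag_similar_licenses(embeddings: np.ndarray, threshold: float = 0.95):
    """Flag pairs of license photos whose face embeddings look suspiciously alike.

    embeddings: (n_photos, d) array, one feature vector per license photo.
    threshold:  cosine-similarity cutoff (hypothetical; a real system would
                tune this against an acceptable false-positive rate).
    Returns a list of (i, j, similarity) pairs for human review.
    """
    # Normalize each embedding to unit length so dot products equal cosine similarity.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / norms
    sims = unit @ unit.T  # pairwise cosine similarities

    flagged = []
    n = len(embeddings)
    for i in range(n):
        for j in range(i + 1, n):  # each unordered pair once; skip self-matches
            if sims[i, j] >= threshold:
                flagged.append((i, j, float(sims[i, j])))
    return flagged

# Toy example: three made-up 4-d "embeddings"; photos 0 and 1 are near-duplicates.
demo = np.array([[1.0, 0.1, 0.0, 0.2],
                 [0.98, 0.12, 0.01, 0.19],
                 [0.0, 1.0, 0.5, 0.0]])
print(flag_similar_licenses(demo))  # flags the (0, 1) pair, similarity ~0.999
```

The false-positive problem follows directly from this design: with 4.5 million photos there are on the order of 10^13 possible pairs, so even a vanishingly small per-comparison error rate will surface innocent lookalike matches such as Gass’.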

Massachusetts bought the system with a $1.5 million grant from the Department of Homeland Security. At least 34 states use such systems, which law enforcement officials say help prevent identity theft and ID fraud. Last year, Massachusetts State Police obtained 100 arrest warrants for identity fraud, and 1,860 licenses were revoked because of the software, according to the Globe.

Last week, we told you about plans to deploy the Mobile Offender Recognition and Identification System (MORIS) in police stations throughout the country, where officers can use an augmented iPhone to snap pictures of people and compare their images to a database.

The controversy surrounding that system has centered on privacy: should cops need a warrant to take pictures of people in a public space for the purposes of criminal identification? But the DMV facial-recognition story should be another warning. What happens when someone gets arrested simply because an algorithm determines he looks like Undesirable No. 1?

[Boston.com]