FBI using facial recognition despite privacy concerns
For 19 years, Lynn Cozart eluded authorities after being convicted of sexually assaulting his three children.
He failed to show up for his sentencing hearing and seemed to drop off the map. So in a desperate bid to track down the Pennsylvania native, an FBI agent submitted Cozart's mug shot to the agency's newly created Next Generation Identification (NGI) system, which, among other things, uses facial recognition software to identify suspects.
Facial recognition software captures an image of an individual's face from a photo or video and identifies him or her by matching the picture against a database of faces.
After Cozart's arrest photo was sent out to several state agencies, Arkansas' motor vehicle department came back last month with a match from its collection of driver's license photos. The convicted child molester was then tracked down to Oklahoma, where he was found working at a Walmart under an assumed name and subsequently arrested.
"You take a case that had a 19 year gap, or the guy was on the run for 19 years," said Stephen L. Morris, the assistant director of the FBI's Criminal Justice Information Services (CJIS) Division, which includes NGI.
"Technology did result in the identification of that guy because it happened to provide them a lead they were able to run down in Oklahoma," he told CBS News. "When the task force in Oklahoma started running it down, they were able to verify the individual under a different name was one in the same as the individual working in Walmart."
The NGI system is the FBI's ambitious effort to bring its databases of tens of millions of mug shots, fingerprints and other data on criminals into the 21st century.
The system, which was designed by Lockheed Martin at a reported cost of $1 billion and went live in September, allows police departments around the country to submit photos and videos of all sorts and get a response back within two hours based on the facial recognition technology. And as the Cozart case shows, the FBI can go well beyond that database to access photos kept by a range of state and federal agencies - including the State Department, which keeps a vast array of passport and visa photos.
The NGI system contains nearly 125 million criminal and civilian fingerprints and 24 million mug shots. As the technology matures, the FBI said, the system will have the capability to analyze everything from a suspect's scars and tattoos to their voice and eyes - think of the iris scanners in the movie "Minority Report."
The expanded reach of the system has alarmed some privacy advocates, who fear the power of the technology could lead to abuses like mass surveillance or tracking innocent people.
The Electronic Frontier Foundation, which describes itself as "defending your rights in the digital world," sued the FBI in 2013 to "shine light on the program and its face-recognition components."
"NGI will result in a massive expansion of government data collection for both criminal and noncriminal purposes," EFF Staff Attorney Jennifer Lynch, who testified before the U.S. Senate on the privacy implications of facial recognition technology in 2012, said in a statement last year. "Biometrics programs present critical threats to civil liberties and privacy. Face-recognition technology is among the most alarming new developments, because Americans cannot easily take precautions against the covert, remote, and mass capture of their images."
Jay Stanley, a senior policy analyst at the American Civil Liberties Union, told CBS News he also saw potential risks with the system, given the lack of adequate privacy safeguards and the growing power of national security agencies since the 9/11 terror attacks.
"Nobody has any objection to the FBI identifying an unknown perpetrator caught on a video and catching that person if they committed a crime," Stanley said.
"But compared to fingerprints, facial recognition as a biometric is very susceptible to abuse because it can be applied to a person without their knowledge, let alone their permission or participation," he said. "There is a lot of potential for facial recognition databases to be applied for mass surveillance, for identifying people in a context where they don't expect or want to be identified."
Morris downplayed those concerns and said that, when it comes to collecting and storing information, the system was simply a faster way of doing what has been done since "law enforcement was a profession."
He also said the FBI was only "leveraging technology" that many people already come into contact with on a daily basis. Facebook and other social networking sites rely on facial recognition technology, and it is increasingly being deployed for marketing and security purposes by malls, casinos and even churches.
"Taxpayers and U.S. citizens have an expectation that we will try to leverage modern day technology to make our processes more efficient and that is essentially what we have done," Morris said.
"We are using a technology that is widely available to help us do our job," he continued. "Rather than us looking through 24 million photographs to find our subject, we have automated that process and we are leveraging our technology to help us narrow down those suspects."
Morris insisted the searches were not violating anyone's privacy and that there were several safeguards in place to ensure abuses didn't occur.
Searches return several potential matches - from two to as many as 50 - to reduce the risk of false positives, and any potential match must be followed up with traditional detective work that rules out the wrong candidates and confirms the right one.
"We don't make positive identification. The technology is just not there yet to say it is reliable to make a positive identification like you would have with fingerprints," he said. "We have to be absolutely certain who we think we have in the photo. It's more than just hunch or more than just a guess. We have to make a calculated guess and that is why these responses go back as candidates."
Morris said there were also limits on how the system could be used, noting that requests have to be made from somewhere like a police station, since any gallery of candidates returned has to be examined by a facial recognition specialist before a match can be confirmed.
"Clearly, that is not going to happen on the side of the street. It's not going to happen in sub-minute time frames," he said. "There is investigative work that has to be done when the agency gets its results back."
Michigan State University's Anil Jain, who specializes in pattern recognition, computer vision and biometric recognition, said the technology has advanced to the point where the rates of false negatives and false positives have dropped dramatically.
But he also warned there were still significant limits to the technology - especially when a person's face is covered. Facebook has found a workaround - its algorithm can identify a person from the clothes they wear and their hairstyle - but that approach has yet to migrate to law enforcement.
To demonstrate this, Jain and several colleagues showed that face recognition could have been deployed to identify one of the Boston Marathon bombers, Dzhokhar Tsarnaev, but not his older brother, Tamerlan, because sunglasses shielded his face. In a subsequent study, Jain and his co-authors showed that when a sketch artist drew a composite of the older brother's face based on video frames, researchers were only able to narrow the choices down to the top 100 best-matching images from a database of 1 million.
His message: Don't expect the technology to replace good old-fashioned police work anytime soon.
"Over the last 20 years, facial recognition technology has improved significantly to the point that mug shot photos can be easily matched with other mug shot photos," he said. "But if the amount of the face visible is very small, there is no hope for facial recognition. Then, we have to rely on other information. If there scar on cheek or tattoo on arm or neck those things start playing an important role."