Soon, airport facial-recognition software to detect stressed-out tourists
By ANI
Thursday, September 16, 2010
WASHINGTON - With new advances in facial-recognition software, airport security workers might one day know with near certainty whether they’re looking at a stressed-out tourist or staring a terrorist in the eye.
A research team led by Dr. Alice O’Toole, a professor in The University of Texas at Dallas’ School of Behavioral and Brain Sciences, is evaluating how well these rapidly evolving recognition programs work.
The researchers are comparing the software's success rates with those of non-technological, but presumably "expert," human evaluation.
“The government is interested in spotting people who might pose a danger. But they also don’t want to have too many false alarms and detain people who are not real risks,” said O’Toole.
O'Toole's team is carefully examining where the algorithms succeed and where they come up short, using point-by-point comparisons of similarities among millions of faces captured in a database and then comparing those results with the algorithms' determinations.
In the studies, humans and algorithms decided whether pairs of face images, taken under different illumination conditions, were pictures of the same person or different people.
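As a rough illustration of that pair-matching task, the sketch below scores a pair of face images and applies a threshold to decide "same person" or "different people." The embedding function, the threshold value, and the synthetic images are placeholders for illustration, not details taken from the study.

```python
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    """Placeholder for a face-recognition model that maps an image to a
    fixed-length feature vector. A real system would use a trained
    algorithm here; this stub just derives a repeatable random vector."""
    rng = np.random.default_rng(int(image.sum()) % (2**32))
    return rng.standard_normal(128)

def similarity(image_a: np.ndarray, image_b: np.ndarray) -> float:
    """Cosine similarity between the two face embeddings."""
    a, b = embed_face(image_a), embed_face(image_b)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(image_a, image_b, threshold: float = 0.5) -> bool:
    """Decide 'same person' if the similarity clears the threshold.
    Raising the threshold cuts false alarms but also misses more matches."""
    return similarity(image_a, image_b) >= threshold

# Toy usage with synthetic "images" (real input would be aligned face crops).
img1 = np.random.rand(64, 64)
img2 = np.random.rand(64, 64)
print(same_person(img1, img2))
```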
The UT Dallas researchers have worked with algorithms that match up still photos and are now moving into comparisons involving more challenging images, such as faces caught on video or photographs taken under poor lighting conditions.
“Many of the images that security people have to work with are not high-quality. They may be taken off closed-circuit television or other low-resolution equipment,” said O’Toole.
The study is likely to continue through several more phases, as more and better software programs are presented for review. So far, the results of man vs. machine have been a bit surprising, said O’Toole.
"In fact, the very best algorithms performed better than humans at identifying faces. Because most security applications have relied primarily on human comparisons until now, the results are encouraging about the prospect of using face-recognition software in important environments," she said.
The real success comes when the software is combined with human evaluation techniques, said O’Toole.
When the software was used to flag potential high-risk individuals and its judgments were then combined with those of a human evaluator, nearly 100 percent of matching faces were identified, said O'Toole.
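One simple way to combine machine and human judgments, in the spirit of the fusion O'Toole describes, is to average the two scores before thresholding. The weights, scores, and threshold below are illustrative assumptions, not values reported in the study.

```python
def fuse_scores(algorithm_score: float, human_rating: float,
                weight: float = 0.5) -> float:
    """Weighted average of an algorithm's similarity score and a human
    rater's judgment, both assumed to be scaled to the range [0, 1]."""
    return weight * algorithm_score + (1.0 - weight) * human_rating

def fused_decision(algorithm_score: float, human_rating: float,
                   threshold: float = 0.6) -> bool:
    """Declare a match when the fused score clears the threshold."""
    return fuse_scores(algorithm_score, human_rating) >= threshold

# Illustrative example: the algorithm is fairly confident, the human less so.
print(fused_decision(algorithm_score=0.82, human_rating=0.55))
```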
Next, using a test that spanned all false-alarm rates, the researchers compared a Western-developed algorithm and an East Asian-developed algorithm with human observers of Caucasian and East Asian descent, all matching face identity in an identical set of stimuli.
In this case, both algorithms performed better on the Caucasian faces, the “majority” race in the database.
The Caucasian face advantage was far larger for the Western algorithm than for the East Asian algorithm.
Humans showed the standard other-race effect for these faces, but showed more stable performance than the algorithms over changes in the race of the test faces.
These findings indicate that state-of-the-art face-recognition algorithms, like humans, struggle with “other-race face” recognition, said O’Toole.
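A test that "spans all false-alarm rates" is, in effect, an ROC analysis: sweep the decision threshold over the similarity scores and record the hit rate against the false-alarm rate at each setting. The scores in this sketch are made up for illustration; only the general procedure is implied by the study description.

```python
import numpy as np

def roc_curve(same_scores, diff_scores):
    """Sweep a threshold over all observed similarity scores and return
    (false-alarm rate, hit rate) pairs: hits are same-person pairs scored
    at or above the threshold, false alarms are different-person pairs
    at or above it."""
    same = np.asarray(same_scores)
    diff = np.asarray(diff_scores)
    thresholds = np.sort(np.concatenate([same, diff]))[::-1]
    points = []
    for t in thresholds:
        hit_rate = float(np.mean(same >= t))
        false_alarm_rate = float(np.mean(diff >= t))
        points.append((false_alarm_rate, hit_rate))
    return points

# Made-up similarity scores for same-person and different-person pairs.
same_pairs = [0.92, 0.85, 0.78, 0.66, 0.60]
diff_pairs = [0.70, 0.55, 0.48, 0.35, 0.20]
for fa, hit in roc_curve(same_pairs, diff_pairs):
    print(f"false-alarm rate {fa:.2f} -> hit rate {hit:.2f}")
```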
The companies that develop the most reliable facial recognition software are likely to reap big profits down the line.
“Casinos have been some of the first users of face recognition software. They obviously want to be able to spot people who are counting cards and trying to cheat the casino,” said O’Toole.
The study will be published in ACM Transactions on Applied Perception. (ANI)