
New TSA security tech raises concerns about the potential to misidentify minority travelers

The future of airport security is getting more high-tech as the TSA continues testing a system that scans your face during the screening process.

The pilot program with facial recognition technology is running at several airports nationwide.

After scanning your ID, the TSA says this new system takes your photo and verifies your information. The agency believes this will enhance security and cut wait times.

An agency official said if the technology thinks you’re someone else, a security worker would manually check your ID.

“To me, [it] just invites further law enforcement scrutiny, invites further contact with authorities in ways that are not going to be helpful,” said Vincent Southerland, NYU Assistant Professor of Clinical Law.

Southerland helps run the Center on Race, Inequality, and the Law at NYU. He believes an incorrect match could be problematic for minority travelers.

“[It can] force the person to kind of have to prove or demonstrate their identity in weird ways that they otherwise would not have, and that is of course going to lead to conflict,” said Southerland. “And conflict often does not end very well for folks who have been traditionally marginalized, oppressed and targeted by law enforcement.”

An incorrect match has happened before with facial recognition technology.

Take the case of Robert Williams, a Black man who the ACLU says was wrongfully arrested by Detroit police in 2020. The organization said facial recognition software incorrectly identified him as a shoplifting suspect. The ACLU says the charges were eventually dropped.

Depending on how it’s used, this software compares your image to a single photo, like the one on your ID, or to an entire database of photos.

So what happens when travelers don’t look exactly like their ID?

A 2019 federal government study by the National Institute of Standards and Technology (NIST) echoes some of those concerns. It found that Asian and African American people were up to 100 times more likely to be misidentified than white people, depending on the algorithm and the type of search.

“The bigger finding from the 2019 study was that the false positive rates, where somebody else could use your passport or somebody else could access your phone, vary much, much more widely,” said Patrick Grother of NIST.

Grother is one of the computer scientists who worked on the study. He said the team evaluated hundreds of algorithms and reviewed the rates of false negative and false positive matches.

“One of the mistakes they’ll make is a false negative mistake, where it doesn’t associate two photos of you; it doesn’t put them together,” said Grother. “The other mistake is where it takes two photos, of me and somebody else, and says it’s the same person. So that’s a false positive error.”
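
To illustrate the distinction Grother describes, here is a minimal, hypothetical sketch in Python of how a one-to-one face verification decision is typically made. It is not the TSA’s or NIST’s actual software; the embedding step, the similarity measure and the threshold value are illustrative assumptions.

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Similarity between two face "embeddings" (higher means more alike).
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(embedding_a: np.ndarray, embedding_b: np.ndarray,
                threshold: float = 0.6) -> bool:
    # Declare a match only if the similarity score clears the threshold.
    # False negative: two photos of the SAME traveler score below the
    # threshold, so the system fails to "put them together."
    # False positive: photos of two DIFFERENT people score above the
    # threshold, so the system wrongly says they are the same person.
    return cosine_similarity(embedding_a, embedding_b) >= threshold

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    id_photo = rng.normal(size=128)                            # embedding from the ID photo
    live_capture = id_photo + rng.normal(scale=0.1, size=128)  # same person, new photo
    stranger = rng.normal(size=128)                            # a different traveler

    print(same_person(id_photo, live_capture))  # False here would be a false negative
    print(same_person(id_photo, stranger))      # True here would be a false positive

Raising or lowering that threshold trades one kind of error for the other, which is why NIST reports the two rates separately.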

Grother said these results went directly to the developers.

“We continue to track the technology and whether it’s improving with respect to accuracy and with respect to these demographic effects,” said Grother.

TSA declined our request for an on-camera interview about these racial disparities within facial recognition technology, so the Washington News Bureau took those questions directly to the White House.

“The TSA continuously evaluates and improves upon technology and process to better protect the traveling public and ensure that screening is conducted in a manner that respects the dignity of each individual. That is clearly a priority,” said White House Press Secretary Karine Jean-Pierre. “DHS is working with NIST to assess the performance of face recognition technology and reduce demographic differentials, particularly as it relates to race and gender.”

The Biden administration says the Department of Homeland Security will continue to evaluate and improve the system, but Southerland still has concerns.

“By rolling these types of tools out when they have these problems that are baked into them, you’re almost normalizing the types of harms that we anticipate seeing,” he said.

Southerland says the price of these problems is high, especially if an incorrect match leads to a criminal accusation.

“You’re going to be separated from your family, separated from your loved ones, lose opportunities for housing, employment, mental health and regular health care,” said Southerland. “All the range of consequences that flow from criminal justice involvement are almost compounded by having separate technologies or tools in the hands of law enforcement.”

The TSA has not finalized when facial recognition will roll out nationwide. The agency says the new system is performing well and has found no major differences across gender, race and skin tone.

The TSA said any incorrect matches will be reported back to the agency for review.

The agency also sent Boston 25’s Washington News Bureau this statement about the technology:

TSA is exploring the use of one-to-one and one-to-few facial identification to automate identity verification at airport checkpoints and modernize the screening experience for passengers. Biometric technology has the potential to enhance security effectiveness, improve operational efficiency, and yield a more streamlined passenger experience at the TSA checkpoint. TSA recognizes that biometric solutions must be highly usable for all passengers and operators, considering the diversity of the traveling public.

These pilots are entirely voluntary. While we are informed the preliminary results are encouraging, TSA continues to monitor these pilots to ensure there is no inherent bias in the technology.


This is a developing story. Check back for updates as more information becomes available.


