New AI security measure at Mall of America raises concerns for some
The Mall of America recently launched new facial recognition technology to help identify people who could pose a security risk to mall patrons or businesses.
Facial recognition is a kind of artificial intelligence that can identify a person by measuring their facial features and matching them with other images.
Mall of America officials say the system relies on technology from Corsight, a private Israeli company, and is integrated with security cameras located in key areas to identify people of interest — including those who are prohibited from being in the building, or those who have been reported as missing or who may be in danger.
Mall officials say it’s been tested by the National Institute of Standards and Technology (NIST) and the Department of Homeland Security and has a 99.3 percent accuracy rate.
But that number doesn’t mean much to Clare Garvie — a training and resource counsel with the National Association of Criminal Defense Lawyers.
“Facial recognition reliability or accuracy cannot be accurately described by a single number. There are a number of different reliability metrics that are really important to understanding how accurate or how reliable, or in other words, how susceptible to error, a facial recognition program is,” Garvie said.
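To make that point concrete, consider a small, purely illustrative Python sketch. The counts below are invented and describe no real system, Corsight’s included; they simply show how two hypothetical systems can report the same overall accuracy while differing sharply in how often they miss a person of interest versus how often they wrongly flag a bystander.

# Illustrative only: two hypothetical face recognition systems with the same
# overall accuracy but very different error profiles. The counts are invented
# and do not describe the Mall of America's system.
def rates(true_pos, false_pos, true_neg, false_neg):
    total = true_pos + false_pos + true_neg + false_neg
    accuracy = (true_pos + true_neg) / total
    false_match_rate = false_pos / (false_pos + true_neg)      # bystanders wrongly flagged
    false_non_match_rate = false_neg / (false_neg + true_pos)  # persons of interest missed
    return accuracy, false_match_rate, false_non_match_rate

# System A misses 40 of 100 persons of interest but rarely flags bystanders.
print(rates(true_pos=60, false_pos=10, true_neg=9900, false_neg=40))
# System B misses only 5 of 100 but flags far more bystanders; overall accuracy is identical.
print(rates(true_pos=95, false_pos=45, true_neg=9865, false_neg=5))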
Ravi Bapna is a professor of business analytics and information systems at the University of Minnesota Carlson School of Management.
The mall hired Bapna to vet the system, which, as he explains, examines already-existing photos and searches for matches.
If no match is found, he says the data isn’t stored. Bapna thinks the technology can help keep the mall safer and has become more reliable over the years, but it’s not perfect.
“It could either make two types of mistakes: There could be a false negative. Somebody who’s not supposed to be in the mall walks in and the system misses this person, so you can think of that as a miss,” Bapna said. “Or it could be a false positive, which is somebody who’s not essentially in the database, but is classified as such.”
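The two error types Bapna describes can be pictured with a simplified, threshold-based matcher. The sketch below is a hypothetical illustration, not Corsight’s algorithm; the similarity scores and cutoff are invented, and it only shows how a single threshold trades misses against wrongful matches.

# Hypothetical illustration of the two error types Bapna describes.
# This is not Corsight's algorithm; the scores and threshold are invented.
MATCH_THRESHOLD = 0.80  # similarity above this triggers a person-of-interest alert

def would_alert(similarity_score: float) -> bool:
    """Return True if this hypothetical system would flag the face."""
    return similarity_score >= MATCH_THRESHOLD

# False negative (a "miss"): a banned person walks in, but poor lighting or a
# partial view yields a low similarity score, so no alert is raised.
print(would_alert(0.62))   # False

# False positive: a visitor who is not in the database happens to resemble a
# listed person closely enough to cross the threshold, so an alert is raised.
print(would_alert(0.84))   # True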
False positives are a concern for Munira Mohamed. She’s a policy associate with the American Civil Liberties Union of Minnesota and says she’s worried the technology will misidentify and target people of color.
“So the first issue that we have is that there’s problems with accuracy. It’s not always guaranteed that the technology will work. And we’ve seen essentially really volatile accuracy rates, depending on how you look,” Mohamed said.
The National Institute of Standards and Technology found higher rates of false positives for women, and Asian and African Americans compared to their white peers.
The new technology is one of many security measures at the mall, some visible to visitors and some not.
Mall officials were not available for an interview. In a statement, a spokesperson said the Mall “understands the importance of navigating between security and privacy.” The spokesperson also said the mall’s security team does not act on alerts from the technology alone.
“Mall of America’s policy specifically prohibits our security team from approaching a potential match based solely on the technology,” reads the statement. “When the system alerts to a potential POI [person of interest] match, our security team conducts a thorough investigation including a traditional facial recognition check with up to three layers of human visual review comparing the photo of the POI to the individual identified as a potential POI.”
Mall officials also said the technology does not identify or store facial data for anyone who is not a person of interest.
Bloomington Police Chief Booker Hodges says facial recognition technology can be beneficial and he’s not overly concerned that it will be used to racially profile people at the mall.
But he says his officers won’t arrest someone at the mall based solely on facial recognition.
“Without the human checks and balances, I think it is flawed and I think it’s inappropriate to use that technology to develop probable cause arrest without human verification behind it,” Hodges said.
According to Hodges, the police department does not currently use facial recognition technology and does not have access to the mall’s database unless the mall shares specific information.
After the mall announced the use of facial recognition technology, two state senators shared statements opposing the mall’s use of the system.
Sen. Omar Fateh, DFL-Minneapolis, and Sen. Eric Lucero, R-Dayton, cited concerns about privacy and data storage.
“Public policy concerns surrounding privacy rights and facial recognition technologies have yet to be resolved, including the high risks of abuse, data breaches, identity theft, liability and accountability,” reads a statement from Lucero.
Fateh shared a similar sentiment in his statement: “Even in cases where the system does identify someone correctly, it is not yet clear how that data will be stored, distributed or protected from data breaches.”
He also said legislators should consider action to regulate or ban the new technology.
“We hope to act on these concerns as soon as possible, including if a special session is called this year,” Fateh said.