Software capable of recognizing faces is not a particularly new technology. U.S. military and intelligence agencies have used it for years to identify possible terrorists. Now, however, a variety of companies, large and small, are marketing the technology both to U.S. law enforcement agencies and to schools.
Wider adoption of facial recognition technology may have been inevitable, but how effective is it — or can it be — in stopping attacks like those against schools in Newtown, Connecticut, or Parkland, Florida? Evidence of its effectiveness is, so far, pretty thin.
Partly that’s due to shortcomings in the technology itself. Last February The New York Times reported on research at MIT’s Media Lab showing that facial recognition software made by Microsoft, IBM and a small firm named Face++ is far likelier to misidentify even the gender of a black woman than that of a white man.
The software misidentified just 1% of 385 photos of white men but incorrectly identified 35% of 271 photos of darker-skinned women. Because current artificial intelligence (AI) algorithms “learn” primarily from the examples in their training data, a system trained mostly on photos of white men will be much worse at identifying black women.
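The mechanism behind that disparity can be illustrated with a deliberately simplified sketch (this is a toy model, not the MIT study or any vendor’s system): when one group dominates the training data, a classifier’s decision boundary is tuned to that group, and error rates on an underrepresented group whose features are distributed differently climb sharply. All the numbers and distributions below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a one-dimensional "face feature" separates two classes.
# Group A (majority) and group B (minority) have the same two classes,
# but group B's feature values are shifted by +1.0.
def sample(n, shift):
    x0 = rng.normal(0.0 + shift, 0.4, n)   # class 0 examples
    x1 = rng.normal(2.0 + shift, 0.4, n)   # class 1 examples
    return np.concatenate([x0, x1]), np.concatenate([np.zeros(n), np.ones(n)])

# Imbalanced training set: 500 majority samples per class, only 10 minority.
xa, ya = sample(500, 0.0)
xb, yb = sample(10, 1.0)
x_train = np.concatenate([xa, xb])
y_train = np.concatenate([ya, yb])

# Trivial "classifier": threshold halfway between the two class means.
# The means are dominated by group A, so the threshold suits A, not B.
c0 = x_train[y_train == 0].mean()
c1 = x_train[y_train == 1].mean()
threshold = (c0 + c1) / 2
predict = lambda x: (x > threshold).astype(float)

# Evaluate on fresh, equal-sized test sets for each group.
xa_t, ya_t = sample(1000, 0.0)
xb_t, yb_t = sample(1000, 1.0)
err_a = (predict(xa_t) != ya_t).mean()
err_b = (predict(xb_t) != yb_t).mean()
print(f"majority-group error: {err_a:.1%}, minority-group error: {err_b:.1%}")
```

With this synthetic data the majority group’s error stays in the low single digits while the minority group’s error is an order of magnitude higher, even though the classifier was never designed to treat the groups differently; the skew comes entirely from what it was trained on.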
That’s just one issue. The Washington Post reported Friday on companies that market their facial recognition programs to schools as a way to help prevent mass shootings. One firm, Israel-based AnyVision, even uses images of the shootings at Sandy Hook Elementary School and Marjory Stoneman Douglas High School in its presentations to school officials.
Washington Post reporter Drew Harwell writes:
AnyVision chief executive Eylon Etshtein says it’s obvious why schools would want the technology: If a kid arrives one day “wearing all black and carrying a big bag, you’re probably going to want to know what the kid is doing and what’s inside the bag.”
But some experts were skeptical of how well it would work. “Teenagers are anomalies,” [Andrew] Ferguson, [law] professor [at the University of the District of Columbia], said. “Is it suddenly going to be suspicious that a teenager dyed their hair or looks depressed?”
For schools with surveillance cameras already installed, adding facial recognition technology can be inexpensive and require no more effort than installing a new piece of software. Even if a school has to buy and install the cameras, a facial recognition system may be a less controversial step than arming teachers.
What about privacy issues? The CEO of AI video system company BriefCam told The Post, “I don’t hear them raised. Safety and security trumps those concerns.”
So far there are no reports of a potential shooter being identified by a school’s facial recognition system. Even if one had been, the school would still need a plan for physically responding to the threat. Facial recognition can’t stop bullets.