Face recognition and tracking people

Content warning for mention of possible use of publicly available tracking technology by stalkers and harassers.


The New York Times published an article last week about creating a face-recognition system using publicly available camera feeds:

To demonstrate how easy it is to track people without their knowledge, we collected public images of people who worked near Bryant Park (available on their employers’ websites, for the most part) and ran one day of footage through Amazon’s commercial facial recognition service. Our system detected 2,750 faces from a nine-hour period (not necessarily unique people, since a person could be captured in multiple frames). It returned several possible identifications, including one frame matched to a head shot of […] a professor at the SUNY College of Optometry, with an 89 percent similarity score. The total cost: about $60.

(They received permission from the professor in question to talk about him in the article.)
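
For concreteness, here's a rough sketch of the kind of pipeline the article describes, using Amazon Rekognition through the boto3 Python library. This isn't the Times's actual code, and the collection name, file names, label, and threshold here are all illustrative assumptions; but the general shape matches what they describe: index known headshots into a collection, then search each camera frame against it and read off similarity scores.

```python
import boto3

rekognition = boto3.client("rekognition")

# 1. Build a collection of known faces from headshots gathered ahead of time
#    (the article's equivalent: photos from employers' websites).
#    "known-faces", the file names, and the label are illustrative assumptions.
rekognition.create_collection(CollectionId="known-faces")
with open("headshot.jpg", "rb") as f:
    rekognition.index_faces(
        CollectionId="known-faces",
        Image={"Bytes": f.read()},
        ExternalImageId="example-person",
    )

# 2. Search a frame from the public camera feed against the collection.
with open("camera_frame.jpg", "rb") as f:
    response = rekognition.search_faces_by_image(
        CollectionId="known-faces",
        Image={"Bytes": f.read()},
        FaceMatchThreshold=80,  # only return matches at >= 80% similarity
        MaxFaces=5,
    )

# 3. Read off the matches; the article's 89-percent match is this kind of
#    Similarity score.
for match in response["FaceMatches"]:
    print(match["Face"]["ExternalImageId"], match["Similarity"])
```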

The article discusses some of the potential for abuse of face-recognition technology by law enforcement:

Matt Wood, the general manager of artificial intelligence for Amazon Web Services, noted that it is possible that Rekognition, like other types of information available to law enforcement officials, could be used inappropriately. […] He added that the company has not received any reports of misuse by law enforcement.

In January, however, the A.C.L.U. sent a letter to Amazon asking it to stop selling facial recognition technology to police and government agencies, saying that the company’s attention to civil liberties has lagged behind that of Google and Microsoft.

“Rekognition marketing materials read like a user manual for authoritarian surveillance,” said Nicole Ozer, the technology and civil liberties director for the A.C.L.U. of California, in a statement last year.

But the article doesn’t talk about some other aspects of the situation. For example:

Sousveillance
The article is focused primarily on the possible misuse of systems like these by governments, but it starts out talking about how easily a non-government-associated person can use the system. The fact that these feeds are publicly available makes me think of sousveillance and David Brin’s related Transparent Society ideas; in particular, the article doesn’t mention that such feeds could potentially be used by citizens to observe police activity.
Stalking and harassment
Another thing the article doesn’t mention, also in the nongovernmental-use category: this kind of technology could be a huge and awful aid to stalkers and harassers. :(
Attempts to defeat face recognition
The article also doesn’t address the ongoing struggle between the creators of face-recognition systems and the creators of face-disguising systems. More on that below.

The problems around law-enforcement access to surveillance and face-recognition systems have been pretty widely recognized for years, and people have taken various approaches to try to avoid being recognized by such systems. Here are a few links on that topic.

  • In 2014, Robinson Meyer of the Atlantic tried out the face-camouflage approach recommended by CV Dazzle, and wrote an article about wearing face camouflage in public for a few days. His article mentions, but doesn’t go far enough in discussing, the ways in which his being a well-dressed 20something white man influenced his and other people’s perceptions: he seems very surprised to learn that “looking strange contorts public trust in weird ways,” and doesn’t seem aware that (for example) quite a lot of people who aren’t wearing face camouflage look “strange” to many white people. Even so, I thought the article was an interesting look at face-camouflage issues.
  • CV Dazzle’s website gives some further info.
  • A 2018 article talks about some other related approaches to face camouflage, including scarves. “The scarf is part of Hyphen-Labs’ wider NeuroSpeculative AfroFeminism project, which also includes earrings embedded with cameras and an Afrofuturist ‘neurocosmetology’ salon, all angled toward exploring the roles and representations of black women in technology. When it comes to facial recognition, the scarf is a way to draw attention to the creeping growth of new surveillance techniques in Western society and how these mechanisms are imperfect, built from flawed algorithms that can be fooled, sidestepped, hijacked.”
  • However, a 2017 article notes that with some help, face-recognition technology can recognize faces even when they’re partly covered by some kinds of masks, by looking at the pattern of a set of key points on a given face. I’m guessing that means things like the distance between the eyes; a toy sketch of that idea appears after this list. (Though both this and the previous article mention specially designed glasses frames that I’m assuming give false impressions of such parameters.)
  • Much of my post here focuses on the dangers of accurate face recognition. But one more aspect of all of this, of course, is that a lot of face recognition is very inaccurate, which can lead to (among other problems) police misidentifying people. For more on that, see a 2018 article from the Independent: “Metropolitan Police's facial recognition technology 98% inaccurate.” (As I understand it, that figure means 98% of the matches the system flagged were wrong; a back-of-the-envelope sketch below shows how that can happen even with a seemingly small per-face error rate.)
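
To make my guess about key points a little more concrete, here’s a toy sketch of what a key-point signature might look like: represent a face as the normalized pairwise distances between landmark coordinates. Everything here is illustrative; the coordinates are made up, and a real system would get landmarks from a face-landmark detector and use a far more robust matching scheme.

```python
import numpy as np

def keypoint_signature(landmarks: np.ndarray) -> np.ndarray:
    """landmarks: (n, 2) array of (x, y) facial key points."""
    # All pairwise distances between key points.
    diffs = landmarks[:, None, :] - landmarks[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    pairs = dists[np.triu_indices(len(landmarks), k=1)]
    # Normalize by one reference distance (here, the first pair: eye to eye)
    # so the signature doesn't depend on image scale.
    return pairs / pairs[0]

# Made-up coordinates for left eye, right eye, nose tip, and mouth corners;
# a mask covering the mouth would hide some points but leave others usable.
face = np.array([[30, 40], [70, 40], [50, 60], [38, 80], [62, 80]], dtype=float)
print(keypoint_signature(face))
```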

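And one way to make sense of a number like “98% inaccurate”: when a system scans huge crowds in which almost nobody is on the watch list, even a small per-face false-positive rate means the flagged matches are overwhelmingly wrong. Here’s a back-of-the-envelope illustration; all of these numbers are made up for the example, not taken from the Met’s reports.

```python
# Why flagged matches can be ~98% wrong even when each individual
# comparison is usually right: false positives from the huge number of
# unlisted faces swamp the handful of true positives.
crowd = 100_000              # faces scanned
on_watchlist = 10            # people in the crowd actually on the list
true_positive_rate = 0.90    # chance a listed person is correctly flagged
false_positive_rate = 0.005  # chance an unlisted person is wrongly flagged

true_hits = on_watchlist * true_positive_rate
false_hits = (crowd - on_watchlist) * false_positive_rate

share_wrong = false_hits / (true_hits + false_hits)
print(f"{share_wrong:.0%} of flagged matches are wrong")  # ~98%
```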