Blending in, photo by @turutututuu

A Primer for Facial Recognition Technologies

A summary of the primer by the Algorithmic Justice League

Alex Moltzau

--

I recently read a document that summarised some of the current challenges within facial recognition technology well. It was a short and concise document, yet I thought it might be useful to make an even shorter summary.

Facial Recognition Technologies: A Primer provides a basic introduction to the terminology, applications, and difficulties of evaluating these complex technologies.

The primer is meant to accompany their white paper, Facial Recognition Technologies in the Wild: A Call for a Federal Office.

The Algorithmic Justice League combines art and research to illuminate the social implications and harms of AI.

The Algorithmic Justice League aims to:

  1. highlight algorithmic bias through provocative media and interactive exhibitions
  2. provide space for people to voice concerns and experiences with coded discrimination
  3. develop practices for accountability during the design, development, and deployment phases of coded systems.

So what is facial recognition technology, or FRT for short?

They define Facial Recognition Technologies (FRTs) to be a set of digital tools used to perform tasks on images or videos of human faces.

These tools can be grouped into three broad categories depending upon the question they answer.

  1. Is there a face in the image?
  2. What kind of face is shown in the image?
  3. Whose face is shown in the image?

Face detection. Face detection is the process of detecting the presence of faces and locating those faces in an image or video (see Figure 1 in the primer).
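
This kind of detection can be tried out with OpenCV's bundled Haar cascade detector. The snippet below is a minimal sketch of my own, not taken from the primer, and `photo.jpg` is just a placeholder path.

```python
# A minimal face detection sketch using OpenCV's bundled Haar cascade.
# "photo.jpg" is a placeholder path; any image containing frontal faces will do.
import cv2

# Load the pre-trained frontal-face detector that ships with OpenCV.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

image = cv2.imread("photo.jpg")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Answers the first question: is there a face in the image, and where?
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    print(f"Face found at x={x}, y={y}, width={w}, height={h}")
```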

Software can be developed to assess the attributes of a person from their face.

Face attribute classification: when these attributes have been separated into distinct categories, such as gender, race, or ethnicity, this may be called face attribute classification.

Face attribute estimation: when the attribute is a number, like an age, the term face attribute estimation is more appropriate.

Face attribute detection: software to detect and locate accessories like glasses and scarves or face attributes like beards or moustaches.

Emotion, affect, and facial expression classification. Facial recognition technologies can be used to classify facial expressions, such as “smile,” “frown,” or “scowl.” They can also be used for the closely related problem of inferring the emotional state or affect of a person, such as “happy,” “sad,” or “angry.”

“It is important to keep in mind that many systems that claim to do emotion recognition have really been developed to recognize specific facial expressions (as performed by paid actors), not to detect the subtle cues that may reveal a person’s underlying emotional state.”

The primer distinguishes two subtly different types of recognition (a rough code sketch follows this list):

  1. Face verification: attempts to determine whether an image shows a particular person. For example, software on a cell phone may try to answer the question, “Can it be verified that the camera shows the phone’s owner?” A query image is deemed to be either a match, if it appears to show the owner, or a mismatch otherwise.
  2. Face identification: attempts to answer the question, “Whose face is this?” Face identification software can only match the image of a face to a person for whom it already has some appearance information. The set of people for whom an application has stored appearance information is called the gallery. Simply put, this is the set of people that a face identification system could possibly identify. A typical example of a gallery would be the set of people who work in a secured location, such as a private office building.
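
To make that distinction concrete, here is a small sketch of my own (not the primer's) of how the two questions differ once faces have been reduced to numerical feature vectors. `embed()` is a hypothetical stand-in for a real face-embedding model, and the 0.6 distance threshold is arbitrary.

```python
import numpy as np

# Hypothetical stand-in for a trained face-embedding model: it should map a
# face image to a feature vector. This is not a real implementation.
def embed(face_image) -> np.ndarray:
    raise NotImplementedError("placeholder for a real face-embedding model")

THRESHOLD = 0.6  # arbitrary distance cut-off, chosen only for this sketch

def distance(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.linalg.norm(a - b))

# Face verification (1-to-1): "Does the camera show the phone's owner?"
def verify(query_image, owner_embedding: np.ndarray) -> bool:
    return distance(embed(query_image), owner_embedding) < THRESHOLD

# Face identification (1-to-N): "Whose face is this?", asked against a gallery
# of people for whom appearance information has already been stored.
def identify(query_image, gallery: dict[str, np.ndarray]) -> str | None:
    q = embed(query_image)
    best_name, best_dist = None, float("inf")
    for name, embedding in gallery.items():
        d = distance(q, embedding)
        if d < best_dist:
            best_name, best_dist = name, d
    # Report a match only if the closest gallery face is close enough.
    return best_name if best_dist < THRESHOLD else None
```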

Where is FRT used?

The short report mentions that FRT is already being used in several places:

  • Banks.
  • Consumer Products.
  • Events.
  • Housing.
  • Police Departments.
  • Places of Worship.
  • Schools.
  • Stores.
  • Transportation.
  • Workplaces.

As such, these technologies now see wide application, at least in the US, which calls for some reflection on the components of the technology. They list the following five components as important (a toy code illustration follows this list):

  1. Capture and detection.
  2. Enrolment.
  3. The digital representation of a face.
  4. Comparison.
  5. Matching decision.
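
As a toy illustration of my own (not the primer's), the five components can be strung together roughly as follows. The `embed()` function simply flattens and normalises pixels as a stand-in for a real face-representation model, and the detection step is faked.

```python
import numpy as np

def capture_and_detect(frame: np.ndarray) -> np.ndarray:
    """1. Capture and detection: locate and crop a face in the frame.
    Detection is skipped in this toy; the whole frame is treated as the face."""
    return frame

def embed(face_pixels: np.ndarray) -> np.ndarray:
    """3. Digital representation: reduce a face image to a feature vector.
    Flattening and normalising pixels only stands in for a trained model."""
    v = face_pixels.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-9)

def enrol(gallery: dict, name: str, face: np.ndarray) -> None:
    """2. Enrolment: store a known person's representation in the gallery."""
    gallery[name] = embed(face)

def compare(a: np.ndarray, b: np.ndarray) -> float:
    """4. Comparison: cosine similarity between two representations."""
    return float(np.dot(a, b))

def decide(score: float, threshold: float = 0.9) -> bool:
    """5. Matching decision: accept the match only above a chosen threshold."""
    return score >= threshold

# Toy run with a random "face", just to show how data flows through the steps.
gallery: dict[str, np.ndarray] = {}
rng = np.random.default_rng(0)
enrolled_face = rng.random((32, 32))
enrol(gallery, "alice", enrolled_face)

query = capture_and_detect(enrolled_face + rng.normal(0, 0.05, (32, 32)))
score = compare(embed(query), gallery["alice"])
print("match" if decide(score) else "no match", f"(score={score:.3f})")
```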

So then, what are the possible results? The four outcomes below are tallied in a short code sketch after the list.

  • True positive (or true match). In face verification (a 1-to-1 comparison), a true positive (or true match) occurs if a query image correctly matches a specific identity.
  • True negative (or true mismatch). In addition to verifying and identifying a unique individual, systems should also correctly reject faces that do not match.
  • False positive (or false match). A false positive means the wrong person is deemed to be a match. Depending on the application, the consequences of such an incorrect decision can vary.
  • False negative (or false mismatch). Rejecting the correct person results in a false negative outcome (or false mismatch). For facial verification used for fraud detection, a false negative can mean an individual is denied access to a service or opportunity.
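
As a small sketch of my own (not from the primer), the four outcome types for a 1-to-1 verification system can be tallied from ground-truth labels and system decisions:

```python
from collections import Counter

def tally_outcomes(ground_truth: list[bool], decisions: list[bool]) -> Counter:
    """ground_truth[i]: the query really is the claimed person.
    decisions[i]: the system accepted the claim."""
    counts: Counter = Counter()
    for truth, accepted in zip(ground_truth, decisions):
        if truth and accepted:
            counts["true positive (true match)"] += 1
        elif not truth and not accepted:
            counts["true negative (true mismatch)"] += 1
        elif not truth and accepted:
            counts["false positive (false match)"] += 1
        else:  # truth and not accepted
            counts["false negative (false mismatch)"] += 1
    return counts

# One example of each of the four outcomes.
print(tally_outcomes([True, False, False, True], [True, False, True, False]))
```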

How accurate is FRT?

There are different ways to evaluate this, and the report lists a few.

  • Performance metrics and benchmarks.
  • Real-world performance and benchmark results.

“…seemingly small error rates can still have a negative impact on a substantial number of individuals.”

With a 1 in 500 error rate, for example:

“…a working population of 2 million people, this would result in approximately 4,000 false matches per day.”
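
The back-of-the-envelope arithmetic behind that figure is easy to reproduce:

```python
error_rate = 1 / 500       # one error per 500 comparisons
population = 2_000_000     # a working population of 2 million people

false_matches_per_day = error_rate * population
print(false_matches_per_day)  # 4000.0 -- roughly 4,000 false matches per day
```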

They argue an important question remains:

“…what are effective alternatives to using benchmarks and metrics in order to decide if a specific facial recognition technology is appropriate for deployment for a particular application in a targeted population?”

There is a need for questions that go beyond accuracy and technical considerations.

A wide range of issues needs to be dealt with, such as:

  • Harmful discrimination.
  • Privacy.
  • Consent.
  • Legality.

“In some cases, in certain contexts or for particular applications, the use of FRTs will not be justified regardless of accuracy.”

A sign at a Black Lives Matter protest in Atlanta, photo by @mcoswalt

Oversight or regulation might therefore be considered.

They explore this further in an accompanying white paper called Facial Recognition Technologies in the Wild: A Call for a Federal Office.

I would recommend reading both texts in full; however, I hope this short summary was helpful in sparking your interest.

This is #500daysofAI and you are reading article 397. I am writing one new article about or related to artificial intelligence every day for 500 days.

--


Written by Alex Moltzau

Policy Officer at the European AI Office in the European Commission. This is a personal Blog and not the views of the European Commission.
