These days the face is also a key: security gates at airports and office complexes open automatically when the right one is presented to them. Special algorithms have learned to match faces with one another based on vast numbers of photos. With the spread of Covid-19, however, the booming facial recognition industry had to relearn, because suddenly many people wore protective masks, on which the algorithms initially failed.

Anyone who distrusts government and commercial surveillance welcomed this side effect of the pandemic. Their relief may be short-lived: a study by the US National Institute of Standards and Technology (NIST), part of the US Department of Commerce, shows that many algorithms have caught up and now recognize masked faces much better than at the beginning of the pandemic.


The experts examined 152 algorithms developed by universities such as the University of Coimbra in Portugal, by tech companies such as Taiwan's Acer, and by the Chinese AI company SenseTime. 65 of these algorithms were only submitted to the NIST examiners after mid-March of this year, i.e. after the pandemic had fully broken out. The so-called FRVT (Face Recognition Vendor Test) is considered the gold standard for assessing the accuracy of facial recognition systems; NIST has been running and refining it continuously since 2000.

To find out how well an algorithm works, the examiners had it compare photos from two data sets: one consisting of well-lit portrait photos, the other of webcam photos taken under time pressure, as at border posts or security gates. The latter were labeled "WILD" in the system.

Then either only the "wild" webcam photo or both images were covered with Covid-19 masks – at least virtually. To keep the effort manageable, the masks were added digitally. The authors acknowledge that this reduces the validity of their study: they could not determine how masks affect recognition in realistic situations – for example, when they sit askew on the face – and their simulated monochrome masks had no patterns. In total, the testers used 6.2 million photos.
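The digital masking step can be illustrated with a small sketch. This is not NIST's actual tool – the covered region and the pixel representation are invented for illustration; in the real study, the mask's shape, size and color were varied and placed based on facial landmarks:

```python
def add_virtual_mask(pixels, mask_value=7):
    """Overlay a monochrome 'mask' over the lower part of a face image,
    here represented simply as a list of rows of pixel values.
    A real system would place the mask along detected facial landmarks;
    a fixed rectangle over the nose-to-chin area stands in for that."""
    h, w = len(pixels), len(pixels[0])
    masked = [row[:] for row in pixels]  # leave the original photo untouched
    for y in range(int(0.55 * h), int(0.9 * h)):  # lower part of the face
        for x in range(int(0.2 * w), int(0.8 * w)):
            masked[y][x] = mask_value
    return masked

# Tiny stand-in for a 10x10 portrait photo (all background pixels are 0):
portrait = [[0] * 10 for _ in range(10)]
masked_portrait = add_virtual_mask(portrait)
```

The point of masking digitally rather than photographing real mask wearers is scale: the same 6.2 million photos can be reused with many mask variants, at the cost of realism.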


The eyes come into focus

Despite these limitations, the result shows rapid progress in recognizing masked faces: the algorithms submitted since mid-March incorrectly rejected about a quarter fewer photos than their predecessors from before the pandemic. Despite masks, they recognized far more often that two faces belong to the same person. In the NIST study published at the end of July, error rates ran up to 50 percent; now they reach at most 40 percent. Above all, the so-called false rejection rate – the rate at which the right person is not recognized – fell by a factor of ten for some algorithms. This suggests that developers have worked specifically on identifying mask wearers.

However, the NIST experts concede that they cannot be sure the algorithms were actually developed only after the outbreak of the pandemic, i.e. whether they are specifically geared to recognizing mask wearers. According to NIST, the developers knew that their algorithms would also be tested on masked faces. Apparently they trained their algorithms more heavily on the eye region. "The results of the latest algorithms show that if you focus on the periocular area, the recognition performance can be increased again," says Christoph Busch, a computer science professor at the Darmstadt University of Applied Sciences who advises the German Federal Office for Information Security on biometrics.

Even if the algorithms have learned something new, that does not mean they recognize masked people as reliably as unmasked people. The false rejection rate for masked faces is still significantly higher overall than for unmasked ones – 2.4 to 5 percent instead of 0.3 to 0.5 percent. The NIST experts put it this way: the recognition of masked faces is at the level that the recognition of unmasked faces had reached in 2017.
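The false rejection rate that NIST reports can be illustrated with a short calculation. This is a minimal sketch, not NIST's evaluation code; the similarity scores and the acceptance threshold below are invented, only the metric itself mirrors what the study measures:

```python
def false_rejection_rate(genuine_scores, threshold):
    """Fraction of same-person comparisons wrongly rejected, i.e. genuine
    pairs whose similarity score falls below the match threshold."""
    rejected = sum(1 for s in genuine_scores if s < threshold)
    return rejected / len(genuine_scores)

# Hypothetical similarity scores (0..1) for same-person photo pairs.
# Masks tend to lower the scores because less of the face is visible:
unmasked_pairs = [0.91, 0.88, 0.95, 0.90, 0.86, 0.93, 0.89, 0.92, 0.87, 0.94]
masked_pairs   = [0.78, 0.85, 0.66, 0.88, 0.71, 0.90, 0.69, 0.83, 0.74, 0.86]

threshold = 0.80  # accept a pair as "same person" only above this score

print(false_rejection_rate(unmasked_pairs, threshold))  # → 0.0
print(false_rejection_rate(masked_pairs, threshold))    # → 0.5
```

Lowering the threshold would reduce false rejections but raise the opposite risk, false matches – the trade-off behind the smartphone-unlocking problem described below.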


False matches – cases in which the wrong person was accepted – were particularly frequent when both the "wild" image and the well-lit image were virtually masked. This means that anyone who unlocks their smartphone via face recognition and enrolls a masked image makes the device more vulnerable to being unlocked by an unauthorized person: with a mask, that person has a better chance of outsmarting the system.

Building on the NIST findings, researchers should now investigate how the accuracy of the systems could be increased further, writes Isabelle Moeller of the Biometrics Institute, an association of authorities and companies that work with facial recognition. Higher image resolution, 3D recognition technology or infrared imaging could help, she says.

Black and red masks confuse the technology more than white and blue ones

Government agencies use facial recognition at borders and airports, in search databases and when processing visa applications. Opponents of the technology see privacy at risk; they warn that government and commercial surveillance may eventually be able to track people anywhere. Current systems have also been criticized for frequently triggering false alarms – particularly often for members of ethnic minorities, who in many countries are already disproportionately affected by police checks. One reason is that many algorithms are trained on data sets consisting mainly of photos of white people.


After reading the NIST study, Kurt Opsahl of the civil rights organization Electronic Frontier Foundation offered a tip to those who fear surveillance: the larger the mask, the higher the algorithm's error rate. And red and black masks provoked more recognition errors than white or blue ones. The scientists do not yet know why – another problem for the algorithm developers to tackle.