By Elif Hilal Umucu.
Why Is Our Face So Important?
In daily life, our faces appear on ID cards, driving licenses, school ID photos, security-camera footage and many other databases, where facial recognition technologies use them to identify people in real life or in security videos and photos. The technology relies on our faces, which are biometric data [Dufaux & Ebrahimi, 2006]. Unlike a password, your face is unique to you: a password can be stored and reset if necessary, but your face cannot. If your biometric data is compromised, there is no way to delete or replace it.
Your face also differs from other forms of biometric data such as fingerprints: when this type of technology is used in public places, facial surveillance is almost impossible to avoid. Unlike your fingerprint, your face can be tracked and analyzed without your knowledge. Your face can also be the key to rights protected under international law, such as your education or the freedom to practice your religion. For these reasons, facial recognition is highly intrusive and can violate privacy and personal data protection rights, along with many others.
Face recognition systems are generally designed to do one of three things. The first is to identify an unknown person: for example, trying to match a face captured on a surveillance camera, as a police officer or an airport security watch might do.
The second is authentication: for example, the face recognition used to unlock smartphones and computers. The third is to search for specific, predefined faces.
Such a system can be used, for example, to spot card counters in a casino or particular customers in a shopping mall.
Facial recognition technology dramatically reduces the time required to identify people or objects in photos and videos.
Researchers have highlighted the troubling assumptions that underpin much of the current enthusiasm for facial recognition, especially when it is used to categorize emotions or attributes based on individuals' facial movements or proportions. Face recognition is also among the more reliable biometrics, with one notable exception: a system may be unable to distinguish close lookalikes such as identical twins.
Face Recognition Systems
A face recognition system analyzes the distance between our two eyes, the jaw line, the width of the nose, the cheekbones, the shape of the mouth and other features, and uses these measurements for identification and authentication.
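As a rough sketch of this kind of geometric analysis, the snippet below builds a small feature vector from landmark coordinates, normalizing each distance by the gap between the eyes so the features do not depend on image scale. The landmark positions and the choice of measurements here are illustrative assumptions, not those of any real system:

```python
import math

# Hypothetical 2-D landmark coordinates (in pixels) for one detected face.
# Real systems obtain these from a landmark detector; these values are made up.
landmarks = {
    "left_eye":  (120.0, 150.0),
    "right_eye": (200.0, 150.0),
    "nose_tip":  (160.0, 200.0),
    "mouth":     (160.0, 250.0),
    "chin":      (160.0, 310.0),
}

def dist(a, b):
    """Euclidean distance between two landmark points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def geometric_features(lm):
    """Build a small vector of facial measurements, normalized by the
    inter-ocular distance so the result is independent of image scale."""
    eye_dist = dist(lm["left_eye"], lm["right_eye"])
    return [
        1.0,                                            # inter-ocular reference
        dist(lm["nose_tip"], lm["mouth"]) / eye_dist,   # nose-to-mouth
        dist(lm["mouth"], lm["chin"]) / eye_dist,       # mouth-to-chin
        dist(lm["left_eye"], lm["nose_tip"]) / eye_dist # eye-to-nose
    ]

features = geometric_features(landmarks)
print(features)
```

Two images of the same face should yield similar vectors even at different resolutions, which is the point of the normalization step.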
These systems can automatically identify or verify a person from a digital image or video frame by comparing selected facial features with an image stored in an electronic document or a face database ("Biometrics in identity", 2019). According to research by Gartner, the number of missing persons is expected to fall by 80% by 2023 compared to 2018 as the use of face recognition increases ("Biometrics in identity", 2019).
No technology is 100% accurate or efficient, and face recognition is no exception: it can make false matches and produce unwanted results. A recent incident in New York illustrates this. The identity of a man named Ousmane Bah was used when goods worth $1,200 were stolen from an Apple Store in Boston; the real thief presented Ousmane's credentials, and Apple's facial recognition system identified Ousmane as the culprit. Once the facts came to light, Ousmane sued Apple for $1 billion.
Biometric Data Law GDPR, KVKK and Texas Biometric Law
Biometric data captured by facial recognition systems falls into the special categories of personal data under the EU General Data Protection Regulation (GDPR), a category requiring especially strong protection. Under the GDPR, biometric data may be processed with the explicit consent of the data subject; in addition, access to biometric data may be permitted on grounds of public interest.
Although face recognition is not explicitly regulated by the GDPR, it is one of the most widely used methods of processing personal data, and any organization deploying it must evaluate it very carefully to be fully in line with the GDPR's provisions [Cliza, Olanescu and Olanescu, 2018]. Recommendations for GDPR-compliant video surveillance include:
Cameras should be used judiciously: they should target only specifically identified security problems and minimize the collection of irrelevant images.
People should be informed of their rights of access to images and of how surveillance data involving facial recognition is processed.
CCTV and video recordings that no longer serve a purpose should be deleted; information should be kept for the minimum necessary period and erased once it is no longer needed.
Policies and rules for legitimate monitoring should be easily accessible.
Employers should pay attention to the principle of data minimization when using new technologies.
The international transfer of employees' personal and biometric data should only take place if there is an adequate guarantee of protection.
The obligation to appoint a data protection officer should be observed (applicable only to professionals or public institutions processing personal and biometric data).
Up-to-date technology should be used for surveillance and video links, and the software should be kept securely updated.
A privacy impact assessment (PIA) should be published stating that all CCTV cameras and facial recognition serve a legitimate purpose.
For the secure storage of CCTV systems and biometric data records, the number of authorized personnel should be limited.
Under the Personal Data Protection Law in Turkey, it is prohibited to process special categories of personal data without the explicit consent of the data subject. The exception is a provision in another law: if other legislation permits the processing (for example, to identify a criminal or to maintain public order), the data may be processed. The purpose of the law is to prevent unauthorized access to biometric and personal data.
The Texas biometric act is one of the most comprehensive laws defining biometric data. It prohibits the disclosure of biometric data, subject to four exceptions:
Identification of missing or dead person
Completion of financial transaction
Situations permitted by state law
At the request of a law-enforcement agency in connection with an arrest (Krishan et al., 2018).
There is also a Washington biometric law, but it is less comprehensive.
“Would you be happy if your facial data is being used by a company or government without your consent? For example, Illinois biometric privacy law makes it illegal to take someone’s photo without consent.” - Prof. Anil Jain, Michigan State University.
How Do Face Recognition Systems Work?
Programs like DeepFake have shown how important face recognition algorithms are. A face recognition system (FRS) can typically be used for authentication, identification and tracking. Authentication, or verification, is actually the simplest task for an FRS: a person who already has a relationship with an institution (i.e. is enrolled in the gallery) submits their biometric features to the system and claims a particular identity in the reference database, or gallery.
The first step in the face recognition process is to capture the face image, also known as the probe image, using a photo or video camera; in principle, this can be done with existing good-quality "passive" CCTV systems. The system then tries to match the probe image against the templates in the reference database. There are two possible outcomes: (1) the person is not recognized, or (2) the person is recognized.
If the person is not recognized, i.e. the identity claim is not verified, either the person is fraudulent (making a false identity claim) or the system has made a mistake (a false rejection). The system can also err by accepting a claim that is actually false (a false acceptance). Finding a face in a video stream is itself an important step, as we show below. The efficiency of the entire system depends heavily on the quality and characteristics of the captured face image, so the process begins with face detection: extracting the face from a larger image that often contains other human faces.
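The verification step described above can be sketched as a simple threshold decision, assuming faces have already been converted to small numeric embeddings. The embedding values and the 0.6 threshold below are made up for illustration:

```python
import math

def embedding_distance(a, b):
    """Euclidean distance between two face embeddings (feature vectors)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, template, threshold=0.6):
    """1:1 verification: accept the identity claim only if the probe
    embedding is close enough to the enrolled template.
    The threshold is an illustrative value, not a standard."""
    return embedding_distance(probe, template) <= threshold

# Made-up embeddings for illustration (real ones are e.g. 128-dimensional).
enrolled = [0.10, 0.80, 0.30]
genuine  = [0.12, 0.78, 0.31]   # same person, slightly different capture
impostor = [0.90, 0.10, 0.70]   # a different person claiming the identity

print(verify(genuine, enrolled))   # genuine claim is accepted
print(verify(impostor, enrolled))  # impostor claim is rejected
```

The threshold embodies the trade-off mentioned above: lowering it reduces false acceptances but increases false rejections, and vice versa.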
Face Recognition Accuracy
The graph below shows recognition accuracy in identifying known and unknown persons, using an "unknown" class and a confidence threshold in the classifier.
Currently, OpenFace achieves an accuracy of 0.9292 ± 0.0134 on the LFW benchmark. Benchmark tests are a good way to compare the accuracy of different techniques and algorithms. Face images from both the LFW and FEI databases were used to test recognition accuracy for unknown persons and to build the "unknown" class in the classifier.
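The "unknown" class mentioned above can be approximated with a simple open-set rule: return the nearest enrolled identity only when it is close enough, and otherwise answer "unknown". The gallery names, embeddings and threshold below are hypothetical:

```python
import math

# Hypothetical gallery of enrolled identities with made-up embeddings.
gallery = {
    "alice": [0.1, 0.8, 0.3],
    "bob":   [0.9, 0.2, 0.5],
}

def dist(a, b):
    """Euclidean distance between two embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(probe, threshold=0.5):
    """Open-set identification: return the closest gallery identity,
    or 'unknown' when even the best match is too far away.
    The threshold value is illustrative."""
    name, best = min(gallery.items(), key=lambda kv: dist(probe, kv[1]))
    return name if dist(probe, best) <= threshold else "unknown"

print(identify([0.12, 0.79, 0.28]))  # close to alice's template
print(identify([0.5, 0.5, 0.9]))     # far from everyone -> "unknown"
```

The accuracy figures reported on benchmarks like LFW depend heavily on where such a threshold is set, since it trades misidentifications against faces wrongly labeled unknown.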
What Are Possible Privacy Issues?
In investigating possible threats to privacy, the following questions are important:
Which faces are recorded by the cameras used to maintain order in society?
Are countries' data policies regarding face recognition systems carefully determined and clearly stated?
Are people aware that they are being filmed for identification or criminal investigation? Have they consented, and how?
Are the policies for access to all information collected by the system carefully determined and clearly stated?
Where is the information stored? How many people can access this data?
From whom will people learn about face recognition systems?
“Scientific and technological progress might change people’s capabilities or incentives in ways that would destabilize civilization.” - Professor Nick Bostrom, Future of Humanity Institute, University of Oxford
Nick Bostrom explored this in a fascinating 2018 paper titled "The Vulnerable World Hypothesis".
Worldwide FRS Studies and Facial Recognition Gate
The first facial recognition court case in the European Union was heard by the Divisional Court in Cardiff. South Wales Police became the first force in the UK to use facial recognition technology at sporting events: the system was deployed at the UEFA Champions League final, which drew crowds of 310,000 people.
Later, in France, the Marseille Administrative Court also ruled on face recognition systems, in a case concerning their use in French high schools, where students' images had been recorded without their consent. The court found that the GDPR had been violated and stated that different methods should be used for the students' benefit.
The European Commission drafted a White Paper on artificial intelligence, but it was leaked before its official release. In the leaked draft, the Commission said it would develop a system to prevent the abuse of face recognition systems and considered banning the technology for five years.
As another example, facial recognition systems have been used at the Temple of Heaven in Beijing to ration toilet paper; while saving money on paper, the government was also collecting biometric data ("Biometrics in identity", 2019).
For example, facial recognition technology was used at the G20 summit in July 2017, where the Hamburg Police in Germany deployed it to identify people and possible criminal activity. The Hamburg Data Protection Commissioner (Hamburgische Beauftragte für Datenschutz und Informationsfreiheit) later published a report on the facial recognition technologies used at the summit, finding that their use did not comply with data protection law and that no valid reasons for deploying them had been shown. Legal proceedings were therefore initiated.
As another example, in 2017 and 2018 the Berlin police conducted a large trial of live facial recognition technology at a train station. The aim was to test technical performance and identify shortcomings: 300 volunteers took part, and the software tried to detect the enrolled volunteers as they passed through the station.
Finally, since February 2020 the Metropolitan Police has been using facial recognition systems in areas such as Oxford Circus to try to catch criminals, announcing that the system had correctly identified the people it flagged.
In Japan, the Ministry of Justice began verification tests of facial recognition at Narita International Airport and Tokyo International Airport in 2014. At the same time, a system called the "Face Recognition Gate", made by Panasonic, came to the fore.
The main goal of the Face Recognition Gate was to scale up border-control measures that fingerprint recognition systems had previously provided. The gates use a facial recognition system to verify travellers' identities: each person's face image is recorded in advance on the IC chip in their passport, and the gate then matches the live image against it.
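The gate's matching step can be sketched as a 1:1 comparison, assuming both the chip image and the live capture have been reduced to embeddings. The cosine-similarity measure, the embeddings and the threshold here are illustrative assumptions, not Panasonic's actual method:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face embeddings (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def gate_decision(chip_embedding, live_embedding, threshold=0.9):
    """Open the gate only if the live capture is similar enough to the
    face image stored on the passport's IC chip.
    The threshold is an illustrative value."""
    return cosine_similarity(chip_embedding, live_embedding) >= threshold

# Made-up embeddings for illustration.
chip       = [0.2, 0.7, 0.4]    # template stored on the passport chip
live_same  = [0.22, 0.69, 0.41] # same traveller, captured at the gate
live_other = [0.9, 0.1, 0.2]    # a different person

print(gate_decision(chip, live_same))   # gate opens
print(gate_decision(chip, live_other))  # gate stays closed
```

As Tamura notes below, the hard part is that the chip template may be up to ten years old, so the threshold must tolerate ageing while still rejecting lookalikes.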
Panasonic's facial recognition doors were designed long ago: technical work began in the 1990s, and the system was also used in the LUMIX digital camera series. The technology is currently used in international airports and large shopping malls. For example:
Three Face Recognition Gates were introduced for the arrival procedures of Japanese nationals at Tokyo International Airport (Haneda Airport).
Hajime Tamura, who was responsible for the development of facial recognition technology at Panasonic, said: "At a glance, it may seem easy to recognize a face by comparing two face images, but it is harder than it looks. In Japan, a passport can be used for up to 10 years, which means there can be an age difference of up to 10 years between the face image recorded on the passport's IC chip and the person's current face. Within a decade, the appearance of someone's face often changes dramatically. Naturally, the Face Recognition Gate has to deal with changes in hairstyle and glasses, but also with things like cosmetics and wrinkles; despite these differences, the system must still verify the person with certainty. On the other hand, when there is another person with an extremely similar appearance, the system has to separate the two and confirm that they are a different person. We worked hard to resolve these issues, and by combining our past knowledge with new technologies we succeeded in building a high-performance Face Recognition Engine with high accuracy." (Hajime Tamura)
Conclusion
As the use of technology and artificial intelligence becomes widespread, the use and importance of our biometric data increases. Discussions continue in the international arena, and laws are being rewritten in light of these developments. The technology is useful, for example, for finding people wanted in connection with terrorism, sex offenders, fugitives with long sentences, or missing children. As it spreads at international borders and customs, new laws and regulations will be needed to protect personal and biometric data more securely in the future.
While the amount of data shared on digital and social media grows day by day, the level of awareness of this issue should grow with it.
References
S. H. Lin, "An Introduction to Face Recognition Technology", Informing Science Special Issue on Multimedia Informing Technologies, 3:1, (2000).
R. Rathi, M. Choudhary & B. Chandra, “An Application of Face Recognition System using Image Processing and Neural Networks”, International Journal Computer Technology Application, 3:1, (2012), pp. 45-49.
R. A. Hamid & J. A. Thom, "Criteria that have an effect on users while making image relevance judgements", in Proceedings of the Fifteenth Australasian Document Computing Symposium, (2010), pp. 76-83.
M. H. Yang, D. J. Kriegman & N. Ahuja, “Detecting Faces in Images: A Survey”, IEEE Transaction on Pattern Analysis & Machine Intelligence, 24:1, (2002), pp. 34-58.
P. M. Corcoran & C. Iancu, "Automatic Face Recognition System for Hidden Markov Model Techniques", New Approaches to Characterization and Recognition of Faces, (2011).
M. Rouse, "Biometrics Definition", (2013). Retrieved November 23, 2016 from http://searchsecurity.techtarget.com/definitions/biometrics.
L. Sirovich & M. Kirby, "Low-dimensional procedure for the characterization of human faces", Journal of the Optical Society of America A, 4, (1987), pp. 519-524.
M. A. Turk & A. P. Pentland, "Face Recognition Using Eigenfaces", MIT Vision and Modeling Lab.
M. F. Smith, Software Prototyping: Adoption, Practise and Management (McGraw-Hill, London, 1991).
Personal Data Protection Law No. 6698, dated 24.03.2016, published in the Official Gazette No. 29677.
About the Author
Elif Hilal Umucu is Co-Founder at Ankara Legal Hackers and works as a legal analyst and legal advisor in the Presidential Digital Transformation Office (Turkey).
She is a 21-year-old technology lawyer who writes articles on cryptology, cyber threats, software and internet ethics.
Elif Hilal Umucu, who has been coding since her first year of law school, continues to develop mobile and web applications and also provides consultancy as an IT lawyer. She is currently working on face recognition algorithms and biometric data.