Facial recognition: who owns the data of our faces?
Who does your face belong to? A silly question, of course … right?
But what about the data generated from your face? And what does it mean for your face to become data?
There is already a lot of data on millions of faces. We have featured our faces on social media and on photos stored in the cloud.
But we have not yet determined who owns the data associated with the contours of our faces.
In the era of big tech, we have to reflect on what expectations we can and should have about who has access to our faces.
The recent riot at the United States Capitol has put the issue in the spotlight, as facial recognition has become a key tool for identifying those who took part:
What is the power of facial recognition technology? Are we ready to take it on?
Even before the riot, facial recognition technology was being used in many ways that we have probably not considered seriously enough, and many of us have voluntarily contributed data about our faces, whether explicitly or implicitly.
Facial recognition technology, for example, is very present in public spaces.
When deciding to use this technology for law enforcement, surveillance or other initiatives with clear social purposes, we must stop and ask ourselves: what are the costs of surrendering our faces to data?
The consequences are serious, including for the right to privacy and our ability to live unsupervised.
Tracking our movements
In Belgrade, according to reports and a video by the NGO SHARE Foundation made in support of its #hiljadekamera ("thousands of cameras") initiative, high-definition cameras will be deployed for various surveillance functions.
SHARE's director, Danilo Krivokapić, maintains that the cameras' facial recognition technology will track individuals' movements as they move through the Serbian capital.
Photos already in the system are compared with the data captured by the cameras and then analyzed by an artificial intelligence system.
This technology opens up the possibility of tracking a person's movements in real time as they move around Belgrade. And it is not the only place where this is happening.
Governments and surveillance go hand in hand, and facial recognition technology gives them more options and ways to track and restrict the movement of people within their borders.
The city of London decided last year to deploy cameras with facial recognition capabilities alongside its 627,727 CCTV cameras. The move sparked protests.
Companies also use it
And not only governments want your face.
Last year, Cadillac Fairview, one of North America's largest commercial real estate companies, was called out by the Office of the Privacy Commissioner of Canada for installing discreet cameras in 12 of its shopping centers, including Toronto's iconic Eaton Centre.
These cameras captured five million images of customers and ran facial recognition software that generated further data, including gender and age.
Although the images were deleted, the data generated from them was kept on a third-party server.
In response to the privacy commissioner's report, New Democratic Party MP Charlie Angus stated: "We have the right to be able to go to public places without being photographed, tracked and put into data tracking machines, whether for companies or for the police and the government."
Unfortunately, Angus is wrong: there is no such right.
And since Cadillac Fairview kept not the photos but the data generated from the faces in them, the issue is consent, not a violation of the right to privacy.
What rights do we have when we offer our faces up for datafication?
Journalist Rebecca Heilweil documents the many ways we introduce facial recognition technology into our lives.
Many are familiar with Facebook's photo-tagging technology, which tags not just your face but those of other people in your photos.
This technology is also present in the Google and Apple photo apps.
But this type of facial recognition technology is spreading to other areas.
For example, the automaker Subaru uses it to detect driver distraction behind the wheel.
Apple's HomeKit offers features that combine data collected from various devices and use facial recognition to tell you if a friend, recognized from your photos, is at the door.
Google's Nest Hub Max uses facial recognition technology to literally look for you, in the same way that it is always listening for the words "OK, Google."
And HireVue uses artificial intelligence to evaluate video of prospective employees and judge their suitability and likelihood of success.
A fundamental part of who we are
The human face is one of the most basic things young children recognize and learn as their brains make sense of the world.
It is a fundamental part of who we are as a species; its importance can hardly be put into words.
Is the data associated with that face – that is, the digital representation based on your real face or your photos – part of that fundamental essence of you that you want to safeguard for yourself?
Or is that a naive hope in our world of data?
Which brings us back to the US Capitol insurrection.
It certainly seems just to use facial recognition technology to bring white supremacists to justice. But at what cost?
We know about the biases in existing data against people of color, women, and people with low incomes.
We are aware of police using this biased data in the name of algorithmic surveillance, which has led to the harassment of marginalized communities and the wrongful arrests of Black people.
The stakes are high, not only for law enforcement, but for our right to privacy as individuals.
Our expectations about data collection and privacy are out of step with how data, facial or otherwise, is actually collected and stored.
That is why it is important to consider our rights in their proper context.
Our personal data has been, and is being, collected every day at an astonishing rate.
This is causing a fundamental change not only in economic and ethical terms, but in the way we live as human beings.
Our understanding of human rights and the corresponding laws to protect them need to be reset to take into account the changes that are taking place in the way our data is collected.
* Wendy H. Wong is Professor of Political Science at the University of Toronto
This note originally appeared on The Conversation and is published here under a Creative Commons license.
Click here to read the original version.
Source: https://www.bbc.com/mundo/noticias-55771691