Scientists create an online game that challenges you to fool artificial intelligence with your face
Researchers want to make the public aware of the risks of AI-based emotion recognition, a technology that large corporations are already using.
Emotion recognition through artificial intelligence is not something you have seen only in science fiction movies; it is present in reality, in many places where you may never have noticed it, and researchers want to expose the risks of a technology that could destroy people’s privacy.
Certain organizations are betting on an emotion recognition industry, with potential uses in market research, road safety, and even education or the workplace. The general public, however, is uneasy that a camera can not only recognize you by your face but also claim to know how you feel.
As reported in The Guardian, researchers from the University of Cambridge have created a website, emojify.info, where anyone can test an emotion recognition system through their own webcam. It is very simple to use: you accept that the web page can access your camera, and then you make a series of facial expressions to see whether the system recognizes them.
“The idea with this website is to make the public aware of the danger of this recognition technology and to promote conversations on social networks. It is a form of facial recognition, but it goes further because, instead of just identifying people, it tries to read our emotions, our internal feelings, from our faces,” said Dr. Alexa Hagerty, project leader and researcher at the University of Cambridge.
Hagerty says the website aims to make users aware that these emotion recognition systems are more common than we think, and that they have already been deployed in situations ranging from job hiring to airport security and even education, where they can end up being discriminatory.
The website itself states that no personal data is collected and that all images are stored on the user’s own device.
One of the most criticized aspects of this emotion recognition technology is that it is easy to deceive: a user could pretend to cry or laugh, and the machine would classify them as sad or happy when in reality they are neither.
“The use of emotion recognition technologies is deeply concerning, as these systems are not only based on discriminatory and discredited science, but their use is also fundamentally incompatible with human rights,” Hagerty said.