New AI privacy tool uses cloaking to prevent facial recognition in photos.
A new AI privacy tool has been introduced in an effort to combat the ill effects of rapidly developing facial recognition software. Imagine knowing that a stranger can snap a photo of you and identify you within seconds. Worse still, imagine being misidentified by facial recognition software and having to deal with the consequences of a crime you never committed. It has long been recognised that facial recognition software could pose a serious threat to the privacy and freedom of individuals. It is quite troubling that the photos we share are being collected by companies to train algorithms, some of which are even sold commercially. This training accelerates the development of facial recognition software, but it also speeds up the rate at which we ourselves are put at risk by that very software.
The dangers of facial recognition can be far reaching and solutions are becoming more and more necessary.
According to a Forbes article published last year, protesters in Hong Kong were being identified and targeted using facial recognition towers and cameras. The protesters used various techniques, including lasers, gas masks and eyewear, to throw off facial recognition cameras and avoid possible detainment for protesting. Another major concern is that facial recognition systems are trained mainly on images of white men, which means that people of colour, and particularly women of colour, are misidentified at an alarming rate. Even sharing photos in this day and age comes with a certain level of risk. For example, Rekognition, Amazon's facial recognition technology, creates profiles of us based on photos from our online profiles, our shopping history and information from Amazon applications. The legislation introduced thus far, and all the advice floating around on how to protect our privacy and identities, can only do so much. At some point it became apparent that more could be done to protect the identity of individuals using creative tools.
This new AI privacy tool could help individuals avoid the dangers of facial recognition.
While legislation is being introduced to combat the detrimental effects of facial recognition software, the question remains whether the law can keep up with the development and use of this technology. Seemingly it cannot, but new technology is being created that could well help protect the privacy and identity of individuals. One such solution is a tool called Fawkes, named after the Guy Fawkes masks donned by revolutionaries in the V for Vendetta comic book and film. It was created by researchers at the University of Chicago's SAND Lab. The tool uses artificial intelligence to subtly, almost imperceptibly, alter your photos in order to trick facial recognition systems. It is said to be 100-percent effective against state-of-the-art facial recognition systems such as Microsoft's Azure Face, Amazon's Rekognition and Megvii's Face++. While the tool is very helpful in protecting the identity of individuals, it can only protect those who choose to use it, and only from the point at which they start using it. What this means is that images, and the corresponding information, already collected by facial recognition companies cannot be retroactively altered to protect one's identity. There is also the view that without widespread adoption, a technology like this would have little to no impact.
How does this new AI privacy tool work?
The cloaking technology allows you to post selfies without the fear of companies using them to identify you, or to train their algorithms to do so. The Fawkes tool takes a couple of minutes to process a photo, making changes that are imperceptible to the human eye but that cause facial recognition software to mistake you for someone else. Ben Zhao, a professor of computer science at the University of Chicago who helped create the Fawkes tool, said: “What we are doing is using the cloaked photo in essence like a Trojan Horse, to corrupt unauthorized models to learn the wrong thing about what makes you look like you and not someone else.” This cloaking is intended to corrupt the databases that facial recognition systems need in order to function, which include hordes of photos scraped from social media platforms.
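As a rough illustration of the idea described above, the sketch below treats cloaking as a small, bounded perturbation: the image is nudged so that its "feature embedding" drifts toward a different identity, while every pixel stays within an imperceptible budget. This is not the actual Fawkes algorithm, which computes perturbations against deep neural feature extractors; the linear extractor, the tiny four-pixel "image" and all parameter values here are hypothetical stand-ins.

```python
# Toy sketch of photo "cloaking": add a small, bounded perturbation so the
# image's feature embedding drifts toward a different identity. Illustration
# of the general idea only, not the real Fawkes algorithm; the extractor,
# the 4-pixel "image" and all numbers are hypothetical.

WEIGHTS = [0.3, -0.2, 0.5, 0.1]  # stand-in linear "feature extractor"

def embed(pixels):
    """Map an image to a single-number identity embedding."""
    return sum(w * p for w, p in zip(WEIGHTS, pixels))

def cloak(pixels, target_embedding, epsilon=2.0, steps=50, lr=0.1):
    """Gradient descent on (embed(x) - target)^2, clamping each pixel to
    within +/-epsilon of the original so the change stays imperceptible."""
    original = list(pixels)
    cloaked = list(pixels)
    for _ in range(steps):
        err = embed(cloaked) - target_embedding
        for i, w in enumerate(WEIGHTS):
            cloaked[i] -= lr * 2 * err * w  # step toward the target identity
            lo, hi = original[i] - epsilon, original[i] + epsilon
            cloaked[i] = max(lo, min(hi, cloaked[i]))  # perturbation budget
    return cloaked

photo = [10.0, 20.0, 30.0, 40.0]                 # tiny stand-in "image"
someone_else = embed([12.0, 18.0, 33.0, 39.0])   # another face's embedding
protected = cloak(photo, someone_else)

# The pixels barely move, but the embedding shifts toward the other identity.
print(max(abs(a - b) for a, b in zip(photo, protected)))
print(abs(embed(photo) - someone_else), abs(embed(protected) - someone_else))
```

A model trained on the cloaked version would then associate your name with features closer to someone else's face, which is the "Trojan Horse" effect Zhao describes.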
While the law has a part to play in curtailing the ill effects of facial recognition, this software tool is also expected to slow the development of facial recognition software by reducing the number of uncorrupted photos available for training algorithms. The tool will also help individuals who are looking to protect their identity going forward, especially as, in some cases, there is no way of knowing when and how photos are being used by companies.