DeepFakes: A Complete Threat


Rana Ayyub is an Indian journalist whose work has exposed government corruption and human rights violations. Over the years she had grown used to trolls and to controversy around her work, but none of it prepared her for what she faced in April 2018. She was sitting in a cafe when she saw it for the first time: a two-minute, twenty-second video of her engaged in a sex act. She could not believe her eyes, because she had never made a sex video, but unfortunately thousands upon thousands of people believed it was her. Ayyub has said that sex is so often used to demean and shame women, especially minority women who dare to challenge powerful men, as she had done in her work. The fake video went viral within 48 hours. All of her online accounts were flooded with screenshots of it, with graphic rape and death threats, and with a torrent of abusive messages. Another online post suggested that she was available for sex, and her home address and cell phone number were spread across the internet. The video was shared more than 40,000 times.

When someone is targeted with this kind of cyber-mob attack, the harm is profound. Rana Ayyub's life was turned upside down. For weeks she could hardly eat or sleep. She stopped writing and closed all of her social media accounts, a very hard thing to do as a journalist, and she was afraid to leave her home. What Rana Ayyub faced was a DEEPFAKE: machine-learning technology that manipulates or fabricates audio and video recordings to show people doing and saying things they never did or said. A deepfake model even maps the lip movements and facial expressions people make while speaking a particular word or letter, and the more input data you feed the algorithm, the more accurate and realistic the result. You can view a deepfake video of President Barack Obama at https://www.youtube.com/watch?v=AmUC4m6w1wo . Deepfakes appear authentic and realistic, but they are entirely fabricated. The technology is still developing, and it is widely available. The most recent wave of attention to deepfakes arose in pornography.
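To make the idea concrete, the classic deepfake face-swap uses one shared encoder (which learns pose and expression from many images of both people) and a separate decoder per person. Swapping means encoding person A's expression and decoding it through person B's decoder. The sketch below is only an illustration of that data flow: the weights are random stand-ins for trained networks, and the sizes and function names are hypothetical, not taken from any real deepfake tool.

```python
import random

random.seed(0)
PIXELS, LATENT = 16, 4  # toy sizes; real systems work on full face images

def matrix(rows, cols):
    # random weights standing in for a trained network's parameters
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

def apply(weights, vec):
    # a plain matrix-vector product: one linear "layer"
    return [sum(w * x for w, x in zip(col, vec)) for col in zip(*weights)]

encoder   = matrix(PIXELS, LATENT)  # shared: captures pose and expression
decoder_a = matrix(LATENT, PIXELS)  # trained to rebuild person A's face
decoder_b = matrix(LATENT, PIXELS)  # trained to rebuild person B's face

def face_swap(face_of_a):
    # the deepfake trick: encode A's expression, then decode with B's
    # decoder, yielding B's face performing A's expression
    code = apply(encoder, face_of_a)
    return apply(decoder_b, code)

face_a = [random.random() for _ in range(PIXELS)]  # stand-in for a face crop
fake_b = face_swap(face_a)
```

This also shows why "the more input data, the more realistic the result": both decoders are trained against many real images of their subject, so more footage of a person means a decoder that renders them more convincingly.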

In early 2018, someone posted a tool on Reddit that allowed users to insert faces into porn videos, and what followed was a cascade: fake porn videos featuring female celebrities were everywhere on the internet. Today, you can find countless YouTube tutorials with step-by-step instructions for making a deepfake on a desktop computer, and soon we may even be able to make them on our cellphones.

It is the interaction of some of our most basic human frailties with networked tools that can turn deepfakes into weapons. As human beings, we have a visceral reaction to audio and video: we believe they are true on the notion that, of course, we can trust what our own eyes and ears are telling us. It is this mechanism that deepfakes undermine, eroding our shared sense of reality so that we believe fakes even when they are not true. We are also attracted to the provocative: we tend to believe and share information that is negative and novel. Researchers have found that online hoaxes spread ten times faster than accurate stories. And we are drawn toward information that aligns with our own viewpoints. Psychologists call this tendency confirmation bias, and social media supercharges it by allowing us to instantly and widely share information that accords with our views.

Deepfakes have the potential to cause great individual and societal harm. Imagine a deepfake that shows Indian soldiers on the border with Pakistan burning a Quran; you can imagine that such a video could provoke severe violence against those soldiers. And what if, the very next day, another deepfake dropped showing the President of the United States praising the soldiers who burnt the Quran? The violence would then spread not only in India and Pakistan but across the globe. Some of us may think this is too far-fetched to be true, but in reality it is not. We have already seen falsehoods spread on WhatsApp and other online messaging services lead to violence against ethnic minorities, and that was just text. Imagine if it were video.

Deepfakes also have the potential to corrode the trust we place in democratic institutions. They can exploit and magnify the deep distrust we already have in politicians, business leaders, and other influential figures. They find an audience primed to believe them, and the pursuit of truth is on the line as well. Technologists expect that, with advances in AI, it will soon become difficult if not impossible to tell a real video from a fake one. So how will the truth emerge in a landscape saturated with false news and deepfakes? Will we simply take the path of least resistance and believe what we want to believe? Not only might we believe the fakery; we may also start disbelieving the truth. We have already seen people invoke the phenomenon of deepfakes to cast doubt on real evidence of their wrongdoing. Politicians have said of audio recordings of their disturbing comments, "It was just a fake." This is called the liar's dividend: the risk that liars will invoke deepfakes to escape accountability for their wrongdoing.

We are going to need a proactive solution from tech companies, from lawmakers, from law enforcers, and from the media. What we need from the tech companies and social media platforms is to change their terms of service and community guidelines to ban deepfakes that cause harm. Law is our educator: it teaches us what is harmful or wrong, and it shapes our behavior by punishing perpetrators and by securing remedies for victims. Today, law is not up to the challenge of deepfakes. Across the globe, we lack well-tailored laws designed to tackle digital impersonations that invade sexual privacy or damage a person's reputation and image.

When Rana Ayyub went to law enforcement in Delhi, she was told that nothing could be done; she was simply advised to delete her accounts and shut her laptop. This shows a legal vacuum that needs to be filled. We need to devise legislation that bans harmful digital impersonation amounting to identity theft, as Iceland, the UK, and Australia have already done. But law is not a complete cure. It is a blunt instrument, and we have to use it wisely. It also faces practical impediments: you cannot leverage the law against people you cannot identify and find, and if a perpetrator lives outside the country where the victim lives, you may not be able to compel the perpetrator to appear in local courts to face justice. So we are going to need a coordinated international response.

Education has to be part of our response as well. Law enforcers are not going to enforce laws they don't know about or police problems they don't understand. Research on cyberstalking has found that law enforcement often lacks the training to understand both the laws available to it and the problem of online abuse. New legislation must therefore come with efforts at training, and education has to be aimed at the media as well. Journalists need to be educated about the phenomenon of deepfakes so that they do not amplify and spread them. In fact, every one of us needs this education: we click, we share, we like, often without even thinking about whether the content is real or fake.

Rana Ayyub is still wrestling with the fallout. She still does not feel free to express herself online or off. She still feels as if there are thousands of eyes on her naked body, even though, intellectually, she knows it wasn't her body. And she has frequent panic attacks, especially when someone she doesn't know tries to take a picture of her: what if they are going to make another deepfake of her? So for the sake of individuals like Rana Ayyub, and for the sake of our democracy, we need to do something right now.
