Deepfakes began as a promising new technology in the field of visual effects but have become a tool to fuel revenge porn, fake nudes and fake news.
Deepfakes are computer-generated images and videos built from real templates. These templates are composed of thousands of images of a subject’s face, taken from every possible angle, which are then mapped onto the target’s face.
This has led to several hilarious and impressive videos surfacing as the technology became more available to amateur filmmakers and enthusiasts.
Want to see The Shining with Jim Carrey instead of Nicholson? Be my guest:
Home Alone starring Sylvester Stallone? Yup…
It’s a great sight gag and a wonder of modern technology, but the lack of regulation surrounding these tools has led to a proliferation of unwanted sexual images and videos.
At first this mostly happened to celebrities and other prominent public figures, but as the technology has become more widespread and accessible, it has become a tool for the masses.
Dutch cybersecurity startup Deeptrace (now Sensity) published a report in October 2019 estimating that 96% of all deepfakes online were pornographic.
Nowadays anyone with a grudge can fake sexual images and videos of someone they dislike.
We are “playing catch-up” with deepfakes according to Nina Schick, author of the book Deep Fakes and the Infocalypse.
“It’s only a matter of time until that content becomes more sophisticated. The number of deepfake porn videos seems to be doubling every six months,” she said.
“Our legal systems are not fit for purpose on this issue. Society is changing quicker than we can imagine due to these exponential technological advances, and we as a society haven’t decided how to regulate this.
“It’s devastating for victims of fake porn. It can completely upend their life because they feel violated and humiliated.”
The problem became so bad that new legislation was drawn up in the UK to tackle so-called “revenge porn”.
Under this legislation, revenge porn is defined as:
“the sharing of private, sexual materials, either photos or videos, of another person, without their consent and with the purpose of causing embarrassment or distress.”
The term is slightly narrow in that it implies all of these crimes are motivated by vengeful ex-lovers. A more useful and widely used term is “image-based abuse”.
Motivations for image-based abuse can include:
- Financial gain
- Sexual gratification
Creating such images is not in itself a crime, however, and that has led to a number of programs and apps enabling anyone to create deepfake images.
DeepNude, a program released in 2019, used neural networks to remove clothing from images of women.
The controversy over it being created specifically for porn led to its demise, with the creators ultimately shutting it down and refunding consumers, who would pay up to $50 for the service.
A bot on the messaging app Telegram uses artificial intelligence software to digitally undress women, essentially the same service as DeepNude.
You may see a pattern here: these technologies are not going away.
According to a report published by Sensity, a company dedicated to fighting the threat of deepfakes, 104,852 women have had fake naked images of them shared publicly.
Founded in 2018, Sensity describes itself as “the world’s first visual threat intelligence” company. Their mission is “to defend individuals and organizations against the threats posed by deepfakes and other forms of malicious visual media.”
We should remember that tools to create deepfakes can be used for harmless fun, but that must be weighed against the power they put in the hands of anyone able to use them.
One particular problem with deepfaking nudes is that the technology does not discriminate based on age, meaning these apps hand horrifying power to paedophiles and those involved in the sexual exploitation of children.
According to Sensity, many of the images on Telegram appeared to be of underage women, “suggesting that some users were primarily using the bot to generate and share paedophilic content.”
The administrator of the service, known as “P”, commented on this particularly worrying issue, stating, “when we see minors we block the user for good.”
Deepfake technology has also been used to create fake audio, resulting in scams such as one in which thieves conned the chief executive of a UK energy company into transferring nearly $220,000 to a Hungarian supplier because he believed he was speaking to his boss.
This could well herald the next generation of telephone scams: imagine getting a call from your aunt asking if you could help pay her medical bills, only to discover you have been duped by a con artist.
Imagine the uncertainty you might feel in talking to anyone on the phone, especially when giving out personal information.
Deepfakes clearly have the potential to be utilised for far more than vindictive porn, and in a society ever more reliant on audio-visual platforms it is vital that we stay vigilant against those who would perpetuate its misuse.