What's Missing From the Conversation on AI? Women.

Unless you’ve been living an entirely disconnected life for the past few years, you will have heard of large language model applications like ChatGPT. There is no doubt that artificial intelligence (AI) is advancing rapidly. However, there is something we have been leaving entirely out of the conversation – the impact of this technology on women.

You may have already heard of deepfake technology, which allows someone to create false videos of anyone, but mostly targets celebrities and politicians. This same technology can be used to create deepfake pornography – making it appear as though a particular woman has participated in a pornographic film, even though she never has. There is another, similar type of technology that allows users to create nude images of women through the use of AI. One such app, called DeepNude, was pulled from app stores after public outcry on social media. The difference between the two is that deepfake pornography superimposes another person’s face onto the body of a porn actor, whereas nudifying tools aim to generate “nudified” images from fully or partially clothed images uploaded by a user.

In 2021, Maria Miller, an MP in the United Kingdom, called for these tools to be banned. Speaking with the BBC in August 2021, she said that a debate needed to occur on “whether nude and sexually explicit images generated digitally without consent should be outlawed”.

So-called “revenge porn” is already a criminal offence in Canada under section 162.1 of the Criminal Code, where an intimate image is defined as “a visual recording of a person made by any means including a photographic, film or video recording” where this person is depicted exposing their sexual organs or engaging in sexual activity.

Although this definition specifies that such images can be made by any means, whether the law would apply to images generated with artificial intelligence has yet to be tested by Canadian courts.

It is difficult to find research on deepfake pornography or nudifying tools in Canada, but a report from the U.K.’s Law Commission pointed to a study that found that 100% of the victims of deepfake pornography are women. In fact, “nudifying tools” will generate female genital organs on any person uploaded, regardless of their sex. The technology assumes that the person you are attempting to nudify is a woman.

Technologies like those used to create deepfake pornography are already impacting the lives of women. Just this February, a streaming star on Twitch was victimised by deepfake porn technology, and noted that her fans thought the video was real. What Sweet Anita said in her interview with the New York Post is true – this could happen to anyone. Any woman who has uploaded images to a social media account, or has had images of her uploaded to her friends’ accounts, can be “deepfaked” or “nudified”. Since this particular incident, Twitch has updated its policy so that anyone “intentionally promoting, creating, or sharing” deepfake pornography, or what it refers to as “synthetic non-consensual exploitative images”, could be “indefinitely suspended” upon a first offence.

Twitch’s move to indefinitely suspend those who share deepfake pornography, or other non-consensual sexually exploitative material, on its website is a step in the right direction. However, what Twitch is unable to do is address the root of the problem – the proliferation of the technology that allows such victimisation of women.

As the world becomes increasingly digitised and AI continues to proliferate, we, as a society, need to have conversations about how this technology can have a very real and negative impact on women’s lives, and our lawmakers need to act accordingly. We must put pressure not only on our governments but also on technology companies to ensure the products they are creating and improving are not weaponised against half of the global population.