
AI AND DEEPFAKES: HOW OUR SOCIETY HUNTS WOMEN

by Salma Ahmed, May 30, 2025


Salma Ahmed explores how AI and deepfakes are used to target women.


A deepfake is an artificial image or video (a series of images) generated by a special kind of machine learning called "deep" learning (hence the name).

Women have been targets of misogynistic men for years. The reasons are layered, involving gender dynamics, digital culture, and psychological motivations such as power, domination, and control (especially over women) — though that is a topic for another article.
Misogyny causes women to face new challenges on a daily basis. There may be attempts to help them, but the attempts to harm them far outnumber them. For example, a woman may be overlooked for a promotion in favor of a less qualified male colleague due to ingrained gender biases.

Women have been hunted in a variety of ways in recent years, including through image-based sexual abuse. They have had to come to terms with seeing themselves naked or in pornographic content without their consent. No one should go through that, and yet women do on a daily basis. This violation of privacy can have long-lasting effects on a woman's mental and emotional well-being, causing feelings of shame, embarrassment, and helplessness.

While the world still lags behind in protecting women from all kinds of abuse, including image-based sexual abuse, new and more extreme methods of targeting women have emerged. Women now face the weaponization of AI technologies: AI algorithms can be manipulated to target and harass them online by flooding their social media accounts with abusive messages and threats.

Starting in 2017, a new type of technology appeared: deepfakes. Deepfakes, as mentioned earlier, are videos, pictures, or audio clips made with artificial intelligence to appear real. They were initially used to create memes or perform other benign tasks, but this quickly changed.

With misogyny and violence against women already entrenched, many people, particularly men, saw deepfakes as another way to cause harm. They began manufacturing non-consensual intimate deepfakes at high speed and with a wide target range. Female celebrities were not the only targets; simply being a woman was enough.

The numbers and statistics are horrific.

According to a 2019 report by the AI company Deeptrace, 96% of deepfake videos were pornographic.*

The horror doesn't stop there. When the content published on the top five deepfake pornography websites was analyzed, 100% of it depicted women.
We can look at the numbers again and again to try and comprehend them, but the gravity of the situation will never change.
While companies such as Meta and X praised the rise of AI and began to use it, they turned a blind eye to the technology's negative aspects.

Women have always been targets of deepfakes, but the issue received increased attention when non-consensual deepfake pornography depicting Taylor Swift was shared on X (formerly Twitter). Swift is a victim, as are many women, but not all women have the same resources. Not all women have millions of fans ready to defend them. So we were left with an example that overshadowed the rest of the issue: how ordinary women, not just celebrities, are also targeted.
Both groups are victims, and they became victims simply because they are women. There are many truths to face when discussing AI and deepfakes, but one that no one should accept is that famous women — singers, actors, YouTubers — have had to get used to seeing deepfakes made of them. It is the kind of knowledge you cannot ignore, but are forced to accept.

We shouldn’t have to accept deepfakes.
We should not be seeing an increasing number of women exploited in non-consensual intimate deepfakes. We shouldn't have to talk about how hyper-realistic these videos are, or how easily this content spreads across social media platforms.

Yet, here we are faced with another ugly aspect of what it means to be a woman.

It is neither a surprise nor news that governments lack laws protecting women against all the forms of violence they face. So it is hardly surprising that there aren't enough laws to regulate AI and deepfakes, or to protect women from them.

Organizations and victims have attempted to get big tech companies and governments to listen and change, but little has been done. That inaction shows how normalized violence against women has become.

*Source: Ajder, H., Patrini, G., Cavalli, F., & Cullen, L. (2019). The State of Deepfakes: Landscape, Threats, and Impact. Deeptrace. https://regmedia.co.uk/2019/10/08/deepfake_report.pdf




