While manipulating intimate images and videos is not a new tactic among sexual violence perpetrators, the recent expansion of generative artificial intelligence (AI) for consumer use has accelerated the creation of nonconsensual manipulated intimate material, or sexually explicit “deepfakes.”
Deepfakes include imagery, video or audio that is produced or distributed without the consent of the subject and has been altered, potentially with artificial intelligence, to make it appear that a person is nude, partially nude or engaged in sexual conduct. To be clear, there are also deepfakes that are not sexually explicit, like the ones depicting politicians. However, for the purposes of this commentary, when I say “deepfakes” I am referring to sexually explicit deepfakes.
In the past, celebrities and public figures were the most common targets for deepfakes. Perpetrators once needed hundreds of photos to create a deepfake. Now, with generative AI, a deepfake can be created with just a single photo, making anyone a potential victim.
Victims of deepfakes often experience severe emotional distress and financial or reputational harm. Victims may be threatened with physical or sexual violence. They may lose their job. They may pay thousands of dollars for an attorney, for mental health support or for services that monitor the internet for deepfakes and work to remove them. Some victims have died by suicide.
If you’ve been a victim of a deepfake, know that it was not your fault. This is something that’s done to you, not something that you caused. We all have a right to access our phones, social media accounts or dating apps without experiencing sexual violence.
Technology and AI themselves are not the problem, though. Bad actors who misuse technology to abuse, threaten and exploit others are. The good news is that there are steps we can take to prevent the creation and distribution of sexually explicit deepfakes.
AI platforms need to have rules and guidelines in place to prevent the creation of deepfakes. Social media platforms need to have an easy way to report abuse, and they need to dedicate resources to removing sexual content. We know that platforms have a streamlined process for quickly reviewing and removing content for copyright purposes. They need to show the same dedication to removing nonconsensual intimate images.
In North Dakota, lawmakers are actively working to prevent the creation and distribution of sexually explicit deepfakes. Sharing intimate images without a person’s consent is already against the law in North Dakota, and House Bill 1351 would make it a class A misdemeanor to produce, distribute or transmit sexually explicit deepfake images or videos.
As individuals, we need to report deepfakes or other nonconsensual intimate material when we see it. We need to talk to our loved ones about online sexual violence and what warning signs to look out for so that we can report perpetrators before further abuse happens.
Anyone can be a victim of a deepfake, but teens and young people are particularly vulnerable. If you’re a parent, talk to your kids about deepfakes and why creating and sharing them is harmful. Teach technology safety and make sure your children know that they can come to you for help if something happens to them. April is Sexual Assault Awareness Month, so there is no time like the present to start these conversations.
Finally, as North Dakotans, we need to examine how we think about and treat people online. We have a culture of “North Dakota nice,” but does that culture hold up behind closed doors? The ultimate solution to ending the creation and distribution of deepfakes starts with asking ourselves, “Why are these images and videos being created in the first place?” Many perpetrators of deepfakes seek financial gain or power and control over the victim. But once we see our friends, neighbors and even strangers on the internet as equals worthy of respect, and not as sexual objects to be consumed, degraded and profited from, we will address the root cause of deepfakes and sexual violence as a whole.
If you or someone you know has experienced domestic or sexual violence, free and confidential support is available 24/7. Visit the North Dakota Domestic & Sexual Violence Coalition website to view a directory of domestic and sexual violence victim advocacy centers and find support near you.
If you or someone you know is a victim of sexually explicit deepfakes or nonconsensual intimate image sharing, you can find support through the Cyber Civil Rights Initiative Online Safety Center or the National Center for Missing and Exploited Children’s CyberTipline.