Dear UCLA | Orientation Issue 2024

Survivors of AI-generated sexual abuse need more resources

By Margaret Garcia

Aug. 25, 2024 3:30 p.m.

Editor’s note: This article contains mentions of AI-generated sexual abuse and suicidal ideation that may be disturbing to some readers.

From a Beverly Hills middle school where male students created fake nudes of their female classmates to the college student whose fight for justice after seeing her face in deepfake pornography forms the core of the documentary "Another Body," it is clear that the recent rise of AI-generated sexual abuse, primarily deepfakes, has given students easy access to tools for creating nonconsensual sexually explicit content of their peers.

Although middle and high school perpetrators have faced repercussions such as expulsion and criminal charges, it is unclear from the University of California's existing Title IX policies and websites how the UC would address similar situations. Moreover, the UC campuses lack online information for student survivors of image-based abuse on how to access mental health support or get help removing abusive images from the internet.

AI-generated sexual abuse can have profound and long-lasting impacts on a student's mental health, academic success and emotional well-being. Thus, we urge the UCs to do more to support victims of image abuse, as the easy accessibility of this technology puts students at risk.

Emerging research on AI-generated sexual abuse shows that women whose images are used in nonconsensual deepfakes feel violated, humiliated, anxious and fearful, may develop thoughts of self-harm and can face ramifications in their professional and personal lives.

In an article from People magazine, Breeze Liu, a graduate of the University of California, Berkeley, described her mindset when she discovered that a video of herself, taken without her consent, had been posted on an adult site by another Berkeley student she had met before graduating.

“I was devastated,” Liu said. “I was so humiliated and felt so alone.”

She said she experienced suicidal ideation as a result of the online image abuse.

After contacting Berkeley city police, she said she was told there was nothing they could do.

“They asked me if I’d ever engaged in prostitution or had exchanged nude videos for money,” Liu said.

The video posted by the perpetrator was subsequently used to create hundreds of AI-generated deepfake videos posted on adult websites. Ultimately, Liu found over 800 links to pornographic content made with her likeness online.

Liu's account demonstrates the need for UC police to learn how to better support victims of image abuse by providing mental health resources, guidance on removing these images and information about legal options.

Despite growing reports of AI-generated sexual abuse and its detrimental impacts, there are practically no UC resources available for victims. None of the UC Title IX websites mention this type of abuse or clarify whether it is included in definitions of sexual violence that would result in a Title IX investigation.

Additionally, the UCs currently do not offer victims any pathways to mental health support or connections to services that could help remove abusive content. It is an ideal time for the UCs to focus on how they can support victims of AI-generated sexual abuse, especially following recent changes to Title IX regulations that were announced last spring.

The Department of Education’s Title IX final rule, which outlined those legal changes, states that online harassment can include “the nonconsensual distribution of intimate images (including authentic images and images that have been altered or generated by artificial intelligence (AI) technologies).”

It is unclear what exact measures will be taken to address this specific issue. However, there are some recommendations that may be helpful to consider.

We at Survivors and Allies are dedicated to student survivors of sexual violence and know first-hand that survivors need both punitive Title IX investigations and tailored mental health resources. Currently, UC Title IX websites, trainings for students and staff and mental health resources are inadequate in addressing AI-generated sexual abuse.

As one of the most prestigious education systems, the UC network has the resources and strength to bring about change that can reach other universities and set the standard. Thus, we call on the UCs to:

  • Update definitions of sexual violence to include AI-generated sexually explicit content.
  • Update trainings for students, staff, CARE Advocate Offices, Counseling and Psychological Services and UC police.
  • Make resources available on UC websites to help students report or locate abusive content.
  • Provide access to mental health professionals trained in image-abuse violence.

If you would like to sign and make your voice known, please use the following link and share it with anyone you know who would like to see a change to UC policies: https://chng.it/NZrZGJhbgR.

Margaret Garcia is a recent graduate of UC Merced with a B.A. in psychology and a minor in political science. She is a research member of the UCLA student organization Survivors and Allies and hopes to improve campus resources for UC survivors.
