Study: Reports of nonconsensual nude images are ignored on X

X (formerly Twitter) takes swift action when taking down deepfake nude images that are reported as copyright violations — but not when they’re reported under “nonconsensual nudity,” a study has found.

The paper, written by researchers at the University of Michigan and Florida International University, is an audit of X's reporting systems and has not yet been peer-reviewed, 404 Media reported. The researchers created five AI "personas" of young white women (holding race, gender, and age constant to limit confounding variables) and then generated 10 replica images of each, for a total of 50 images. To address the ethics of generating deepfake nudes themselves, the researchers said the images underwent a "rigorous verification process" to ensure they did not depict any real individual.

They posted the images to X from 10 "poster accounts" they created, then used five additional X accounts to report them. Twenty-five images were reported as Digital Millennium Copyright Act (DMCA) violations, and the other 25 were reported under X's nonconsensual nudity policy.

The researchers then waited three weeks to see the outcome of the reports. All 25 images reported for copyright violations were removed from X within 25 hours. In contrast, none of the images reported for nonconsensual nudity were removed within the three-week window.

“Our findings reveal a significant disparity in the effectiveness of content removal processes between reports made under the DMCA and those made under X’s internal nonconsensual nudity policy,” the study states. “This highlights the need for stronger and directed regulations and protocols to protect victim-survivors.”

X owner Elon Musk dissolved the platform's trust and safety council in 2022, though the site has recently opened two dozen safety and cybersecurity positions in the U.S. Mashable has reached out to X for comment.

Earlier this year, WIRED reported that victims of nonconsensual deepfake porn have leveraged copyright law to get deepfakes taken down from Google.

If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.
