
AI was used to turn a teen's photo into a nude image. Now the teen is fighting for change to protect other kids.

Fake nudes created by AI "nudify" sites are causing real harm, victims say | 60 Minutes

Francesca Mani was 14 years old when her name was called over the loudspeaker at Westfield High School in New Jersey. She headed to the principal's office, where she learned that a picture of her had been turned into a nude image using artificial intelligence.

Mani had never heard of a "nudify" website or app before. When she left the principal's office, she said, she saw a group of boys laughing at a group of girls who were crying.

"And that's when I realized I should stop crying and that I should be mad, because this is unacceptable," Mani said.

What happened to Francesca Mani

Mani was sitting in her high school history class last October when she heard a rumor that some boys had naked photos of female classmates. She soon learned that she and several other girls at Westfield High School had been targeted. 

According to a lawsuit later filed by one of the other victims through her parents, a boy at the school had uploaded photos from Instagram to a site called Clothoff, which is one of the most popular "nudify" websites. 60 Minutes has decided to name the site to raise awareness of its potential dangers. There were more than 3 million visits to Clothoff last month alone, according to Graphika, a company that analyzes social networks. The website offers to "nudify" both males and females, but female nudes are far more popular. 

Dorota Mani and Francesca Mani | 60 Minutes

Visitors to the website can upload a photo, or get a free demonstration, in which an image of a woman appears with clothes on, then appears naked just seconds later. The results look very real. 

Clothoff users are told they must be 18 or older to enter the site and that they can't use other people's photos without permission. The website claims "processing of minors is impossible," but no one at the company responded when 60 Minutes emailed asking for evidence of that claim, along with many other questions.

Mani never saw what had been done to her photo, but according to that same lawsuit, at least one student's AI nude was shared on Snapchat and seen by several kids at school.

The way Mani found out about her photo made it even worse, she said. She recalled how she and the other girls were called by name to the principal's office over the school's public address system.

"I feel like that was a major violation of our privacy while, like, the bad actors were taken out of their classes privately," she said. 

That afternoon, Westfield's principal sent an email to parents informing them "some of our students had used artificial intelligence to create pornographic images from original photos." The principal also said the school was investigating and "at this time we believe that any created images have been deleted and are not being circulated."

Fake images, real harm

Mani's mother, Dorota, who is also an educator, was not convinced. She worries that nothing shared online is ever truly deleted.

"Who printed? Who screenshotted? Who downloaded? You can't really wipe it out," she said.

The school district would not confirm any details about the photos, the students involved or disciplinary action to 60 Minutes. In a statement, the superintendent said the district revised its Harassment, Intimidation and Bullying policy to incorporate AI, something the Manis say they spent months urging school officials to do.

Francesca Mani feels the girls who were targeted paid a bigger price than the boy or boys who created the images. 

"Because they just have to live with knowing that maybe an image is floating, their image is floating around the internet," she said. "And they just have to deal with what the boys did."

Dorota Mani said that she filed a police report, but no charges have been brought. 

Yiota Souras is chief legal officer at the National Center for Missing and Exploited Children. Her organization works with tech companies to flag inappropriate content on their sites. She says that while the images created on AI "nudify" sites are fake, the damage they can cause to victims is real. 

"They'll suffer, you know, mental health distress and reputational harm," Souras said. "In a school setting it's really amplified, because one of their peers has created this imagery. So there's a loss of confidence. A loss of trust."

Fighting for change 

60 Minutes found nearly 30 similar cases in schools in the U.S. over the last 20 months, along with additional cases around the world. 

In at least three of those cases, Snapchat was reportedly used to circulate AI nudes. One parent told 60 Minutes it took over eight months to get the accounts that had shared the images taken down. According to Souras, a lack of responsiveness to victims is a recurring problem the National Center for Missing and Exploited Children sees across tech companies. 

"That isn't the way it should be. Right? I mean, a parent whose child has exploitative or child pornography images online should not have to rely on reaching out to a third party, and having them call the tech company. The tech company should be assuming responsibility immediately to remove that content," she said. 

Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, speaks with Anderson Cooper | 60 Minutes

60 Minutes asked Snapchat about the parent who said the company didn't respond to her for eight months. A Snapchat spokesperson said they had been unable to locate her request and said, in part: "We have efficient mechanisms for reporting this kind of content." The spokesperson went on to say that Snapchat has a "zero-tolerance policy for such content" and "...act[s] quickly to address it once reported."

The Department of Justice says AI nudes of minors are illegal under federal child pornography laws if they depict what's defined as "sexually explicit conduct." But Souras is concerned some images created by "nudify" sites may not meet that definition.

In the year since Francesca Mani found out she was targeted, she and her mom, Dorota, have encouraged schools to implement policies around AI. They've also worked with members of Congress to try to pass a number of federal bills. One of those bills, the Take It Down Act, co-sponsored by Senators Ted Cruz and Amy Klobuchar, made it through the Senate earlier this month and is now awaiting a vote in the House. It would create criminal penalties for sharing AI nudes and would require social media companies to take photos down within 48 hours of getting a request.
