For teen girls victimized by ‘deepfake’ nude photos, there are few, if any, pathways to recourse in most states

The FBI has warned that technology used to create pornographic deepfake photos and videos was improving and being used for harassment and sextortion.

Glitchy photo of a young woman's face and neck. (Leila Register / NBC News; Getty Images)

Teenage girls in the U.S. who are increasingly being targeted or threatened with fake nude photos created with artificial intelligence or other tools have limited ways to seek accountability or recourse, as schools and state legislatures struggle to catch up to the new technologies, according to legislators, legal experts and one victim who is now advocating for a federal bill.

Since the start of the 2023-24 school year, cases of teen girls victimized by fake nude photos, also known as deepfakes, have proliferated worldwide, including at high schools in New Jersey and Washington state.

Local police departments are investigating the incidents, lawmakers are racing to enact new measures that would enforce punishments against the photos’ creators, and affected families are pushing for answers and solutions.

Crude, unconvincing deepfakes can be made with simple photo-editing tools that have existed for years. But two school districts told NBC News that they believe the fake photos affecting their students were AI-generated.

AI technology is becoming more widely available, including Stable Diffusion, open-source software that can produce images from text prompts, and “face-swap” tools that can put a victim’s face in place of a pornographic performer’s in a video or photo.

Apps that purport to “undress” clothed photos have also been identified as possible tools used in some cases and have been found available for free on app stores. These modern deepfakes can be more realistic-looking and harder to immediately identify as fake.

“I didn’t know how complex and scary AI technology is,” said Francesca Mani, 15, a sophomore at New Jersey’s Westfield High School, where more than 30 girls learned on Oct. 20 that they may have been depicted in explicit, AI-manipulated images.

“I was shocked because me and the other girls were betrayed by our classmates,” she said, “which means it could happen to anyone by anyone.”

Politicians and legal experts say there are few, if any, pathways to recourse for victims of AI-generated and deepfake pornography, which often attaches a victim’s face to a naked body.


The photos and videos can be surprisingly realistic, and according to Mary Anne Franks, a legal expert in nonconsensual sexually explicit media, the technology to make them has become more sophisticated and accessible.

A month after the incident at Westfield High School, Francesca and her mother, Dorota Mani, said they still do not know the identities or the number of people who created the images, how many were made, or if they still exist. It’s also unclear what punishment the school district doled out, if any.

The Town of Westfield directed comment to Westfield Public Schools, which declined to comment. Citing confidentiality, the school district previously told NBC New York that it “would not release any information about the students accused of creating the fake nude photos, or what discipline they are facing.”

Superintendent Raymond Gonzalez told the news outlet that the district would “continue to strengthen our efforts by educating our students and establishing clear guidelines to ensure that these new technologies are used responsibly in our schools and beyond.”

In an email obtained by NBC News, Mary Asfendis, the high school’s principal, told parents on Oct. 20 that it was investigating claims by students that some of their peers had used AI to create pornographic images from original photos.

At the time, school officials believed any created images had been deleted and were not being circulated, according to the memo.

“This is a very serious incident,” Asfendis wrote, as she urged parents to discuss their use of technology with their children. “New technologies have made it possible to falsify images and students need to know the impact and damage those actions can cause to others.”

While Francesca has not seen the images of herself or the other girls, her mother said Westfield’s principal told her that four people identified Francesca as a victim. Francesca has filed a police report, but neither the Westfield Police Department nor the prosecutor’s office responded to requests for comment.

New Jersey State Sen. Jon Bramnick said law enforcement expressed concerns to him that the incident would only rise to a “cyber-type harassment claim, even though it really should reach the level of a more serious crime.”

“If you attach a nude body to a child’s face, that to me is child pornography,” he said.

The Republican lawmaker said state laws currently fall short of punishing the content creators, even though the damage inflicted by real or manipulated images can be the same.

“It victimizes them the same way people who deal in child pornography do. It’s not only offensive to the young person, it defames the person. And you never know what’s going to happen to that photograph,” he said. “You don’t know where that is once it’s transmitted, when it’s going to come back and haunt the young girl.”

A pending state bill in New Jersey, Bramnick said, would ban deepfake pornography and impose criminal and civil penalties for nonconsensual disclosure. Under the bill, a person convicted of the crime would face three to five years in jail and/or a $15,000 fine, he said.


If passed, New Jersey would join at least 10 other states that have enacted legislation targeting deepfakes, according to Franks, a law professor and the president of the Cyber Civil Rights Initiative, a nonprofit group that combats nonconsensual porn.

The state laws targeting deepfakes vary widely in scope. Some, like those in Texas and Wyoming, make nonconsensual pornographic deepfakes a criminal offense. Others, like New York’s, only allow victims to bring civil suits.

Franks said the laws are “all over the place” and far from comprehensive, and their constitutionality has been called into question.

“So you’ve got a patchwork of criminal charges, which are going to be difficult in these cases because the perpetrators are going to be minors, so that raises its own questions,” she said.

‘Probably just the tip of the iceberg’

It’s unclear how many young people have been victimized by AI-generated nudes.

The FBI said it is difficult to calculate the number of minors who are sexually exploited. But the agency said it has seen a rise in the number of open cases involving crimes against children: more than 4,800 in 2022, up from more than 4,100 the year before, the FBI told NBC News.

“The FBI takes crimes against children seriously and works to investigate the facts of each allegation in a collective effort with our state, local, and tribal law enforcement partners,” the agency said, adding that victims can face significant challenges when trying to stop the spread of the image or get it removed from the internet.

Franks said there are likely many more incidents than have been reported, and that they will only increase.

“Whatever we’re hearing about that floats up to the surface is probably just the tip of the iceberg,” she said. “This is probably happening quite a bit right now, and girls just haven’t found out about it yet or discovered it or the school is covering it up.”

At Issaquah High School in Washington state, a school district representative said a mid-October incident “involving fake, AI-generated imagery of students” continues to affect the student body.

In the Spanish town of Almendralejo, mothers say dozens of their middle school-aged daughters have been victimized with AI-generated nude photos created with an app that can “undress” clothed photos. Local police in New Jersey, Washington and Spain are investigating the cases.

In a June public service announcement, the FBI warned that technology used to create nonconsensual pornographic deepfake photos and videos was improving and being used for harassment and sextortion.

Meanwhile, the National Association of Attorneys General called on Congress in September to study AI’s effects on children and come up with legislation that would protect them from those abuses.

In a letter signed by 54 state and territory attorneys general, the group said it was concerned that “AI is creating a new frontier for abuse that makes prosecution more difficult.”

“We are engaged in a race against time to protect the children of our country from the dangers of AI,” the letter said.

Francesca and her mother said they plan to head to Washington, D.C., in December to personally urge Congress members to act, as they continue to advocate for updated policies within the school system and seek accountability for what happened.


“We all know this is not an isolated incident,” Dorota Mani said. “It will never be an isolated incident. This is going to keep happening all the time. We have to stop pretending that it’s not important.”

The rise in incidents targeting high school girls follows the proliferation of AI deepfake apps and deepfake porn websites where such material is created, shared and sold.

A 2019 report from Sensity, an Amsterdam-based company that tracks AI-generated media, found that 96% of deepfakes created at that point were sexually explicit and featured women who didn’t consent to their creation. Many victims are unaware the deepfakes exist.

Franks said there is nothing parents and children can do to prevent the creation of deepfakes using their likenesses. Instead, Franks said schools and local law enforcement need to make an example out of perpetrators in cases that reach the general public, to discourage others from creating deepfakes.

“If you could imagine a dramatic and important response from the school in New Jersey or from the authorities in New Jersey to make an example out of the case, really strict penalties, people go to jail, you might get the discouragement,” Franks said.

“In the absence of that, it’s just going to become one more tool that men and boys use against women and girls to exploit and humiliate them and that the law basically has nothing to say about.”

 

Source: NBC News