
Teens Are Spreading Deepfake Nudes of One Another. It’s No Joke

Teens are sending deepfake nude images of classmates to each other, disrupting lives. Schools, technology developers and parents need to act now

Silhouette of a little girl using a mobile phone. Credit: Ahmet Yarali/Getty Images

This piece is part of Scientific American’s column The Science of Parenting.

In February, a group of middle school students in Beverly Hills victimized 16 classmates by sharing deepfake nude images of them generated with artificial intelligence. In New Jersey, boys at one high school reportedly targeted more than 30 girls before school officials found out.

Schools and families are grappling with online apps that allow someone to turn an ordinary clothed photo into a fake nude image. The incidents in California and New Jersey are two of many involving teens, technology and unforeseen legal ramifications. While parents and school officials generally know that a teenager taking or sharing a nude image of another child is an offense, many do not understand that making and sharing fake pornographic images from clothed pictures of real kids is a federal crime, and a slew of “nudify” or “undress” apps makes it easy for a teenager with a phone to break the law. The apps use artificial intelligence (AI) to produce a realistic-looking nude version of an individual’s clothed photo, which is often sourced from social media.


Before these apps, perpetrators could use Photoshop to paste a child’s face onto an adult porn actor’s body (or, worse, an abuse image of a different child). But that took time, a laptop or desktop computer, and some technical proficiency. Now, these AI-powered apps will do all the work, quickly and for free, making it easy to rapidly create pictures of multiple victims without their knowledge or consent.

While some police agencies and state legislators are paying increasing attention, schools, administrators and even local law enforcement are often caught unprepared when teens use these technologies, and they end up failing the victims, who are overwhelmingly female. Parents can help keep their child from becoming a perpetrator by talking with them early about bodily autonomy and consent and by encouraging respect and empathy. When teens do create or share deepfake nudes, schools and law enforcement must respond in ways that keep teenagers’ lives from being derailed.

Child sex abuse material (CSAM) is defined by federal law to cover not just real images of actual abuse but also images manipulated to make it appear that a real, recognizable child (such as a particular student in class or a child celebrity) is engaging in a sexually explicit act or pose. That includes images altered with generative AI, as a recent alert from the Federal Bureau of Investigation warns.

Some states, including Florida, Louisiana, South Dakota and Washington, now ban these so-called “morphed” images, and similar bills are pending elsewhere. The Department of Education recently deemed the nonconsensual distribution of deepfake nudes a form of sex-based harassment that schools must address under Title IX. As a researcher who studies AI’s implications for child safety, I am hopeful that these updates will get more students and schools to take deepfake nudes seriously, unlike the school officials in one Seattle suburb who, despite students’ complaints, didn’t think the images merited reporting to police because they were fake.

One of the many reasons it is critical to stop these crimes before they happen is that adults’ responses can make things worse. In Miami, two boys, ages 13 and 14, were arrested last winter and face third-degree felony charges. This was the first known instance of children being arrested and charged over deepfake nudes, and it deserves to be the last. The criminal prosecution of children is counterproductive: our abuse-laden juvenile justice system does lasting damage to young offenders. It’s especially hard to justify with early adolescents, who might not comprehend why what they did was wrong because their moral reasoning skills aren’t fully developed. They may think of a deepfake nude as a funny prank or fail to anticipate how copies could circulate out of control.

That said, nonconsensual nude deepfakes are acutely misogynistic, and the old excuse of “boys will be boys” is not (and never was) an acceptable response to sexual bullying. Victims of deepfake nudes can suffer “substantial” emotional and reputational harms. The teens who make or share the images need to be held accountable, but there are options besides criminalization. The Beverly Hills middle school expelled five students, and one New Jersey deepfake victim is suing her classmate.

Looking beyond punishment, an alternative approach is “restorative justice,” which centers healing and mutual respect alongside acceptance of responsibility for one’s actions. One restorative practice used in schools is the discussion circle, in which both the victims and the accused are heard, and the accused are asked to take responsibility for the harm they caused and to help repair it. The restorative justice approach isn’t easy, but it can succeed in schools and could be more constructive and healing for everyone involved than prosecution, litigation or expulsion.

For parents and educators, the best way to stop deepfake nudes is prevention, not reaction. That means communicating with kids—even if it gets awkward.

It’s not unhealthy for youths to be curious about sex or have crushes on their peers, yet parents are often uncomfortable discussing sex and sexuality with their children, and schools often do not address these topics adequately. Teach your teen or tween that deepfake nudes are not OK, for the same reason we teach younger children about safe versus unsafe touch: “Your body belongs to you, so you get to set boundaries and make choices about your body. Other people have privacy and agency in their own bodies, and you need to respect theirs just as they need to respect yours.”

That bodily autonomy includes images. Viewed through the victim’s eyes, a deepfake nude isn’t a funny prank or a harmless indulgence of curiosity; it isn’t safer than sex or less transgressive than watching porn. It’s a boundary violation, just like unwanted touching. Making or sharing such an image might get a kid in trouble, but more importantly, it hurts the other kid’s privacy, agency, dignity and reputation. It makes a choice about someone’s body that isn’t anyone else’s to make.

Trust and openness will be essential should your child become either a victim of or a bystander to a deepfake nude. If your child knows other kids are being targeted, they need the confidence and integrity to tell you or the school. If your child is the victim, they need to know they can bring the problem to you and hear that it’s not their fault and they needn’t feel ashamed. Your kid needs to know you’ve got their back.

Beyond conversations around sexuality and bodily autonomy, there are several concrete things parents can do to stop the spread of deepfake nudes. Talk to your children about the sites they visit and the apps on their phones. Age-appropriate parental controls can have their uses, but remember that kids deserve privacy too, and more of it the older they get. Installing monitoring software on their devices undermines parent-child trust while normalizing intrusive surveillance of their personal lives. Plus, many monitoring apps “for parents” are just rebranded stalkerware used by domestic abusers.

Advocate for your child’s school to develop a comprehensive plan for deepfake nudes, with parent and student input. It should include training for educators on the laws governing fake (and real) nude images and on their legal obligations when students victimize other students; counseling and support for affected students; and coverage of deepfake nudes and related topics, such as sextortion, in the sex ed curriculum.

Kids can find nudify apps through app stores, search engines and ads on social media. Companies such as Apple, Instagram, X and Patreon tend to be responsive to bad PR, and AI services themselves are terrified of CSAM liability. Call on tech platforms to step up their efforts and kick nudify apps and nonconsensual nude imagery off their services, and call on AI companies to commit to ensuring that their products are used responsibly.

If your child appears in a nude image (real or fake) that’s spread online, a service called Take It Down can get it removed from participating social media services. It’s a free tool offered by the National Center for Missing and Exploited Children. The federal government’s new Know2Protect campaign has other resources for children and families.

While new state laws and the new Title IX rules may help combat the deepfake nude problem in schools, misogyny and child sex abuse are age-old problems, and new technologies that enable them will keep cropping up. Totally eradicating nudify apps is unrealistic, but shutting down some bad actors is possible. For greatest effect, the target should be the shady developers who know their software is enabling CSAM, not their teenage users. The answer to traumatized kids isn’t traumatizing more kids.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.