Dozens of women targeted by AI-generated nudes, experts raise alarm

The number of victims of AI-generated nude images has surged over the past year, raising serious concerns among experts and support organizations.


Reports of digitally manipulated nude images rose from nine in the previous year to 59, a more than sixfold increase. At the same time, the availability of such AI-powered services grew by 45 percent, with RTL Nieuws identifying 47 apps and websites that allow users to submit photos and receive digitally altered, undressed versions.

Victims experience the same distress as if real nude photos of them were leaked, research shows. The impact is particularly severe for young girls, affecting their self-image and mental well-being. Women make up 99 percent of reported victims.

"It doesn't matter whether the photo is real or not," said Marthe Goudsmit Samaritter, a deepfake researcher at the Max Planck Institute. "The issue is that someone is perceived sexually without their consent. It is demeaning and should be recognized as a form of online sexual abuse rather than a joke or prank.

" The technology behind these images has advanced to the point where distinguishing between fake and real photos is increasingly difficult. Perpetrators use them for harassment, blackmail, or creating fake social media profiles in victims' names, sometimes sharing private details to encourage further abuse. Victims report receiving harassing calls or even strangers showing up at their homes.

Many of these undressing apps are easy to use and freely available. One of the most popular services, hosted on Telegram, has tens of thousands of monthly users. Individuals simply upload a photo, and the software removes clothing digitally.

Most services allow a few free uses before charging for additional images. Some platforms go further, generating entirely new nude images based on multiple uploaded pictures. Users can even specify particular sexual poses. Creating these images typically costs a few euros.

Lisa, whose real name is known to RTL Nieuws, was one such victim. "It was incredibly distressing," she said. "I struggled with it for a long time."

Her stepfather used AI to generate dozens of nude images from her photos, including pictures taken when she was just 12 years old. Now in her 30s, Lisa said the manipulation deeply affected her.

He also created a fake Facebook profile, pretending to be Lisa and offering sexual services. The images were discovered by her mother, who subsequently divorced him.

Well-known Dutch women, including influencer Jade Anna van Vliet and presenter Welmoed Sijtsma, have also been targeted. Van Vliet shared an original image alongside the manipulated version on Instagram, while Sijtsma explored the issue in a documentary on the subject.

Experts warn that deepfake technology has become so sophisticated that it requires fewer authentic images to create realistic fakes. "Previously, you could tell by errors like six fingers or unnatural backgrounds," said Nikki Lee Janssen of Offlimits. "Now, the quality is so high that victims must not only prove the images are fake but also deal with the emotional fallout."

Every service RTL Nieuws found focused solely on women. Goudsmit Samaritter explained that AI models for undressing are trained primarily on female bodies. "If you upload a male photo, the software often fails to work," she said.

Creating, sharing, or possessing deepfake nude images without consent is illegal in the Netherlands, but many perpetrators do not realize this. "Offenders see it as harmless fun," Janssen said. "We need to change that mindset."

Lisa filed a police report in 2023, but authorities were initially unsure how to proceed. With persistence from her and her lawyer, the public prosecutor eventually issued a penal order, avoiding a court trial.

Her stepfather received an 80-hour community service sentence, mandatory psychological treatment at De Waag, and was ordered to pay Lisa 500 euros in damages. "I wanted to confront him in court, but at least he didn't just get a warning," she said. "The mandatory treatment is a step in the right direction."

Lisa wants greater awareness of the impact of these images and faster legal action against offenders. "AI is often discussed positively in the news, but there is a dark and painful side to this technology that people need to understand."