In Spain, the case of almost two dozen girls whose AI-generated nude images were distributed at their schools and among their peers is causing a stir. The daily newspaper El País reports that 22 minors in Almendralejo are now affected; the youngest girl is 11 years old, the oldest 17. So far, 10 suspects have been identified who are believed to be responsible for the pictures. They allegedly used a smartphone app that can "undress" people in photos. According to reports, both the local police and the national data protection authority are investigating.
Mothers organize themselves
According to El País, the case began a few days ago on the first day of school after the summer holidays, when a boy approached a 14-year-old and claimed: "I saw a naked photo of you." After the girl told her mother about it and explained that an AI app was behind it, the mother contacted other mothers. Twenty affected girls were quickly identified, with cases in four of the five secondary schools in the city, which has around 33,000 inhabitants and lies in Extremadura on the border with Portugal. One of the mothers, a gynecologist with more than 130,000 followers on Instagram, shared a video there recounting what had happened, and it drew national attention.
The newspaper has since learned that, according to the investigation, a group of local youths is believed to be responsible for the fake nude photos. They are said to have saved the girls' profile photos from Instagram and WhatsApp, and in at least one case also took photos themselves during volleyball training. All the images were then uploaded to an AI application that advertises the ability to "undress" people. One of the boys later even created a video from the pictures. Not all of the families have seen the images in question themselves; some mothers were only told that they existed.
The application itself cannot be found in the Android and iOS app stores and must be installed manually, but it is also available as a Telegram bot. In one case, a girl was even blackmailed with it: an apparently fake profile demanded money from her, and when she refused, the fake nude picture was sent to her. She could do nothing but block the profile. One of the mothers is quoted as saying, "If I didn't know my daughter's body, the picture would look real to me." On Instagram, she assured that the mothers are standing together on the matter and will "END IT NOW".
Not a new technology
The technology allegedly used to generate the fake nude images is not new in principle: more than four years ago, the first reports appeared about online services that "undress" people in photos for a fee. Back then, you still had to upload a photo and pay US$50. The underlying AI technology has since made a leap, and such applications are now significantly cheaper or entirely free. As early as 2020, a similar application showed that photos of young girls in particular were being edited, "some of them clearly underage."
(my)