The authors of the images in which twenty girls from Almendralejo (Badajoz) appear stripped naked using Artificial Intelligence are between 12 and 14 years old, and since some of them are already 14, they can be held criminally liable under the Juvenile Criminal Law, as confirmed by National Police inspector Javier Izquierdo, head of the First Group for the Protection of Minors of the Central Cybercrime Unit.
Neither the identity of those involved nor whether they reside in Almendralejo has been revealed, but everything indicates that they are young people from the victims' own circle, the victims themselves also all being between 12 and 14 years old, specified the inspector, who is attending a meeting in Malaga these days organized by the National Police on online fraud and child exploitation through new technologies.
As for the criminal consequences, those who are already 14 years old will be subject to the penalties established by the current Juvenile Criminal Law, explained Inspector Izquierdo, who noted that this is an open investigation that will soon be referred to the Badajoz Juvenile Prosecutor's Office. It will be that office, precisely, that will have to decide what measures to seek for the minors who cannot be held criminally liable, those under 14 years of age, he reiterated. In this regard, he recalled that if the perpetrators were over 18, the penalties for producing child pornography range from 2 to 5 years in prison, and if the victims are under 16, the penalty is aggravated to between 5 and 9 years.
The key point here is that, even if the images are fictitious, the Penal Code treats realistic manipulated images of minors with explicit sexual content as child pornography, and their authors as producers of this type of material.
A total of 22 complaints have been collected and no new victims are expected to come forward, said the inspector, who confirmed that so far no statements have been taken from any of the minors.
According to the investigation, it appears that, for now, the manipulated photographs have not left the WhatsApp and Telegram groups where they were originally distributed. Even so, the authorities plan to seize the mobile devices of the minors involved to try to minimize the images' exposure on the Internet and avoid further moral harm to the victims, the inspector stressed.
This is not the first case known to the State Security Forces in which Artificial Intelligence has been used to produce child pornography. It is something that has been detected for some time, but has not been publicized in order to avoid a copycat effect.
The Almendralejo case, which came to light through complaints filed by a group of parents after they discovered that some minors had taken images of their daughters and used AI to strip them of their clothes to create child pornography, has sparked an ethical and legal debate about the extent to which this technology poses a risk to the public, since anyone can manipulate an image of another person to, for example, undress them and upload it to the Internet, with the consequent moral damage this entails for the victim.
“We are faced with the fact that anyone who misuses this technology could become a producer of child pornography images,” said Inspector Izquierdo. “It is no longer necessary to have access to a minor; with any photograph of a minor, extremely serious scenes could be recreated.” He therefore considers it essential “to make the public aware of the risk that the misuse of this type of technology entails.”
The majority of today’s young people “are digital natives who have had early access to mobile devices and handle technology very well,” and “it is not a matter of frightening them, but of making them aware of the risks of its misuse so that there is a safe digital environment,” he clarified.
Other challenges that the State Security Forces and Corps currently face in terms of cybersecurity are augmented reality and the metaverse. Live-streamed broadcasts of sexual assaults on minors also pose a risk, offering a new way to distribute child pornography and make this type of material available to anyone.
Producing sexualized images of a minor, even fictitious ones, is considered production of child pornography as long as the images are realistic, and that is something the Penal Code already covers. But what happens when the victims are adults? “That’s something else,” said the inspector. “There we have to explore what possibilities the Penal Code offers,” whether through the crime of discovery and disclosure of secrets or through offenses against moral integrity, because “there is no case law on it.”