1 In 10 Kids Say Their Peers Are Using AI To Create Nudes Of Their Classmates And Share Them Online


The world of artificial intelligence (AI) seems to be creating more cause for concern than help.

AI is being used for all of this: encroaching on human jobs, making it easy to plagiarize content, and, more disturbingly, creating deepfakes, morphed images, and fabricated audio and video so convincing that it is difficult to tell what is real and what is fake.

A recent survey has now found that schoolchildren are potentially using AI to create nude images of their classmates, sourcing photos from social media and other digital platforms.

What Did The Survey Find?

A survey conducted by Thorn, a non-profit organization focused on protecting children from sexual exploitation, questioned 1,040 minors aged 9 to 17 between November 3 and December 1, 2023.

The US-based organisation released its findings in a report titled Youth Perspectives on Online Safety, 2023, which asked children from diverse backgrounds about their experiences with child sexual abuse material (CSAM), “harmful online experiences” and more.

The report noted, “Minor participants were recruited via existing youth panels or directly through caregivers at the time of this survey,” and that “Caregiver consent was required for minors’ participation in youth panels, as well as for those minors recruited directly for the survey.”

On AI being used non-consensually, the report wrote, “In 2023, Thorn’s annual monitoring survey also asked about generative AI (GAI) being used by peers to create CSAM, including “deepfake” nude imagery of real children that can then be shared without their consent.

The results found that the majority of minors do not believe their peers are using this technology to create explicit images of other kids.

However, roughly 1 in 10 minors (11%) reported they knew of cases where their peers had done so, while another 10% reported they preferred not to answer.”

The report also specified that “1 in 8 minors, aged 9-12, reported having seen nonconsensually reshared SG-CSAM,” that is, self-generated child sexual abuse material.

The survey also found that one in seven minors admitted to sharing self-generated CSAM, which, while it could be consensual, is still seen as risky online behaviour that can carry quite serious consequences.

A report by 404 Media also highlighted that while these findings are certainly concerning, the alarmist framing used by such “anti-human trafficking” organizations is not ideal either.

Citing the finding that one in seven “minors have shared their own SG-CSAM,” it wrote that “while these images are technically illegal and against the policies of all major internet platforms, the scary sounding term also covers instances in which, for example, a 17-year-old consensually sends a nude to their partner.”

The report also said, “While the motivation behind these events is more likely driven by adolescents acting out than an intent to sexually abuse, the resulting harms to victims are real and should not be minimized in attempts to wave off responsibility.”

Julie Cordua, CEO of Thorn, also commented on this, saying, “The fact that 1 in 10 minors report their peers are using AI to generate nudes of other kids is alarming and highlights how quickly online risks are evolving.”

She also said, “This emerging form of abuse creates a potent threat to children’s safety and well-being. We must act swiftly to develop safeguards, educate young people about the dangers of deepfakes, and empower parents to have open conversations with their children about these risks. We can better protect kids in an ever-changing digital landscape by staying ahead of these technological shifts.”


Read More: Google Pay, Paytm, And PhonePe UPI Transactions Might Be Putting Women In Danger


This just brings to light the wrongful use of AI and the alarming spread of disturbing ‘nudify’ apps and the like that can, using AI, turn a person’s otherwise ordinary photograph into a nude photo.

In April 2023, TikTok user Rachel (@rache.lzh) posted a video talking about how an anonymous user on Instagram sent her pictures of herself that had been altered using AI to make her appear nude.

The images the TikToker had posted on her account showed her fully clothed; however, someone edited them into non-consensual pornographic images.

In a video from 27th April, she tearfully said, “It was pictures of me that I had posted fully clothed, completely clothed. And they had put them through some editing AI program to edit me naked,” and that “They basically photoshopped me naked.”

Explaining the ordeal further, she said, “And what’s even worse is that the next day when I woke up I was getting dozens of DMs of these images but without the watermarks. So this person paid to have the watermark removed and started distributing it like it was real,” adding that the person had also added tattoos to her body and altered her body using AI.

In December 2023, two boys aged 13 and 14 from Florida were arrested for allegedly creating deepfake nudes of their classmates using AI and were charged with third-degree felonies, as per a report by Wired.

This horrifying practice is not limited to the USA but has spread across the globe.

In July this year, a Spanish juvenile court sentenced 15 minors to a year of probation after they spread AI-generated nude images of their female classmates through WhatsApp groups.

As per a press release from the Juvenile Court of Badajoz, the minors will also have to attend classes on “responsible use of information and communication technologies,” as well as on gender and equality, to understand the consequences of their actions.

According to reports, the teens took photos from the girls’ social media profiles and then used an AI app to superimpose the minor girls’ faces over “other naked female bodies.”

A 2023 investigation by the United Kingdom-based Internet Watch Foundation (IWF) found that “20,254 AI-generated images were found to have been posted to one dark web CSAM forum in a one-month period,” and that today “most AI CSAM found is now realistic enough to be treated as ‘real’ CSAM.”

Even in South Korea, a massive deepfake scandal is currently ongoing, in which men are said to be using Telegram chatrooms to share sexually graphic content of women, created and circulated without their consent. As per reports, the victims include women of all kinds: young girls, students, military personnel, and even the perpetrators’ own female family members.

Perhaps the most alarming thing is the ease of access to such deepfake platforms or AI apps, which promise to “undress any photo” or create nudes of any person within seconds, as long as a photograph is available.


Image Credits: Google Images

Feature Image designed by Saudamini Seth

Sources: Firstpost, Thorn Org, The Hindu

Find the blogger: @chirali_08

This post is tagged under: AI, AI misuse, deepfakes, deepfake nudes, AI-generated content, Thorn, Thorn organisation, child protection, sexual exploitation, sexual exploitation ai, sexual exploitation artificial intelligence, AI technologies, deepfake scandal

Disclaimer: We do not hold any right, or copyright over any of the images used, these have been taken from Google. In case of credits or removal, the owner may kindly mail us.


Other Recommendations:

Men Are Creating AI Girlfriends To Verbally Abuse Them
