
Fast rise in AI nudes of teens has unprepared schools, legal system scrambling for solutions

Josh Cain and Mona Darwish, The Orange County Register


What happened at Aliso Viejo Middle School has played out several times at other schools this year. In April, the principal at nearby Laguna Beach High School told parents in an email that several students were being investigated for allegedly using online AI tools to create nude photos of their classmates. In March, five students were expelled from a Beverly Hills middle school after girls there said they were targeted in the same way.

Whether or not school administrators across the country know it, the same type of AI-generated sexual harassment and bullying could already be occurring on their campuses, experts said.

“We’re way behind the curve,” said John Pizzuro, a former police officer who once led New Jersey’s task force on internet crimes against children. “There is no regulation, policy or procedure on this.”

Pizzuro is now the CEO of Raven, a nonprofit firm lobbying Congress to strengthen laws protecting children from internet-based exploitation. He said U.S. policymakers are still trying to catch up to a technology that only recently became widely available to the public.

“With AI, you can make a child appear older. You can make a child appear naked,” Pizzuro said. “You can use AI to create (child sexual abuse material) from a photo of just one child.”

Within just the last year, powerful AI apps and programs have exploded in popularity. Anyone with internet access can now use chatbots that simulate a conversation with a real person, or image generators that create realistic-looking photos from nothing more than a text prompt.


Amid the surge, an untold number of tools have also emerged that allow users to create "deepfakes" — videos that use AI to animate the faces of celebrities and politicians, placing them not only in satirical content but also in nonconsensual pornography.

Along these lines, some apps offer "face-swap" technology that lets users put an unknowing person's face on the body of a pornographic actor in photos or videos. Other apps offer to "undress" anyone in any photo, replacing their clothed body with an AI-generated nude one.

When they first emerged, deepfake programs produced crude results that were easy to spot, experts said. But telling the difference between a real video and a fake one will only grow more difficult as the technology improves.

“(These programs) are light-years ahead of where we could have imagined them a few years ago,” said Michael Karanicolas, executive director of the UCLA Institute for Technology, Law and Policy.


©2024 MediaNews Group, Inc. Visit ocregister.com. Distributed by Tribune Content Agency, LLC.
