Pedophiles on the dark web are using AI software to create sexual abuse content.

Experts are urging lawmakers to close loopholes in current law.
An internet watchdog group is sounding the alarm over a growing trend of sex offenders collaborating online to use open-source artificial intelligence to create child sexual abuse material.

"There's a technical community among criminals, especially on dark web forums, where they discuss this technology," Dan Sexton, chief technology officer at the Internet Watch Foundation (IWF), told The Guardian last week. "They share images and AI models. They give each other tips and guides."

Sexton’s group has found that criminals are increasingly using open-source AI models to create illegal child sexual abuse material (CSAM) and share it online. Unlike closed AI models such as OpenAI’s DALL-E or Google’s Imagen, open-source AI can be downloaded and modified by users, the report notes. Sexton said this kind of technology is now within criminals’ reach, and that they use the dark web to create and share photorealistic images.

“From what we’ve seen, we believe this material is being produced with open-source software that people have obtained, run on their own computers, and then modified. And that’s a much harder problem to fix,” Sexton said. “The software has been taught what child sexual abuse material is and how to create it.”

Sexton said dark web discussions include images of child celebrities and other publicly available images of children. Imagery of existing abuse victims is sometimes used to create new material.

“All of these things are concerns, and we’ve heard people discussing them,” Sexton said.

Christopher Alexander, chief analytics officer at Pioneer Development Group, told Fox News Digital that one emerging risk of the technology is that it could be used to draw more people into CSAM. On the other hand, AI could also be used to help scan the web for missing people, including through “age progressions and other factors that could help find trafficked children.”

In other words, generative AI is a problem, but AI and machine learning can also be used to fight it, even if only to detect such material.

Meanwhile, Jonathan D. Askonas, an assistant professor of politics and a fellow at the Center for the Study of Statesmanship at the Catholic University of America, told Fox News Digital that “lawmakers need to act now to strengthen laws against the production, distribution, and possession of AI-based CSAM, and to close loopholes from the previous era.”

Sexton said that in the age of AI, the IWF, which scours the web for CSAM and helps coordinate its removal, could be inundated with reports of this kind of content, since such material is already widespread online.

“We believe that online sexual abuse of children is already a public health epidemic,” Sexton told The Guardian. “So this isn’t going to solve the problem. It will probably only make things worse.”

Ziven Havens, the policy head at the Bull Moose Project, told Fox News Digital that Congress must act to protect both children and the internet.

“Because AI-generated CSAM uses images of real abuse victims that already exist, it is not meaningfully different from CSAM not made by AI. It is disgusting and wrong on a spiritual level,” Havens said. “The enormous risks this technology creates will have a massive effect on the health of the internet. When these companies fail to do their jobs, Congress must act quickly to protect children and the internet.”
