
Report: Illegal Trade in AI Child Sex Abuse Images Exposed

    Last updated Jun 28, 2023
    Image credit: Wikimedia Commons

    Facts

    • A BBC report published Wednesday has found that pedophiles are using artificial intelligence (AI) tools to create and sell photorealistic images of child sex abuse, with buyers accessing the material via paid subscription services such as the US-based platform Patreon.
    • The creators of the material reportedly use Stable Diffusion, an AI tool popular among artists that generates images from simple text prompts after being trained on pictures gathered from the web. A separate report found that the software Midjourney was also being used.
    • The content is also reportedly shared on the Japanese social media platform Pixiv. Because the site is hosted in Japan, where sharing sexualized cartoons and drawings of children is legal, creators openly promote the material using hashtags and groups.
    • Many creators use Pixiv to post links to their Patreon accounts, where they offer AI-generated child abuse images at prices that vary with the content requested. The BBC found one account charging $8.30 per month for what it called "exclusive uncensored art."
    • It is illegal in the UK to take, create, share, or possess both indecent images and pseudo-photographs of people under 18. Pixiv said it has already banned such AI-generated content, and Patreon emphasized its "zero-tolerance" policy against depicting minors in a sexual manner.
    • Investigative journalist Octavia Sheepshanks said some creators aim to produce "at least 1,000 images a month." Ian Critchley, the UK's lead police officer for child safeguarding, warned that law enforcement will likely get bogged down distinguishing real victims from AI-generated ones, and that such imagery could lead pedophiles down a path toward seeking out images of real victims.

    Spin

    Narrative A

    The rapidly growing industry of AI-generated child porn is a serious threat to society that must be tackled. Not only do these pedophiles generate AI images using footage of real victims, but the sheer volume of novel content makes it harder for police to distinguish real abuse from fake. Although companies understandably want to keep their code open source to promote artistic creativity, that openness may not be worth the cost of fueling the child porn industry.

    Narrative B

    AI depictions of minors do pose serious concerns on many fronts, but the technology can also help law enforcement catch pedophiles before any real children are hurt. For example, investigators once created a fake young girl to combat child webcam sex trafficking, exposing thousands of men around the world who had paid for her content. Hopefully, as is already underway in Australia, law enforcement agencies can develop their own AI tools to find and arrest more predators.

