Grim Report Discovers First AI-Generated Child Sex Abuse Videos

Header photo: a close-up of a person typing on a laptop.

A report by the Internet Watch Foundation (IWF) has found that generative AI models are being used to create deepfakes of real child sex abuse victims.

The disturbing investigation by the UK-based organization found that, with the rise of AI video tools, synthetic child sexual abuse videos are beginning to proliferate.

The IWF, which describes itself as the “front line against online child sexual abuse”, says it has identified AI models tailor-made to produce imagery of more than 100 individual child sex abuse victims.

It gave the example of one real-life abuse victim whose abuser uploaded images of her when she was between three and eight years old.

The non-profit organization reports that Olivia, not her real name, was rescued by police in 2013, but years later dark web users are using AI tools to generate new images of her in abusive situations.

Criminals are collecting images of victims such as Olivia, who is now in her 20s, and using them to fine-tune AI models that create new abuse material. Some of these models are freely available to download online, according to the report.

AI Video Technology Being Abused

AI video technology has made great strides this year, and unfortunately that progress is reflected in the report.

In a snapshot study conducted between March and April this year, the IWF identified nine deepfake videos on a single dark web forum dedicated to child sexual abuse material (CSAM); none had been found when IWF analysts investigated the same forum in October.

Some of the deepfake videos feature adult pornography altered to show a child’s face, while others are existing videos of child sexual abuse onto which another child’s face has been superimposed.

Because the original videos of sexual abuse are of real children, IWF analysts say the deepfakes are especially convincing.

Free, open-source AI software appears to be behind many of the deepfake videos seen by the IWF. The methods shared by offenders on the dark web are similar to those used to generate deepfake adult pornography.

The IWF fears that as AI video technology improves, AI CSAM will become photorealistic. This comes as the IWF has already seen a steady increase in the number of reports of illegal AI images.


Image credits: Header photo licensed via Depositphotos.
