Two Florida school students, aged 13 and 14, have been arrested and charged with third-degree felonies for allegedly creating deepfake nudes of their classmates using an artificial intelligence application.
The incident, which occurred at Pinecrest Cove Academy in Miami, Florida, is believed to be the first case in the United States in which criminal charges have been filed over AI-generated nude images. The victims were between the ages of 12 and 13.
The students were suspended on December 6, and the case was reported to the Miami-Dade Police Department. Following an investigation, the two boys were arrested on December 22 under a 2022 Florida law that criminalises the dissemination of deepfake sexually explicit images without the victim’s consent.
In states without laws protecting victims, those harmed are turning to the courts. A teenager in New Jersey is suing a classmate for sharing fake AI-generated nudes of her.
Meanwhile, police in Beverly Hills are investigating a case in which students allegedly shared images that superimposed real students’ faces onto AI-generated nude bodies. But because California’s law against possessing sexual images of minors does not explicitly address AI-generated images, it is unclear whether a crime was committed. The local school district in Beverly Hills voted last week to expel five students involved in the matter.
US President Joe Biden issued an executive order on AI last fall directing agencies to report on banning the use of generative AI to produce child sexual abuse material. Congress has yet to pass a law on deepfake porn. Still, the recent introduction of the DEFIANCE Act of 2024 in both the Senate and House suggests that change may be on the horizon, with the effort garnering bipartisan support.