The attorneys general of all 50 U.S. states, plus four territories, signed a letter calling on Congress to take action against AI-enabled child sexual abuse material (CSAM).
"While internet crimes against children are already being actively prosecuted, we are concerned that AI is creating a new frontier for abuse that makes such prosecution more difficult," the letter says.
Indeed, AI makes it easier than ever for bad actors to create deepfake images, which realistically depict people in fabricated scenarios. Sometimes the results are benign, as when the internet was duped into believing the Pope was wearing a fashionable Balenciaga coat. But in the worst cases, as the attorneys general point out, this technology can be leveraged to facilitate abuse.
"Whether the children in the source photographs for deepfakes are physically abused or not, creation and circulation of sexualized images depicting actual children threatens the physical, psychological, and emotional wellbeing of the children who are victimized by it, as well as that of their parents," the letter reads.
The signatories are pushing for Congress to establish a committee to research the risks of AI-generated CSAM, then expand existing CSAM laws to explicitly cover AI-generated material.
Nonconsensual, sexually exploitative AI deepfakes already proliferate online, but few legal protections exist for the victims of this material. New York, California, Virginia and Georgia have laws that prohibit the dissemination of sexually exploitative AI deepfakes, and in 2019, Texas became the first state to ban the use of AI deepfakes to influence political elections. Although major social platforms prohibit this content, it can slip through the cracks. In March, an app purporting to "swap any face" into suggestive videos ran over 230 ads across Facebook, Instagram and Messenger; Meta removed the ads once notified by NBC News reporter Kat Tenbarge.
Overseas, European lawmakers are aiming to work with other countries to ratify an AI Code of Conduct, but negotiations are still in progress.