“The first major U.S. law tackling AI-induced harm” was born out of the suffering of teenage girls, said Andrew R. Chow in Time. “In October 2023, 14-year-old Elliston Berry of Texas and 15-year-old Francesca Mani of New Jersey each learned that classmates had used AI software to fabricate nude images of them and female classmates.” The software could create virtually any image with the click of a button, and it was being weaponized against women and girls. When Berry and Mani sought to have the so-called deepfake images removed, “both social media platforms and their school boards reacted with silence or indifference.” Now a bill is headed to President Trump’s desk that will criminalize the nonconsensual sharing of sexually explicit images—both real and computer-generated, depicting minors or adults—and require that platforms remove such material within 48 hours of receiving notice. The Take It Down Act, backed by first lady Melania Trump, passed the Senate unanimously and the House 409-2.
“No one expects to be the victim of an obscene deepfake image,” said Annie Chestnut Tutor in The Hill. But the odds of it happening are only increasing. Pornographic deepfake videos are “plastered all over the internet,” often to “extort teenagers.” With AI, “anyone with access to the internet can turn an innocent photo” into life-shattering pornography, said Kayla Bartsch in National Review. This poses a grave danger to young women. While AI deepfakes have gotten a lot of attention for the role they can play in elections, the risk this technology poses to kids has been “largely neglected.” But 15 percent of kids “say they know of a nonconsensual intimate image depicting a peer in their school.”