Privacy and digital rights advocates are raising alarms over the newly signed Take It Down Act, which targets revenge porn and AI-generated deepfakes. The law makes it illegal to publish nonconsensual explicit images and gives platforms 48 hours to honor a victim's takedown request, but experts warn its enforcement mechanisms could backfire.
India McKinney, director of federal affairs at the Electronic Frontier Foundation, warns that the law's vague language and lax verification standards could lead to overreach and censorship of legitimate content. She points to likely avenues for abuse, such as bad-faith requests to take down consensual adult content or images depicting queer and trans people in relationships.
Senator Marsha Blackburn, a co-sponsor of the Take It Down Act, also sponsored the Kids Online Safety Act, and her publicly stated views on content related to transgender people, sentiments echoed by the Heritage Foundation's rhetoric about protecting children, fuel concerns about how "harmful" content might be defined under such takedown regimes.
Platforms like Snapchat and Meta have expressed support for the law, but neither has explained how it will verify that the person requesting a takedown is actually a victim. Decentralized platforms such as Mastodon, which rely on independently operated servers, may struggle to comply with the 48-hour takedown window at all.
Proactive monitoring of content using AI is becoming increasingly common, with companies like Hive building tools to detect deepfakes and child sexual abuse material. Reddit, for example, partners with the nonprofit SWGfL to identify and remove nonconsensual intimate imagery.
However, McKinney warns that such monitoring could extend into encrypted messages: because the law requires platforms to remove copies and prevent the re-upload of flagged images, companies may feel pressure to proactively scan all content, even in end-to-end encrypted services like WhatsApp or Signal.
The law's broader implications for free speech are also being scrutinized, particularly in light of President Trump's past moves to suppress unfavorable speech. Critics worry the takedown mechanism could be turned against dissenting voices or used to restrict access to lawful information.
While the Take It Down Act aims to protect victims of revenge porn and deepfakes, it raises difficult questions about privacy, censorship, and free speech that will demand careful oversight as enforcement begins.