On May 19, President Donald Trump and First Lady Melania Trump beamed at press and allies as they signed the administration's first major piece of tech regulation, the bipartisan Take It Down Act.
It was seen as a win for those who have long been calling for the criminalization of NDII, or the nonconsensual distribution of intimate images, and for a federal pathway of redress for victims. Cliff Steinhauer, director of information security and engagement at the National Cybersecurity Alliance, explained it may be a needed kick in the pants for a lethargic legislative arena.
"I think it's good that they're going to force social media companies to have a process in place to remove content that people ask to be removed," he said. "This is kind of a start; to build the infrastructure to be able to respond to this type of request, and it's a really thin slice of what the issues with AI are going to be."
But other digital rights groups say the legislation may stir false hope for swift legal resolutions among victims, with unclear vetting procedures and an overly broad list of applicable content. The law's implementation is just as murky.
"The Take It Down Act’s removal provision has been presented as a virtual guarantee to victims that nonconsensual intimate visual depictions of them will be removed from websites and online services within 48 hours," said the Cyber Civil Rights Initiative (CCRI) in a statement. "But given the lack of any safeguards against false reports, the arbitrarily selective definition of covered platforms, and the broad enforcement discretion given to the FTC with no avenue for individual redress and vindication, this is an unrealistic promise."
These same digital rights activists, who had issued warnings throughout the bill's congressional journey, will also be keeping a close eye on how the act may affect constitutionally protected speech, fearing that publishers may remove legal speech to preempt criminal repercussions (or flatly suppress free speech, such as consensual LGBTQ pornography). Some worry that the bill's takedown system, modeled after the Digital Millennium Copyright Act (DMCA), may over-inflate the power of the Federal Trade Commission, which now has broad, largely unchecked authority to hold online content publishers accountable under the law.
"Now that the Take It Down Act has passed, imperfect as it is, the Federal Trade Commission and platforms need to both meet the bill’s best intentions for victims while also respecting the privacy and free expression rights of all users," said Becca Branum, deputy director of the Center for Democracy & Technology (CDT)'s Free Expression Project. "The constitutional flaws in the Take It Down Act do not alleviate the FTC's obligations under the First Amendment."
Organizations like the CCRI and the CDT spent months lobbying legislators to adjust the act's enforcement provisions. The CCRI, which penned the framework the bill is based on, has taken issue with the legislation's exceptions for images posted by someone who appears in them, for example. It also fears the removal process may be ripe for abuse, including false reports made by disgruntled individuals or politically motivated groups under an overly broad scope for takedowns.
The CDT, meanwhile, interprets the law's AI-specific provisions as too narrow. "Take It Down’s criminal prohibition and the takedown system focus only on AI generated images that would cause a 'reasonable person [to] believe the individual is actually depicted in the intimate visual depiction.' In doing so, the Take It Down Act is unduly narrow, missing several instances where perpetrators could harm victims," the organization argues. For example, a defendant could plausibly skirt the law by publishing synthetic likenesses placed in implausible or fantastical environments.
Just as confusing is that while the FTC's takedown authority over applicable publishers is vast, other sites are exempt from its oversight, such as those that don't host user-generated synthetic content but rather their own, curated content. Instead of being forced to take down media under the 48-hour stipulation, these sites can only be pursued in a criminal case. "Law enforcement, however, has historically neglected crimes disproportionately perpetrated against women and may not have the capacity to prosecute all such operators," the CDT warns.
Steinhauer theorizes that the bill may face a general infrastructure problem in its early enforcement. For example, publishers may find it difficult to corroborate that the individuals filing claims are actually depicted in the NDII within the 48-hour window, unless they beef up their own oversight investments — most social media platforms have scaled back their moderation processes in recent years. Automated moderation tools could help, but they're known to have their own set of issues.
There's also the question of how publishers will spot and prove that images and videos are synthetically generated, specifically, a problem that's plagued the industry as generative AI has grown. "The Take It Down Act effectively increases the liability for content publishers, and now the onus is on them to be able to prove that the content they’re publishing is not a deepfake," said Manny Ahmed, founder and CEO of content provenance company OpenOrigins. "One of the issues with synthetic media and having provable deniability is that detection doesn’t work anymore. Running a deepfake detector post hoc doesn’t give you a lot of confidence because these detectors can be faked or fooled pretty easily and existing media pipelines don't have any audit trail functionality built into them.”
It's easy to follow the logic of such a strong takedown tool being used as a weapon of censorship and surveillance, especially under an administration that is already doing plenty to sow distrust among its citizens and wage war on ideological grounds.
Steinhauer still urges an open mind. "This is going to open a door to those other conversations and hopefully reasonable regulation that is a compromise for everyone," he said. "There's no world we should live in where somebody can fake a sexual video of someone and not be held accountable. We have to find a balance between protecting people, and protecting people's rights."
The future of broader AI regulation remains in question, however. Though Trump championed and signed the Take It Down Act, he and congressional Republicans also pushed to include a 10-year ban on state- and local-level AI regulation in their touted One Big Beautiful Bill.
And even with the president's signature, the future of the law is uncertain, with rights organizations predicting that the legislation may be contested in court on free speech grounds. "There's plenty of non-pornographic or sexual material that could be created with your likeness, and right now there's no law against it," added Steinhauer. Regardless of whether Take It Down remains or gets the boot, the issue of AI regulation is far from settled.