Trump signs bill criminalizing nonconsensual AI deepfake porn

US President Donald Trump has signed a bill criminalizing nonconsensual artificial intelligence-generated deepfake porn, which also requires websites to take down any illicit images within 48 hours.

On May 19, Trump signed the bill, known as the TAKE IT DOWN Act, into law. The name is an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks.

The bill, backed by first lady Melania Trump, makes it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including deepfakes, of adults or minors with the intent to harm or harass them. Penalties range from fines to prison.


Websites, online services, or apps must remove illegal content within 48 hours and establish a takedown process.

Trump said in remarks given at the White House Rose Garden and posted to the social media platform Truth Social that the bill also covers “forgeries generated by an artificial intelligence,” commonly referred to as deepfakes.

Melania Trump had directly lobbied lawmakers to support the bill, and said in a statement that the law is a “national victory.”

“Artificial Intelligence and social media are the digital candy of the next generation — sweet, addictive, and engineered to have an impact on the cognitive development of our children,” she said.

“But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly,” she added.

Senators Ted Cruz and Amy Klobuchar introduced the bill in June 2024, and it passed both houses of Congress in April of this year.

US the latest to ban explicit deepfakes

There has been a growing number of cases in which deepfakes are used for harmful purposes. In one of the more high-profile instances, deepfake-generated illicit images of pop star Taylor Swift spread rapidly across X in January 2024.

X temporarily banned searches using Taylor Swift’s name in response, while lawmakers pushed for legislation criminalizing the production of deepfake images.

Related: AI scammers are now impersonating US government bigwigs, says FBI

Other countries, such as the UK, have already made sharing deepfake pornography illegal; the UK did so as part of its Online Safety Act in 2023.

A 2023 report from security startup Security Hero revealed that the majority of deepfakes posted online are pornographic, and 99% of individuals targeted by such content are women.

Magazine: Deepfake AI ‘gang’ drains $11M OKX account, Zipmex zapped by SEC: Asia Express

