
Trump signs bill criminalizing nonconsensual AI deepfake porn


US President Donald Trump has signed a bill criminalizing nonconsensual artificial intelligence-generated deepfake porn, which also requires websites to take down any illicit images within 48 hours.

Trump signed the bill, known as the TAKE IT DOWN Act — an acronym for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks — into law on May 19.

The bill, backed by first lady Melania Trump, makes it a federal crime to publish, or threaten to publish, nonconsensual intimate images, including deepfakes, of adults or minors with the intent to harm or harass them. Penalties range from fines to prison.


Websites, online services, or apps must remove illegal content within 48 hours and establish a takedown process.

Trump said in remarks given at the White House Rose Garden and posted to the social media platform Truth Social that the bill also covers “forgeries generated by an artificial intelligence,” commonly referred to as deepfakes.

Melania Trump had directly lobbied lawmakers to support the bill, and said in a statement that the law is a “national victory.”

“Artificial Intelligence and social media are the digital candy of the next generation — sweet, addictive, and engineered to have an impact on the cognitive development of our children,” she said.

“But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly,” she added.

Senators Ted Cruz and Amy Klobuchar introduced the bill in June 2024, and it passed both chambers of Congress in April of this year.

US the latest to ban explicit deepfakes

There has been a growing number of cases in which deepfakes have been used for harmful purposes. In one of the most high-profile instances, deepfake-generated illicit images of pop star Taylor Swift spread rapidly across X in January 2024.

X temporarily banned searches using Taylor Swift’s name in response, while lawmakers pushed for legislation criminalizing the production of deepfake images.


Other countries, such as the UK, have already made sharing deepfake pornography illegal; the UK did so under its Online Safety Act in 2023.

A 2023 report from security startup Security Hero found that the majority of deepfakes posted online are pornographic and that 99% of the individuals targeted by such content are women.

