Lawmakers Introduce Bill to Ban Sharing of Deepfake Porn – Urgent Action Needed

A bipartisan group of lawmakers introduced a bill Thursday that would make it illegal to share deepfake pornographic images and videos without consent.

The legislation, led by Sens. Maggie Hassan (D-N.H.) and John Cornyn (R-Texas), is the latest effort to combat the non-consensual distribution of sexually explicit deepfakes: images and videos fabricated with advanced technology, including artificial intelligence (AI).

These doctored images typically depict women, often celebrities or other public figures, and are posted online without their permission.

If passed, the bill would create a new criminal offense for sharing these images, along with a private right of action allowing victims to sue parties, including websites, that intentionally share them. Criminal penalties would include a fine and up to two years in prison, while civil damages could reach $150,000 in most cases, and more in some circumstances.

“The sharing of intimate images without consent can cause extraordinary emotional distress and harm and can put victims at risk of stalking and assault,” Hassan said in a statement Thursday.

“Especially as technology advances to a point where it is hard to tell which photos and videos are real and which have been entirely faked, we need stronger guardrails that protect people’s safety, privacy, and dignity and prevent non-consensual intimate images from proliferating across the internet.”

The bill was co-signed by Sens. Laphonza Butler (D-Calif.) and Angus King (I-Maine).

Earlier in the day, the White House made its latest push for the technology industry to voluntarily cooperate in curbing sexually explicit AI deepfakes.

“As generative AI broke on the scene, everyone was speculating about where the first real harms would come. And I think we have the answer,” said Arati Prabhakar, head of the White House Office of Science and Technology Policy, The Associated Press reported.

A White House document calls for action from AI developers, payment processors, search engines and app store operators, the AP reported.

“Artificial intelligence and the companies that wield its possibilities are going to transform the lives of people around the world — there’s no doubt about that,” President Biden said on X Tuesday. “But first, they must earn our trust.”

Concerns over deepfakes have grown over the past year as AI advances and becomes more pervasive.

In January, the spread of explicit AI-generated images of Taylor Swift intensified pressure on lawmakers and the White House to curb the spread of deepfake porn.

Neha Sharma
Neha Sharma is a tech-savvy author at The Reportify who delves into the ever-evolving world of technology. With her expertise in the latest gadgets, innovations, and tech trends, Neha keeps you informed about all things tech in the Technology category. She can be reached at neha@thereportify.com for any inquiries or further information.
