US lawmakers unite against AI abuse, support NO FAKES Act

A legal professional warns that the bill could lead to private censorship, posing risks to free speech and creators’ rights.

As artificial intelligence advances, its capabilities are increasingly being misused, with some individuals exploiting the technology to defraud cryptocurrency users. In response to this rising threat, US lawmakers have introduced a new bill to protect citizens from AI-generated deepfakes.

On Sept. 12, Representatives Madeleine Dean and María Elvira Salazar introduced the Nurture Originals, Foster Art, and Keep Entertainment Safe (NO FAKES) Act. The legislation aims to protect Americans from AI abuse and address the spread of unauthorized AI-generated deepfakes. 

Source: Joe Morelle

In a press release, the US lawmakers said that the NO FAKES Act will empower individuals to take action against the malicious actors who create, post or profit from unauthorized digital copies of them. In addition, it will protect media platforms from liability if they take down offending materials. 

The announcement also claimed that the new law would protect innovation and free speech. However, not everyone is convinced that this is what will happen. 

Recipe for private censorship

Corynne McSherry, the legal director at the Electronic Frontier Foundation, a nonprofit organization focusing on digital rights, believes that the NO FAKES Act could be a “recipe for private censorship.” 

In August, McSherry wrote that the bill may be good for lawyers but would be a “nightmare” for everyone else. The legal professional said that the NO FAKES Act offers fewer safeguards for lawful speech than the Digital Millennium Copyright Act (DMCA), which protects copyrighted material. 

The lawyer noted that the DMCA provides a simple counter-notice process that people can use to get their work restored, whereas the NO FAKES Act would require someone to go to court within 14 days to defend their rights. McSherry explained: 

“The powerful have lawyers on retainer who can do that, but most creators, activists, and citizen journalists do not.”

The legal professional added that while AI fakes cause real harm, these flaws “doom the bill.” 

Related: McAfee introduces AI deepfake detection software for PCs

AI deepfake crypto scammers ramp up operations

In the second quarter of 2024, software company Gen Digital reported that scammers using AI deepfakes had stolen at least $5 million in crypto. The firm urged users to stay vigilant as AI makes scams more sophisticated and convincing. 

Meanwhile, Web3 security company CertiK believes that AI-powered attacks will extend beyond video and audio and may target wallets with facial recognition.

A CertiK spokesperson told Cointelegraph that wallets using this feature must evaluate their readiness for AI attack vectors. 
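CertiK did not describe specific mitigations, but the underlying concern is that a face match is a single, spoofable signal. As a purely illustrative sketch (every function and field name below is a hypothetical placeholder, not any real wallet SDK), a wallet offering facial recognition might treat the biometric check as only one factor and still require an interactive liveness challenge plus a device-bound key signature before approving a transaction:

```python
# Hypothetical sketch: defense-in-depth for a wallet that offers facial recognition.
# None of these helpers correspond to a real SDK; they stand in for whatever
# biometric, liveness, and signing primitives a given wallet actually uses.

from dataclasses import dataclass


@dataclass
class UnlockEvidence:
    face_match_score: float    # similarity score from the face-recognition model (0..1)
    liveness_passed: bool      # e.g. blink/head-turn challenge or depth-sensor check
    device_signature_ok: bool  # transaction signed by a key held only on this device


def approve_transaction(evidence: UnlockEvidence,
                        face_threshold: float = 0.95) -> bool:
    """Require all three factors; a deepfaked face alone is never sufficient."""
    if evidence.face_match_score < face_threshold:
        return False
    if not evidence.liveness_passed:
        # A replayed video or AI-generated face typically fails an
        # interactive liveness challenge.
        return False
    if not evidence.device_signature_ok:
        # Even a perfect visual spoof cannot produce the device-bound signature.
        return False
    return True


if __name__ == "__main__":
    spoof = UnlockEvidence(face_match_score=0.99,
                           liveness_passed=False,
                           device_signature_ok=False)
    print(approve_transaction(spoof))  # False: a convincing face match alone is rejected
```

The design choice the sketch illustrates is simply that biometric recognition should gate access together with, not instead of, cryptographic and liveness checks, so an AI-generated likeness on its own cannot move funds.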

Magazine: AI Eye: $1M bet ChatGPT won’t lead to AGI, Apple’s intelligent AI use, AI millionaires surge

This article first appeared at Cointelegraph.com News
