States move to label deepfake political ads

Gopal Ratnam, CQ-Roll Call

WASHINGTON — After 20,000 or more New Hampshire voters received a call with the artificial-intelligence-doctored voice of President Joe Biden asking them to skip the state’s primary in January, state officials were in a quandary.

Attorney General John M. Formella, working alongside others, launched an investigation into the robocall that urged recipients to “save your vote for the November election,” ultimately identifying a Texas-based organization as the culprit. But New Hampshire lawmakers who say simply identifying the origins of these deepfakes isn’t enough are backing legislation that would prohibit them within 90 days of an election unless they’re accompanied by a disclosure stating that AI was used.

New Hampshire is now one of at least 39 states considering measures that would add transparency to AI-generated deepfake ads or calls as political campaigns intensify ahead of the November presidential election. The state’s measure has passed the House but not the Senate.

Like New Hampshire’s bill, other states’ efforts are largely focused on identifying content produced using AI rather than controlling that content or prohibiting its distribution, according to Megan Bellamy, vice president of law and policy at the Voting Rights Lab, a nonpartisan group that tracks election-related laws in the states.

“I think what we’re seeing is an effort [by states] to address a known, growing and evolving field of AI-generated content without overshooting it and crossing the line that would trigger First Amendment arguments or any other legal pushbacks,” Bellamy said in an interview.

In Wisconsin, Gov. Tony Evers, a Democrat, signed into law a measure that requires political ads and messages produced using synthetic audio and video or made using AI tools to carry a disclaimer. Failure to comply results in a $1,000 fine for each violation.

Fair-election groups like Voting Rights Lab say that doesn’t go far enough. The Wisconsin disclaimer requirement applies only to campaign-affiliated entities while leaving out other individuals and groups, Bellamy said.

She added that a $1,000 fine could lead a political action committee or a campaign to decide that an AI-generated deepfake is worth the cost if it goes viral and gets its message in front of voters.

The Florida legislature, meanwhile, passed legislation with a bit more teeth: failure to disclose the use of AI-enabled messages would result in a criminal misdemeanor punishable by up to a year in prison. The measure is awaiting the governor’s signature.

And Arizona is considering similar measures requiring disclaimers in the 90-day period before an election, with repeated violations potentially resulting in a felony charge.

©2024 CQ-Roll Call, Inc. Visit at rollcall.com. Distributed by Tribune Content Agency, LLC.
