The rush is on to fight so-called “deepfakes”—videos and audio altered to make a person falsely appear to say something that they did not. California has passed a ban that makes it illegal to share such videos within two months before an election.
Analysts have said that deepfakes will be used as political weapons ahead of next year’s elections.
“It’s hard enough to be informed, to make wise decisions about candidates and political issues,” said social media expert and San Jose State Communications Professor Dr. Matt Cabot. “But with deepfake technology it makes it vastly more complicated.”
The new law targets people who distribute altered video and audio in order to injure a candidate's reputation or sway voters. A candidate can seek damages if they can prove the material was shared with malice within 60 days of an election.
“I think consumers need to be more careful about what they post or retweet,” Cabot said. “Be slow to believe. Is it plausible? Did it come from a trusted news source? Did it come from two trusted news sources?”
“In the face of total inaction at the federal level, California must step up to protect our more than 20 million voters,” said State Assemblyman Marc Berman, the bill’s sponsor. “AB 730 will help deter nefarious deepfakes by holding accountable the bad actors who intentionally attempt to injure a candidate’s reputation or deceive voters into believing a candidate said or did something they never said or did.”
The ban has its critics. This week, the ACLU sent a letter to Gov. Gavin Newsom urging him to veto the bill.
“Despite the author’s good intentions, this bill will not solve the problem of deceptive political videos,” it stated. “It will only result in voter confusion, malicious litigation and repression of free speech.”
With presidential primaries just months away, Cabot disagrees.
“The stakes are too high,” he said. “Democracy rests on citizens being informed, and that isn’t free speech. It’s fake speech.”