As if campaign distortions were not enough, the 2024 election season is likely to be the first in which artificial intelligence is used on a wide scale to spread false information.
Alaska House Bill 358 addresses the use of AI to create false identities and cause harm, said Rep. Mike Cronk, the bill’s key sponsor, in introducing it to the House Judiciary Committee this week. “Everything can be called into question in this new high-tech environment,” he said. “Your voice, your image will only be yours and safe from harm if safeguards are put into place.”
An unsophisticated consumer version of deepfakes can be seen in this 2022 video, based on a single photo:

The Washington Legislature recently passed a similar law regarding deepfakes in political ads. That law targets “synthetic media”: audio or video recordings of an individual’s speech, appearance, or conduct that have been intentionally manipulated with technology to produce what appears to be a real person, but that give a reasonable viewer a fundamentally different understanding of that person than the unaltered recording would have.
It’s tricky because it can come down to a matter of interpretation. The phrase “intent to cause harm” is key to the legislation, Cronk’s staff member Dave Stancliff explained. He likened the bill’s limits on free speech to a standard already set: You don’t have a free-speech right to falsely yell “fire” in a packed movie theater.
“You can’t change a likeness of a politician, a business person, any type of leadership figure … their voice, their image,” Stancliff said. “Harm” is a result that would have to be determined in a trial, he said.
Rep. Andrew Gray said that he hosts a podcast and edits it, rearranging the words. “I copy and paste people’s words into a different order to make things more clear,” he said. He is aware that he could take a speech by someone — President Joe Biden, for example — and change the order of the words so it would say the opposite of what the speaker intends, or change it simply by inserting the word “not” into a statement or phrase.
“It’s very easy to do and doesn’t require A.I. [artificial intelligence],” he said.
HB 358 can be studied at this link.
Listen to the legislative committee discussion at this link.
