The specter of abundant disinformation concocted with artificial intelligence (AI) already loomed over the race for the White House, but recent fake calls imitating the voice of President Joe Biden are exacerbating experts’ fears.
“What a bunch of malarkey!” proclaims the message, broadcast by automated call (robocall), which impersonates the president’s digitally manipulated (deepfake) voice to urge New Hampshire residents not to vote in the Democratic primary.
The message pushed local authorities to quickly open an investigation in late January into a potential “illegal attempt to disrupt” the vote.
Researchers specializing in disinformation fear a surge in such abuses during this crucial election year, given the many tools now available to clone voices with AI.
Especially since, they say, these tools are cheap, easy to use and difficult to trace.
“This is definitely the tip of the iceberg,” Vijay Balasubramaniyan, CEO and co-founder of cybersecurity company Pindrop, told AFP.
“We can expect to see a lot more deepfakes during this election period,” he warns.
According to an analysis carried out by Pindrop, voice cloning software developed by startup ElevenLabs was used to create Joe Biden’s fake calls.
“Political chaos”
The case comes at a time when campaign teams on both sides are refining their AI-enhanced tools to spread their messages, and when investors are pouring millions of dollars into voice-cloning startups.
Contacted several times by AFP, ElevenLabs did not respond. Its website offers a free cloning option “to instantly create natural voices in any language with AI.”
Its recommendations for use specify that it is permitted to clone the voice of political figures in a “humorous or parodic” manner that is “clearly identifiable as such” by the listener.
Joe Biden’s fake calls have exposed the possibility that malicious people could use artificial intelligence tools to dissuade voters from going to the polls.
“The moment for political deepfakes has arrived,” notes Robert Weissman, president of the advocacy group Public Citizen.
“Lawmakers must act quickly to put protections in place, otherwise we are heading towards political chaos,” he told AFP. “The New Hampshire deepfake illustrates the many ways digital manipulation can sow confusion.”
The November presidential election is widely seen as the first U.S. election marked by AI, and experts worry about its impact on trust in the democratic process as voters struggle to discern fact from fiction.
“Election integrity”
Audio manipulations generate the most fear because they represent the “most vulnerable” point, estimates Tim Harper, an analyst at the Center for Democracy and Technology.
“It is easy to clone a voice using AI and difficult to spot,” he told AFP.
AI generation software is proliferating faster than the tools dedicated to detecting fakes.
AI arrives in an already tense and extremely polarized political landscape, allowing anyone, for example, to claim that genuine information is based on “fabricated” facts, adds Wasim Khaled, CEO of Blackbird.AI, an online disinformation detection platform.
The Chinese company ByteDance, owner of TikTok, recently unveiled StreamVoice, AI software that can transform a user’s voice into any other in real time.
“ElevenLabs was used this time, but it is more than likely that a different generative tool will be used in future attacks,” says Vijay Balasubramaniyan, calling for the creation of “safeguards.”
Like other experts, he recommends building audio watermarks or digital signatures into such software as protection, as well as regulation reserving these tools for verified users.
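To make the watermarking idea concrete, here is a minimal toy sketch of one classic technique: hiding an identifying payload in the least significant bits of audio samples. This is purely illustrative and is not the scheme used by any vendor named in this article; real-world audio watermarks use far more robust methods designed to survive compression and re-recording.

```python
# Toy least-significant-bit (LSB) audio watermark, for illustration only.
# A real deployment would need a watermark robust to compression, noise
# and re-recording; LSB embedding is the simplest possible sketch.

def embed(samples, payload_bits):
    """Hide one payload bit in the LSB of each 16-bit sample."""
    marked = list(samples)
    for i, bit in enumerate(payload_bits):
        marked[i] = (marked[i] & ~1) | bit  # clear LSB, then set it to the bit
    return marked

def extract(samples, n_bits):
    """Read the payload back from the first n_bits samples."""
    return [s & 1 for s in samples[:n_bits]]

# Synthetic "audio": a flat list of 16-bit sample values (no real file needed).
audio = [1000, -2000, 3000, -4000, 5000, -6000, 7000, -8000]
payload = [1, 0, 1, 1, 0, 1, 0, 0]  # hypothetical generator ID tag

marked = embed(audio, payload)
recovered = extract(marked, len(payload))
assert recovered == payload  # the tag survives and identifies the source
```

The change to each sample is at most one unit of amplitude, far below audibility, which is why watermarks can tag synthetic audio without degrading it; the hard part, as the experts quoted here note, is making such marks mandatory and tamper-resistant.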
Even under these conditions, such controls risk being “really difficult and very expensive,” says analyst Tim Harper, who calls for more “investment” in the face of this risk to “the integrity of elections.”