TikTok and Facebook approved advertisements containing false claims about the American election just weeks before the vote, according to an investigation published Thursday by an NGO that questions the platforms' rules for detecting disinformation.
Global Witness submitted eight advertisements containing false claims, such as the possibility of online voting, to the TikTok, Facebook and YouTube platforms in order to test them just before the presidential election on November 5.
TikTok, which let four through despite its policy of banning political ads, had "the worst results," according to Global Witness. Facebook approved one of the eight.
With the race between Democrat Kamala Harris and Republican Donald Trump so close and the vote so near, "it is shocking that social networks continue to validate disinformation content," said Ava Lee, who leads campaigns against digital threats at Global Witness. "In 2024, everyone (…) knows how important it is to have high-quality moderation systems."
These platforms “have no excuse” for “continuing to endanger the democratic process,” she added.
A TikTok spokesperson told AFP that four of these advertisements “were approved in error at the first level of moderation” and that the application – which belongs to a company based in China – “will continue to enforce” its policy of banning political ads.
At Meta, which owns Facebook, a spokeswoman disputed the study's finding, saying it was based on a small sample of ads and therefore "does not reflect how (Meta) enforces its rules at scale."
YouTube, which belongs to Google, initially approved half of the submitted advertisements but then blocked their publication pending proof of identity. Global Witness judged this a "considerably more robust obstacle" against disinformation.
Google said on Thursday that it would "temporarily pause advertisements" linked to the US elections after the polls close on November 5, as it did in 2020 during the presidential election between Mr. Trump and Democrat Joe Biden.
The tech giant took this measure “as a precaution,” explaining that vote counting could continue for several days after the election.
Meta, for its part, announced that it would block all new political advertising during the last week of the electoral campaign.