The advocacy group Global Witness submitted eight ads containing false election claims to the Chinese-owned video-sharing app TikTok, the Meta-owned Facebook, and Google-owned YouTube to test their ad systems in the final stretch of the November 5 election.
The ads carried outright election falsehoods, such as the claim that people can vote online, as well as content promoting voter suppression, inciting violence against a candidate, and threatening electoral workers and processes.
TikTok "performed the worst," Global Witness said, approving four of them despite its policy that prohibits all political ads.
Facebook approved one of the ads submitted.
"Days away from a tightly fought U.S. presidential race, it is shocking that social media companies are still approving thoroughly debunked and blatant disinformation on their platforms," said Ava Lee, the digital threats campaign leader at Global Witness.
The study comes as researchers warn of the growing perils of disinformation, both from domestic actors and from foreign influence operations, during a tight election race between the Democratic contender, Vice President Kamala Harris, and Republican nominee Donald Trump.
"In 2024, everyone knows the danger of electoral disinformation and how important it is to have quality content moderation in place," Lee said.
"There's no excuse for these platforms to still be putting democratic processes at risk."
Growing scrutiny
A TikTok spokeswoman said four of those ads were "incorrectly approved during the first stage of moderation."
"We do not allow political advertising and will continue to enforce this policy on an ongoing basis," she told AFP.
A Meta spokeswoman pushed back against the findings, saying they were based on a small sample of ads and therefore "not reflective of how we enforce our policies at scale."
"Protecting the 2024 elections online is one of our top priorities," she added.
Global Witness said the ad approved by Facebook falsely claimed that only people with a valid driver's license can vote.
Several U.S. states require voters to provide a photo ID but do not specify that it must be a driver's license.
Global Witness said YouTube initially approved half of the ads submitted but blocked their publication until formal identification, such as a passport or driver's license, was provided.
The watchdog called that a "significantly more robust barrier for disinformation-spreaders" compared to the other platforms.
Platforms are facing growing scrutiny following the chaotic spread of disinformation in the aftermath of the 2020 election, with Trump and his supporters challenging the outcome after his defeat to Joe Biden.
Google said Thursday that it would "temporarily pause ads" related to the elections after the last polls close on November 5.
The tech giant said the measure, also introduced during the 2020 election, was expected to last a few weeks and was being implemented "out of an abundance of caution and to limit the potential for confusion," given the likelihood that vote counting will continue after Election Day.
Separately, Meta has said it will block new political ads during the final week of the election campaign.