TikTok Purged 347,000 Videos for US Election Misinformation

Another 441,028 clips were removed from TikTok’s recommendations for spreading “unsubstantiated” claims


On Wednesday, TikTok said it removed 347,225 videos in the U.S. during the second half of 2020 for containing “election misinformation, disinformation, or manipulated media,” according to a new transparency report from the company.

TikTok offered little insight into how it determined whether videos were spreading false election claims. The company’s report defined misinformation as “content that is inaccurate or false,” and said TikTok worked with fact checkers at PolitiFact, Lead Stories and SciVerify to “assess the accuracy of content and limit distribution of unsubstantiated content.” Whether the bogus claims more often favored Donald Trump or Joe Biden is unclear; the report mentions neither man by name.
