On Wednesday, TikTok said it removed 347,225 videos in the U.S. over the second half of 2020 for spreading “election misinformation, disinformation, or manipulated media,” according to a new “transparency report” from the app.
TikTok didn’t offer a ton of insight into how it determined whether videos were spreading false election claims. The report defined misinformation as “content that is inaccurate or false,” and said TikTok worked with fact-checkers at PolitiFact, Lead Stories, and SciVerify to “assess the accuracy of content and limit distribution of unsubstantiated content.” Whether the bogus claims skewed more in favor of Donald Trump or Joe Biden is unclear; the report doesn’t mention either man by name.