TikTok's third transparency report under the Digital Services Act (DSA) covers content moderation efforts in the European Union from January to June 2024. The report highlights TikTok's commitment to keeping its platform safe for its 150 million EU users by proactively removing violative content and accounts.
Key Insights
Content and Account Removal: Over 22 million pieces of content, including videos, livestreams, and ads, were removed for violating Community Guidelines and ad policies. Additionally, more than 5 million accounts were banned for rule violations.
Action Against Illegal Content: TikTok received approximately 144,000 reports of illegal content, corresponding to around 100,000 unique pieces of content. About 29% of that content was found to violate TikTok's policies or local laws and was actioned accordingly.
Investment in Moderation: Automated moderation technology now removes 80% of violative videos, up from 62% a year earlier. More than 6,000 moderators work across EU languages, including specialized misinformation moderators equipped with enhanced tools.
Ongoing Commitment
TikTok emphasizes that transparency is an ongoing commitment and that it will continue to inform users about its efforts under the DSA and beyond.