TikTok Reorganizes Content Moderation Unit, Reduces Workforce
TikTok has begun restructuring its trust and safety division, initiating layoffs across its global content moderation teams.
According to Reuters, citing sources familiar with the matter, operations head Adam Presser announced the restructuring in an internal memo to staff, with workforce reductions beginning in Asia, Europe, the Middle East, and Africa.
The Layoffs Continue
The reorganization follows TikTok’s October 2024 workforce reduction, which included substantial staff cuts in Malaysia as the platform increases its focus on artificial intelligence for content moderation.
The company, which reportedly employs 40,000 trust and safety professionals worldwide, previously committed $2 billion to those efforts, a figure CEO Shou Chew announced during a Congressional hearing in January 2024.
The changes come amid ongoing regulatory scrutiny in the United States, where TikTok faces potential restrictions under recent legislation. The platform, used by 170 million Americans, went briefly offline last month before a law took effect on January 19 requiring parent company ByteDance to sell the app or face a ban over national security concerns.
The restructuring is the latest development in TikTok’s approach to content moderation, following Congressional testimony in which tech industry leaders, including Chew and Meta’s Mark Zuckerberg, addressed platform safety concerns, particularly child protection measures.