TikTok overhauls UK moderation processes in alignment with its global AI-centric strategy
TikTok, the popular social media platform owned by Chinese parent company ByteDance, has announced plans to restructure its content moderation operations globally, including in the United Kingdom.
The restructuring does not signal any change in TikTok's ownership or control, nor does it involve the sale or transfer of TikTok's UK operations to another entity. Instead, it reflects a broader trend among tech companies of reducing dependence on human labor in favor of automated solutions.
Under the restructuring, TikTok aims to improve the efficiency and effectiveness of its content moderation. The timing is significant: as of July 25, digital platforms operating in the UK must implement strict safeguards to protect minors from inappropriate content under the Online Safety Act, which mandates the removal of material promoting eating disorders, suicidal ideation, and self-harm.
The push to expand TikTok's reliance on artificial intelligence is a global one: the company says 85 percent of content removed for rule violations is now taken down automatically by its systems. Some moderation tasks will remain in the UK, while others will be handled by automated systems.
The restructuring also extends to several Asian countries, including Malaysia, although TikTok has not said who will lead the reorganized moderation teams in those regions. Affected employees will be given priority consideration for internal opportunities.
Content moderators on TikTok are tasked with removing prohibited material, including hate speech, misinformation, and explicit content. The Online Safety Act sets a high bar for content moderation in the UK, and TikTok says its restructuring is designed to meet and exceed those requirements.
While the details of the restructuring are still being finalized, TikTok remains committed to providing a safe and enjoyable platform for its users. The company continues to work closely with regulators and industry partners to ensure that its content moderation practices are effective, efficient, and in line with the highest standards of user safety.