
Malaysian government in talks with TikTok to restore 18 media outlets’ accounts blocked by AI: Minister



Overview of the Situation

The Malaysian government has initiated discussions with TikTok to address the sudden blocking of 18 accounts belonging to local media outlets. According to Communications Minister Fahmi Fadzil, the accounts were suspended by TikTok’s artificial intelligence (AI) moderation system. Preliminary reports indicate that the suspensions followed the outlets’ coverage of an alleged sexual assault involving a young girl at a mosque in Selangor. As of Monday, the accounts remained inaccessible, including those of prominent outlets such as Bernama, Malaysia’s national news agency, and Buletin TV3, a leading television news programme. The government has urged TikTok to restore the accounts and to provide a detailed explanation for the suspensions to both the authorities and the affected media organisations.

The Incident and Its Viral Impact

The case in question involves a disturbing incident captured on closed-circuit television (CCTV) footage at a mosque in Batang Kali, Selangor. The video, which went viral over the weekend, shows a man sneaking into the women’s prayer area and attempting to carry a young girl away unnoticed. The 19-year-old suspect is currently under a seven-day remand to assist the Selangor police with their investigation. The case is being investigated under Section 14 of the Sexual Offences Against Children Act 2017, which covers offences against minors.

The Media’s Role in Reporting Sensitive Cases

Communications Minister Fahmi Fadzil emphasized the importance of media organizations reporting on such incidents, highlighting that sexual assault cases are inherently newsworthy and should not be censored. He explained that while AI moderation tools are designed to filter out inappropriate or harmful content, they often fail to distinguish between reporting by professional media outlets and content created by individual users. “The problem is TikTok’s artificial intelligence itself… AI can sometimes go too far and not understand that media organization reporting is different from the content produced by ordinary people,” Fahmi said during the launch of the “AI in the Newsroom” program organized by Bernama in Kuala Lumpur.

The Broader Implications for Press Freedom

The blocking of media accounts has raised concerns about press freedom and the role of AI in content moderation. While AI systems are increasingly used to enforce platform policies and curb harmful content, they often lack the nuance to recognise the context of news reporting. In this case, TikTok’s AI flagged the accounts of reputable news organisations for violating community guidelines, likely because of the sensitive nature of the story they were covering. The episode highlights the limits of relying solely on automated systems to regulate online content and the need for human oversight to ensure that legitimate journalism is not erroneously censored.

Government and Media Collaboration on AI in Journalism

The incident has also brought attention to the growing integration of AI in newsrooms and the potential risks associated with over-reliance on automated systems. During the launch of Bernama’s “AI in the Newsroom” initiative, Fahmi Fadzil called for greater collaboration between the government, media organizations, and tech platforms to address these challenges. The program aims to explore how AI can enhance journalism while minimizing its risks, such as the unintended censorship of legitimate news content. By fostering dialogue and developing clearer guidelines, stakeholders hope to strike a balance between promoting free expression and maintaining a safe online environment.

The Path Forward for TikTok and Media Organizations

As the Malaysian government continues its discussions with TikTok, the focus remains on restoring the blocked accounts and ensuring that similar incidents do not recur. The incident serves as a reminder of the complexities of content moderation in the digital age and the need for transparency and accountability from social media platforms. Media organizations, policymakers, and tech companies must work together to develop solutions that protect both the public and press freedom. In the meantime, the viral spread of the assault video and the subsequent suspension of media accounts underscore the delicate balance between combating harmful content and preserving the integrity of news reporting.
