U.K. News
‘Loophole’ in law on messaging apps leaves children vulnerable to sexual abuse, says NSPCC

The Alarming Rise of Child Sex Abuse Image Crimes: A Call to Action
The NSPCC has sounded the alarm over a disturbing surge in child sex abuse image crimes, with nearly 39,000 such offenses recorded in England and Wales in the 2023/24 financial year. This staggering figure translates to more than 100 crimes reported every single day, highlighting the vulnerabilities children face online. Snapchat emerged as the most frequently cited platform in these cases, with 50% of the crimes linked to the app. Other platforms, including Instagram, Facebook, and WhatsApp, were also implicated, albeit to a lesser extent. The NSPCC warns that the secrecy offered by one-to-one messaging services is being exploited by predators to harm children and evade detection.
The Troubling Role of Messaging Apps and End-to-End Encryption
The NSPCC points to a concerning trend where private messaging apps, particularly those using end-to-end encryption, have become breeding grounds for child sexual abuse material. These platforms, while offering privacy to users, often leave children exposed to exploitation. End-to-end encryption, which prevents even the companies themselves from viewing messages, creates a "blind spot" for detecting and removing illegal content. This has led to calls for tech companies to take greater responsibility for ensuring their platforms are not inadvertently harboring abusive activities. Snapchat, in particular, has been flagged for its role in facilitating these crimes, with one victim recounting how a stranger she met on the app coerced her into sending explicit content and later threatened to share it online.
A Critical Loophole in the Online Safety Act
Despite the passage of the Online Safety Act in 2023, which aims to make the UK the safest place for children online, the NSPCC and other charities, such as Barnardo’s, argue that a critical loophole undermines its effectiveness. The act requires social media firms to reduce illegal and harmful content, but the accompanying codes of practice, enforced by Ofcom, only mandate the removal of such content if it is "technically feasible." This vague provision effectively gives tech companies a "get-out clause," allowing them to sidestep responsibility for protecting children if they claim it is not feasible to remove harmful material. The Internet Watch Foundation (IWF) has also criticized this loophole, calling it a "blatant excuse" for platforms to avoid accountability.
A Victim’s Harrowing Story: The Human Cost of Inaction
The NSPCC shared the story of a 13-year-old girl who fell victim to exploitation on Snapchat. She recounted how she sent explicit photos and videos to a stranger she believed was in his thirties. When she tried to stop, the man threatened to post the images online, leaving her paralyzed with fear and uncertainty. This harrowing account underscores the real-world consequences of the failure to protect children online. Such stories are a stark reminder of the urgency with which stronger safeguards are needed to prevent these crimes and support victims.
Calls for Urgent Action from Charities and Campaigners
The NSPCC and other charities have directly appealed to the home secretary and the technology secretary to strengthen the implementation of the Online Safety Act. They argue that the current framework allows tech companies to shy away from implementing robust protections for children, enabling predators to continue exploiting these platforms. NSPCC chief executive Chris Sherwood described the situation as "deeply alarming," emphasizing that separate rules for private messaging services effectively let tech bosses "off the hook" from fulfilling their responsibility to protect children. The charities are advocating for clearer and more enforceable measures to ensure that platforms are not used as "safe havens" for perpetrators.
The Government’s Commitment and the Road Ahead
The government has reiterated its commitment to tackling child sexual abuse online, calling the crime "despicable" and "devastating." A spokesperson emphasized that UK law is clear: child sexual abuse is illegal, and social media companies must ensure their platforms do not enable such criminal activity. The government has already introduced four new laws aimed at combating child sexual abuse online, but it acknowledges that more must be done. The spokesperson added that tech companies’ design choices, such as end-to-end encryption, cannot be used as an excuse for failing to root out these crimes. Ofcom has also stated that while the current codes of practice require action only where it is "technically feasible," it expects most platforms to comply and will hold them accountable if they fail to do so.
As the battle to protect children online continues, the focus must remain on closing loopholes, strengthening regulations, and ensuring that tech companies take responsibility for safeguarding their users. The fight against child sexual abuse requires urgent, collective action to prevent further harm and ensure that the digital world is a safe space for all children.