WhatsApp’s Child Porn Problem: Chat App Blamed for Failing to Remove Child Sex Abuse Videos

Child pornography is being openly and widely shared on WhatsApp Messenger, research by two Israeli NGOs claims. The messaging app, which was bought by Facebook in 2014, is struggling to control the problem despite banning thousands of accounts every day. The revelation comes less than two years after a global operation led by Spanish police broke up a WhatsApp child porn sharing ring. Despite the arrest of 39 people as part of that operation, the sharing of videos and pictures showing child sexual abuse is thriving on the chat app.

According to a report in the Financial Times (FT), the Israeli charities, Netivei Reshet and Screensaverz, were first alerted to the problem by a young man who called their hotline in August to report exposure to such content on WhatsApp. After an extensive investigation, the charities found several easily accessible groups containing child sexual abuse material. These groups were often listed in free Android apps on Google’s Play Store that claim to offer links to interesting WhatsApp groups.

“It is a disaster: this sort of material was once mostly found on the darknet, but now it’s on WhatsApp,” Netivei Reshet’s Yona Pressburger told FT.

The groups, some with as many as 256 members (the maximum WhatsApp allows), were identifiable by their explicit profile pictures or names, including abbreviations such as “cp.” Despite these identifiers, many of the WhatsApp groups sharing child porn went unnoticed by the company’s automated systems. The FT report claims that even though the NGOs notified Facebook about the existence of these groups last month, several were found to be active as recently as this week. One of them, a group called “kids boy gay,” had participants with phone numbers from India, Pakistan, Algeria, and the United States, the report adds.

A WhatsApp spokesperson told the publication that the company has a zero-tolerance policy on child sexual abuse, and that it actively bans “accounts suspected of sharing this vile content.”

One major issue these findings expose is WhatsApp’s end-to-end encryption. While it is meant to protect users’ privacy and shield them from the prying eyes of governments and hackers, the encryption also hampers both WhatsApp’s and law enforcement’s efforts to monitor the spread of child abuse content. The company rolled out end-to-end encryption across the app in 2016.
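The trade-off is easier to see with a toy example. The Python sketch below uses the symmetric Fernet scheme from the third-party cryptography package purely as a stand-in; WhatsApp actually implements the Signal protocol, which is far more elaborate, but the core property is the same: the server relays only ciphertext and holds no decryption key, so it has nothing to match against a blocklist of known abuse material.

```python
# Minimal sketch of why end-to-end encryption blocks server-side scanning.
# Illustrative only: Fernet (symmetric) stands in for WhatsApp's actual
# Signal-protocol encryption; the relevant property is shared by both.
from cryptography.fernet import Fernet

# The key exists only on the endpoints; the relay server never sees it.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)

plaintext = b"message content only the endpoints should read"
ciphertext = sender.encrypt(plaintext)

# All the server can observe is opaque ciphertext:
print(ciphertext[:40])

# Server-side moderation (e.g. matching against known terms or media
# hashes) fails, because a match would require the plaintext:
blocklist = [b"content"]
print(any(term in ciphertext for term in blocklist))  # False: the term does not survive encryption

# Only an endpoint holding the key can recover the message:
receiver = Fernet(shared_key)
print(receiver.decrypt(ciphertext) == plaintext)  # True
```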

Experts have suggested that WhatsApp could drop encryption for groups over a certain size to make their content easier to monitor. Another option being floated is weaker encryption that would let the company search these groups for illegal content. For now, though, the company is sticking to its existing encryption model, it told TechCrunch.

Another problem area is the lack of manpower to monitor these groups manually on WhatsApp’s end. The company employs just 300 people in total, only a fraction of whom are tasked with tracking illegal activity in the app.

With no immediate encryption fix in sight, WhatsApp will need better human moderation and proactive tracking of the avenues, such as group-discovery apps, that make it easy for people to find these groups in the first place.
