This article first appeared in Digital Edge, The Edge Malaysia Weekly on June 27, 2022 - July 3, 2022

The prevalence of child sexual abuse material (CSAM) on messaging channels is on the rise despite the variety of measures being taken to stop the nefarious activity.

In 2021 alone, 85 million pictures and videos of suspected child sexual abuse were reported, according to a European Commission press release, raising the sickening question of how much more goes unreported. The Covid-19 pandemic has worsened the problem, with the Internet Watch Foundation noting a 64% increase in reports of confirmed child sexual abuse in 2021 compared with the previous year.

The commission argues that the current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children and, in any case, will no longer be possible once the interim solution currently in place expires.

So, to weed out the perpetrators, new European Union (EU) legislation is being proposed that would make it mandatory for service providers to detect and report online child sexual abuse, as well as cases of solicitation of children, known as grooming, on their platforms to the authorities.

“In many cases, the abuse comes to light only when the actions of perpetrators are detected online. Online service providers play an essential role in reporting online child sexual abuse. Unfortunately, the current system of voluntary reporting is not effective enough,” the commission says on its website. “As a society, we need to make it mandatory for all tech companies in Europe to detect and report child sexual abuse online to the authorities.”

However, critics have raised concerns about what would happen to privacy if the proposed legislation becomes law.

“One of the reasons encryptions are effective is because it takes so long to decrypt. You have to run hashes against it again and again, until you break down the encryption,” explains Rachel Gong, senior research associate at Khazanah Research Institute.
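The scale Gong alludes to can be made concrete with a back-of-the-envelope sketch. The attacker speed below is an illustrative assumption, not a figure from the article, but even at an implausibly generous guessing rate, exhausting the key space of a modern cipher takes vastly longer than the age of the universe.

```python
# Back-of-the-envelope: brute-forcing a 256-bit key, assuming an
# implausibly fast one trillion guesses per second (illustrative figures).
keyspace = 2 ** 256                    # possible AES-256 keys
guesses_per_second = 10 ** 12          # assumed attacker speed
seconds_per_year = 60 * 60 * 24 * 365

years_to_exhaust = keyspace / guesses_per_second / seconds_per_year
print(f"{years_to_exhaust:.2e} years")  # ~3.67e+57 years
```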

The EU has an influence on digital policies around the world, and if the proposal were to become reality, the same measures would trickle down globally, even to authoritarian states, which could affect freedom of speech.

“In the world that we live in today, we give up a lot of privacy. The real risk is for people who are political dissidents, social activists, or those from a minority or a persecuted group,” says Gong.

Without end-to-end encryption, an authoritarian regime could conduct state surveillance for anti-government keywords related to revolutions or protests, in the same way that keywords related to CSAM would be sought out.

In a series of tweets, Will Cathcart, vice-president of WhatsApp, expressed his dissatisfaction with the proposed EU law, which he argues would fail to protect end-to-end encryption.

“Legislators need to work with experts who understand internet security so they don’t harm everyone, and focus on the ways we can protect children while encouraging privacy on the internet,” Cathcart commented.

According to US-based technology news website The Verge, the proposal could jeopardise end-to-end encryption, as it would require service providers to install whatever systems the EU deems necessary, which risks leaving the door open to more generalised surveillance.

Messaging channels employ end-to-end encryption as a safety measure to safeguard privacy when two users are communicating with each other, ensuring messages that are sent can be seen and heard only by the two users and no one in between.
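As a rough sketch of how this works, consider the public-key exchange below, written with the PyNaCl library; the user names and message are illustrative, not drawn from any particular messaging app. The point is that the relaying server only ever handles opaque ciphertext.

```python
# A minimal sketch of end-to-end encryption, using the PyNaCl library
# (pip install pynacl); names and the message are illustrative.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private key never leaves the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and his public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"see you at noon")

# The messaging server relays only the ciphertext. Bob decrypts with
# his private key and Alice's public key; no one in between can.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"see you at noon"
```

Because the private keys never leave the users' devices, the provider cannot read the traffic it carries, which is why critics say mandated scanning would require weakening or bypassing this design.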

The proposed law could prove ineffective as perpetrators quickly adapt to legislative changes, and they would likely adjust their vocabulary in order to waltz past security measures, says Gong.

For example, TikTok has an algorithm that scans for certain words such as "anti-vaccine" or "kill". Users realised this and adapted: instead of the flagged words, they used codewords. Gong suspects that abusers would do the same, adapting their language to skirt the law, as the toy filter below illustrates.
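A toy keyword scanner, with a hypothetical blocklist rather than any platform's real one, shows how easily such filtering is sidestepped.

```python
# A toy keyword scanner with a hypothetical blocklist, showing why
# codewords slip past naive filtering.
FLAGGED_WORDS = {"anti-vaccine", "kill"}

def is_flagged(message: str) -> bool:
    # Naive check: compare each whitespace-separated token to the blocklist.
    return any(token in FLAGGED_WORDS for token in message.lower().split())

print(is_flagged("spreading anti-vaccine claims"))  # True
print(is_flagged("spreading anti-vax claims"))      # False: codeword evades
```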

How do we move forward with managing CSAM on messaging channels while safeguarding privacy? “It’s a judgement call, I think, and unfortunately, I don’t have a good answer for how best to do that,” says Gong. “You have this criminal stuff which is problematic, and on the other hand, you have activist groups that are trying to organise and have discussions, while being afraid of being targeted by the state.”

Child sexual abuse persists because it is a social problem, and it should be treated as such, rather than shrugging off that sociological responsibility by simply deploying, or cutting off, technology.

“It goes without saying that CSAM and the abusers who traffic in it are repugnant. There is much governments and technology companies can do to combat abuse,” says Cathcart in a tweet.

“But far too often, governments approach the challenge by trying to weaken privacy and security, instead of strengthening it,” he adds.
