Opinion: Malaysians deserve better regulatory safeguards against data brokers

This article first appeared in Digital Edge, The Edge Malaysia Weekly, on January 18, 2021 - January 24, 2021.
A branch of the US military, thought to be focused on counterterrorism and counterinsurgency, bought data off a Muslim dating app through third-party brokers


Last month, news surfaced that the personal data of millions of users of a Muslim prayer app, Muslim Pro, had been sold to the US military without the knowledge of its users. The app has millions of users around the world, including in Malaysia.

A Muslim dating app and several other innocuous, seemingly unconnected smartphone apps were also revealed to be part of the supply chain of data purchased by the US military through various third-party data brokers.

The independent investigation carried out by Vice Media suggested, disturbingly, that this highly personal and sensitive information may specifically have been obtained by a branch of the military focused on counterterrorism and counterinsurgency.

Most people would agree that using this type of information for such a purpose is invasive and contentious, but whether the use case is ethical is only one of the questions that should be raised.

A bigger question is how this is structurally able to happen (for the most part inconspicuously) at scale, every day and in many different contexts aside from those connected to this particular US military purchase — and who profits from this structure.

This is certainly not an isolated case. The data brokering industry comprises a huge network of suppliers globally and is extremely lucrative, estimated to be worth around US$200 billion (RM808.5 billion). The Muslim Pro discovery highlights just how critical a role data brokers play in the life cycle and movement of our personal data, yet simultaneously, how obscure they are — hidden from public scrutiny, government, us as data subjects and, ostensibly, even the developers of the very apps from which they obtain data.

Who or what exactly are data brokers? Gartner defines a data broker as “a business that aggregates information from a variety of sources; processes it to enrich, cleanse or analyse it; and licenses it to other organisations”. In simple terms, these are business-to-business services that profit from trading people’s data. They act as the middlemen between companies, for example those developing apps on one side, and companies that want to access data about the users of those apps, on the other.

Although people are generally aware when they install an application that their data will be stored and used in the context of that application, most of us are not conscious of the extent to which our data is being traded and accessed by other organisations that have nothing to do with that application.

The data footprints we generate through everyday activities on our smartphones alone are highly sensitive. Brokers of location data routinely declare that this data is anonymised, yet an investigation by the New York Times revealed that the applications collecting this data often do so at such a precise level of detail and in such high volume (in some cases, recording users’ locations 14,000 times a day) that people can reasonably be de-anonymised by anyone who has access to it.

This means that individuals, their precise movements and far too many other intimate details of their lives could be pinpointed with access to this data, which is commercially available to whomever is willing to pay for it.

Alarmingly, there is barely any regulatory oversight of the role of data brokers — certainly not in Malaysia, but this is also the case in most other countries. In fact, their activities are not transparent even to governments.

In the Malaysian context, this is made more disturbing by the lack of a robust legislative framework for protecting citizens’ personal data. The Personal Data Protection Act (PDPA) was enacted 10 years ago and is now inadequate, given the developments in data-mining technology that have taken place since then.

The PDPA was, in fact, due to be reviewed by the government this year. This has apparently been underway for several months. However, the status of this review is not transparent. In the context of the pandemic, citizens are subject to elevated levels of surveillance for the purpose of contact tracing and monitoring population movement (for example, through the MySejahtera and Gerak Malaysia apps). In this context, a modernisation of the PDPA is a matter of urgency.

Aside from strengthening the rights of individuals over their own personal data footprints, it is also high time we started building a regulatory framework around the data brokering industry.

Trading in the financial industry is highly regulated — Bursa Malaysia supervises and regulates stock and futures brokers. We must realise that our digital footprints have become commodified, detached from the physical beings they represent, and traded much in the same way as stocks and futures.

Yet in Malaysia, those making vast amounts of money off our personal data operate in a “Wild West” with few legal limitations and no government oversight. If we have a dedicated regulatory body to supervise stockbrokers’ activities, why not install a similar set-up for specifically overseeing the data brokering industry, at least locally?

In terms of policy, one idea worth considering is requiring brokers who deal in buying and selling third-party personal data to register annually with a government body. Legislation along these lines was introduced in two US states in 2019, and there has been some push for the industry to be regulated nationally there by the Federal Trade Commission. The level of detail that these brokers are required to hand over to the government could be improved, but this is a commendable step in the right direction.

Beyond requiring brokers to register with an authority, another measure that would help safeguard citizens’ interests would be to regulate who can license data from brokers. The data brokering business is highly commercialised, so money talks.

This could mean that anyone, for example a private individual or group with unsavoury motivations — with enough financial backing — could theoretically license personal data from brokers and use it for private purposes as long as it is within the scope of the licensing agreement.

The Muslim Pro example reveals that governments, both local and foreign, can easily do the same. Do we want to leave it up to the data brokers themselves to look out for the public’s best interests? How can we ensure they do their due diligence on who buys licences, and that our data is used in legitimate, non-harmful ways? This is where a regulator absolutely must step in and safeguard citizens’ interests proactively.

Legislation that should protect people’s data from being used in ways we are unaware of almost always has a massive pitfall: consent. Here is a clear example: the European Union is hailed as a global pioneer in personal data protection. Yet its hallmark General Data Protection Regulation (GDPR) states that a citizen cannot be subject to decisions based purely on automated decision-making, including profiling, unless that person has given explicit consent (Article 22).

This may sound logical but in reality, what this “consent” boils down to (in the context of installing an app or signing up to a website) is agreeing to a privacy policy. Most of us have probably unwittingly signed hundreds of these without reading them, and this is no secret to the companies who create them. In fact, persuading people to click the “I accept” button and sign away these rights is a money-making business in itself.

Privacy International has shed light on how companies like Quantcast sell “consent solutions” to other websites and apps, designed to incentivise us to provide our consent as efficiently as possible. That genuine, informed and freely given consent is not the ultimate benevolent goal becomes clear when Quantcast boasts a 90% consent rate as a selling point on its own website.

Yet, the onus to safeguard our data is routinely placed on the individual. Is this a reasonable burden in a world where so many of the services we rely on to function in our daily lives are digitally mediated and demand the installation of third-party applications, or at the very least require regularly giving away personal details such as names, addresses and identification card numbers?

More than that, is this reasonable when important decisions are increasingly being made about us by private companies and government bodies alike, in data-driven ways, based on the non-transparent collection of intimate information about us?


Anisha Nadkarni is the tech policy research fellow at the Social & Economic Research Institute Malaysia (SERI) and an alumnus of the Oxford Internet Institute