Here’s a privacy parable with a theological twist: Three legal compliance officers discuss data privacy issues at a downtown bar. “I always wondered whether God or a divine being would have access to all information of all his devotees,” the first legal executive says. “Or will the divine being have access to all data of all human beings, irrespective of worshippers? There is, of course, no question of any opt-in.”
“There are no data privacy laws applicable to any entity that’s considered a divine being,” the second legal eagle pontificates, taking a swig of beer. “The point of contention is whether the divine being has access to all his devotees’ PII (personally identifiable information) while they’re at the place of worship or anywhere else. My point is, should we even be discussing God while sitting in a bar drinking beer?”
The third legal officer calls for another round of drinks and strokes his chin. “I think we’re doing the right thing,” he says. “It’s better to sit in a bar thinking about God than to sit in a church thinking about booze.”
With due respect to worshippers and lawyers, the question of data privacy is suddenly on everyone’s mind, from government agencies to companies to citizens. Proliferating breaches and consumers’ demands for privacy and control over their own data have led governments to adopt new regulations, such as the Personal Data Protection Act (PDPA) in Malaysia and Singapore, the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the US. Many others are following suit.
About 10% of internet users worldwide (and 30% in the US) deploy ad-blocking software to prevent companies from tracking online activity. About 87% of respondents told management consulting firm McKinsey in a recent survey that they would not do business with a company if it had lax data security practices — and 71% said they would stop doing business with a company if it gave away sensitive data without permission.
“Because the stakes are so high — and awareness of these issues is growing — the way companies handle consumer data and privacy can become a point of differentiation and even a source of competitive business advantage,” McKinsey says. “Consumers are most comfortable sharing data with providers in healthcare and financial services; no industry reached a trust rating of 50% for data protection.”
The scale of breaches is staggering. “In two breaches at one large corporation, more than 3.5 billion records were made public,” McKinsey notes. “Breaches at several others exposed hundreds of millions of records. The stakes are high for companies — even consumers who were not directly affected by these breaches paid attention to the way companies responded to them.”
It’s no surprise that the market for data privacy management software is soaring; sales of such software jumped 46.1% in 2020 over 2019. Market intelligence firm IDC estimates that data privacy management software sales will reach US$2.3 billion in 2025, nearly double 2020 revenues, growing at a 14.3% annual clip during the period.
“It feels like a broken record when discussing data privacy regulations because every year data privacy regimes continue to grow,” says Ryan O’Leary, IDC’s research manager for privacy. “The frameworks and regulations that enterprises need to manage continue to explode. End-users struggle with the sheer volume of regulations and need help with the regulatory change management from software providers.”
The paradox? The pandemic has led to a surge in data being generated and consumed by employees working from home. At the same time, data privacy regulations have become stricter. “Data visibility continues to be a blind spot for many organisations,” O’Leary says. “There is a growing demand for automated data discovery and classification tools that scan for sensitive data across both cloud and on-premise environments to provide that single source of data truth. Solving the challenge of patchwork enterprise infrastructure and automation is the golden ticket in data privacy.”
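At their simplest, the discovery tools O’Leary describes are pattern matchers that flag likely PII in raw text. The categories and regular expressions below are illustrative assumptions, not any vendor’s actual detection rules — commercial tools layer on machine-learning classifiers, checksum validation and context scoring — but a minimal sketch of regex-based scanning might look like this:

```python
import re

# Hypothetical patterns for a toy scanner; real products use far
# richer detectors than these three regular expressions.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "phone": re.compile(r"\+?\d{2,3}[- ]?\d{3,4}[- ]?\d{4}\b"),
}

def classify(text: str) -> dict:
    """Return the sensitive-data categories found in a text blob,
    mapped to the matching fragments."""
    return {
        label: pattern.findall(text)
        for label, pattern in PII_PATTERNS.items()
        if pattern.search(text)
    }

# Scan a sample snippet the way a discovery tool would scan a document store
hits = classify("Contact jane.doe@example.com or +60 12-345 6789")
```

In a real deployment, a crawler would run something like `classify` over files, databases and cloud buckets, then aggregate the hits into the “single source of data truth” the quote refers to.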
With more companies deploying artificial intelligence (AI), there is now a shift from big data to “small” and “wide” data. Small data techniques extract valuable insights from far fewer observations, using approaches such as time-series analysis, synthetic data generation or self-supervised learning. Wide data enables analysis of varied sources — small and large, structured and unstructured — and attempts to find links between them across formats, including tabular, text, image, video, audio, voice, temperature, smell and vibration.
Where can “small” and “wide” data be used? Demand forecasting in retail, real-time behavioural and emotional intelligence in customer service, physical security, fraud detection and adaptive autonomous systems, such as robots, which constantly learn by analysing correlations in time and space of events.
“By 2025, up to 70% of organisations will shift their focus from big to small and wide data, providing more context for analytics and making AI less data-hungry,” says Jim Hare, distinguished research vice-president at research and advisory firm Gartner. “Disruptions such as the pandemic [are] causing historical data that reflects past conditions to become obsolete, which is breaking many production AI and ML (machine learning) models. Decision making by humans and AI has become more complex and reliant on data-hungry approaches.”
The most robust privacy protections come from the GDPR, which came into effect across the European Union (EU) in May 2018. The GDPR became a model for many national laws outside the EU, including in Chile, Japan, Brazil, South Korea, Argentina and Kenya. It mandates that an end-user’s consent be freely given, specific, informed and unambiguous, signalled by a clear affirmative action.
“However, the lack of enforceability regarding obtaining lawful consents has been a challenge,” says Wikipedia. “As an example, a 2020 study showed that Big Tech — Google, Amazon, Facebook, Apple and Microsoft (GAFAM) — use dark patterns in their consent obtaining mechanisms, which raises doubts regarding the lawfulness of the acquired consent.”
Why do companies fear the GDPR? Here’s an example: In July 2019, the British Information Commissioner’s Office (ICO) issued an intent to fine British Airways (BA) a record £183 million (1.5% of turnover) for lax security arrangements following a breach that affected 380,000 transactions. However, BA’s fine was reduced to £20 million; the ICO noted that it had “considered representations from BA and the economic impact of Covid-19 on its business before setting a final penalty”.
Malaysia was among the first countries in Asean to enact data protection legislation, passing its PDPA in 2010; the Act came into force in November 2013 to protect the PII of individuals in commercial transactions. Penalties for non-compliance range from RM100,000 to RM500,000 in fines and up to three years’ imprisonment.
Singapore’s PDPA, passed in 2012, came into effect in July 2014 and was amended in November 2020. Singapore residents can also register their local phone numbers with the Do Not Call Registry to opt out of receiving unwanted telemarketing messages. Organisations that breach PDPA regulations may be fined up to S$1 million and suffer reputational damage.
What about cross-border data flow? The Asia-Pacific Economic Cooperation (Apec) adopted its Privacy Framework in 2005 and updated it in 2015; the framework underpins the Cross-Border Privacy Rules (CBPR) system. “The CBPR benefits consumers and business by ensuring that regulatory differences do not block businesses’ ability to deliver innovative products and services,” Apec states. “Developed by all 21 Apec economies, an Apec economy must demonstrate that it can enforce compliance with the CBPR system’s requirements before joining.”
Malaysia and Singapore are Apec members. Australia and Taiwan are the latest economies to participate in the CBPR; they join Canada, Japan, South Korea, Mexico, Singapore and the US.
The bottom line: How private should you be? The simple answer: As private as you need to be. Why? Identity theft cost US$56 billion in the US alone last year, with about 50 million consumers falling victim, notes Javelin Strategy & Research. About US$13 billion in losses were due to traditional identity fraud, in which criminals stole PII through data breaches. But the bulk, US$43 billion, was due to criminals using phishing via robocalls and emails to steal PII directly from consumers. Protect your virtual self as you would your physical self.
Since I started this column on an alcoholic note, let me also end on one. After a night out boozing, three drunks hailed a cab and mumbled a destination. The cab driver drove around the block and stopped. “We have arrived,” the driver announced. “That will be $45.” The first drunk gave him a $50 note and said, “Keep the change.” The second drunk thanked the driver for getting them home so fast. The third drunk slapped the driver. The driver thought he had been found out and bowed his head. “That was a warning to you,” the third drunk said. “Next time, don’t drive so fast. You nearly killed us!”
The writer is vice-president of new technologies at Fusionex International, Asia’s leading big data analytics company