AI: Finding a balance between personalisation and privacy

This article first appeared in Enterprise, The Edge Malaysia Weekly, on July 9, 2018 - July 15, 2018.

Artificial intelligence (AI) is the concept of machines being able to carry out tasks that require human intelligence. Machine learning (ML) makes it possible for machines to learn from experience and adjust to new input to perform human-like tasks. The two go hand in hand and today are helping to reinvent businesses, streamline production and transform the skillsets of the next generation of workers.

Data is instrumental in designing better AI systems. The more data is gathered and analysed, the more accurate the system becomes. This means AI requires an enormous pool of data to be smart enough to reach its objective, and businesses will need to collect, store and analyse a huge amount of data for that purpose.
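The link between data volume and accuracy can be illustrated with a toy experiment (a hypothetical sketch, not drawn from the article or from SAS software): a simple classifier trained on a larger sample estimates its parameters more reliably and so tends to score higher on unseen data.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    # Two synthetic classes: Gaussian clouds centred at -1 and +1
    x0 = rng.normal(-1.0, 1.0, size=(n, 2))
    x1 = rng.normal(+1.0, 1.0, size=(n, 2))
    return np.vstack([x0, x1]), np.array([0] * n + [1] * n)

def train_centroids(X, y):
    # "Training" here is just estimating each class mean from the sample
    return X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)

def accuracy(centroids, X, y):
    c0, c1 = centroids
    d0 = np.linalg.norm(X - c0, axis=1)
    d1 = np.linalg.norm(X - c1, axis=1)
    pred = (d1 < d0).astype(int)  # assign to the nearer centroid
    return (pred == y).mean()

X_test, y_test = make_data(5000)  # fixed evaluation set
acc_small = accuracy(train_centroids(*make_data(10)), X_test, y_test)
acc_large = accuracy(train_centroids(*make_data(5000)), X_test, y_test)
print(f"10 examples per class:   {acc_small:.3f}")
print(f"5000 examples per class: {acc_large:.3f}")
```

With only 10 examples per class, the estimated class means are noisy; with 5,000, they converge to the true means, which is the intuition behind "more data, more accurate system".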

This, however, can prove tricky because consumers are not confident about sharing their data or having it stored. A recent survey by US-based analytics company SAS found that only about a third (35%) of 500 consumers were confident that their personal data used for AI was stored securely.

David Tareen, marketing manager for AI at SAS, tells Enterprise that consumers are wary of sharing their data because they know it can be a double-edged sword. On the one hand, the more data they are willing to share, the more capability and personalisation they can unlock. On the other, it leaves them vulnerable to the loss of personal privacy, discrimination and even identity theft.

Tareen was speaking on the sidelines of the SAS Global Forum 2018 in Denver, Colorado, recently.

“I think what the industry needs right now is better regulation on data collection and usage to make sure that it is used properly. For example, there should be standards and regulations that prohibit an insurance company from denying insurance coverage based on one’s biometric data,” says Tareen.

He adds that while there are no international or US standards currently, there have already been some efforts taken by several countries at different levels. “Right now, in most parts of the world, there are no regulations governing how companies protect and use their data. It really depends on what consumers opt in and opt out of when they are signing the company’s terms and conditions.

“China’s Alibaba, Baidu and Tencent, for example, have internal initiatives on data privacy, but they differ significantly from company to company. Recently, however, we have seen a lot of initiatives taken around the world to standardise data protection, such as Europe’s General Data Protection Regulation (GDPR).”

The regulation, which went through seven years of development, came into effect on May 25. It requires businesses to protect the personal data and privacy of European citizens for transactions that occur within European member states, and non-compliance would cost companies dearly. GDPR gives consumers more control over their personal data, apart from forcing companies to make sure the way they collect, process and store data is safe.

In Malaysia, companies observe the Personal Data Protection Act (PDPA), which regulates the processing of personal data in commercial transactions. Here, personal data is defined as information that can identify a person, which includes any expression of opinion about them. This includes their full name, address, MyKad number, passport number, health record, email address, photographs, bank account details and credit card details.
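As a rough illustration of what that definition can mean in practice (a hypothetical sketch; the field names are invented for this example and are not a legal taxonomy from the PDPA), a system handling Malaysian customer records might mask the identifying fields the article lists before the data is shared for analytics:

```python
# Fields corresponding to the personal data categories named above
# (illustrative field names, not official PDPA terminology)
PERSONAL_DATA_FIELDS = {
    "full_name", "address", "mykad_number", "passport_number",
    "health_record", "email", "photo", "bank_account", "credit_card",
}

def redact(record):
    """Return a copy of the record with personal-data fields masked."""
    return {
        key: "***REDACTED***" if key in PERSONAL_DATA_FIELDS else value
        for key, value in record.items()
    }

customer = {
    "full_name": "Aisyah binti Rahman",
    "mykad_number": "900101-14-5678",
    "purchase_total": 129.90,
}
print(redact(customer))
```

Non-identifying fields such as the purchase amount pass through untouched, so aggregate analysis remains possible without exposing who the customer is.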


Consumers more comfortable with AI in healthcare

Over the past decade, tech companies have invested heavily in AI to revolutionise consumers’ lives and build a more AI-centric future, particularly by assisting people with predictions and informed decision-making.

Consumers, however, are still hesitant about giving up their data. Therefore, businesses have struggled to find the right balance between personalisation and privacy, especially in industries where more data would mean better service, such as healthcare, banking and retail.

Surprisingly, consumers are more comfortable with AI in healthcare than in either the banking or retail sectors. According to SAS’ AI survey, 60% of the respondents were comfortable with doctors using AI to analyse their medical information to suggest treatments. About 61% would even let doctors use data from their wearable devices, such as an Apple Watch or Fitbit, to assess their lifestyle and make recommendations.

According to Tareen, the findings suggest that consumers are positive about AI when they believe it is being used for their good. “While data privacy is still a concern among consumers, they are warming up to the idea of using their biometric data in healthcare AI because they are seeing how the technology is able to improve diagnosis and treatment processes. That is why 47% of those surveyed would be comfortable with AI assisting doctors in the operating room,” he says.

Advanced analytics can contribute positively to healthcare in many ways. According to SAS, advanced analytics can be used to measure, track and improve performance more effectively and efficiently; provide accurate forecasting and real-time access to information; and improve health outcomes and patient safety by delivering evidence-based improvements in the quality of care.

One of the largest medical technology companies, Siemens Healthineers, uses SAS’ machine learning and Internet of Things capabilities to analyse urgent data generated by its systems worldwide. The data comes from a variety of devices, such as the magnetic resonance imaging (MRI) and computed tomography (CT) systems. SAS also helps to predict system problems and potential downtime several days in advance of a failure.
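The kind of failure prediction described here can be sketched, in highly simplified form, as drift detection on device telemetry (a hypothetical illustration only; this is not SAS’ or Siemens Healthineers’ actual pipeline): readings that stay persistently outside a learned normal band raise an early warning well before a hard failure.

```python
from collections import deque

def drift_alert(readings, window=20, threshold=3.0, persist=5):
    """Return the index at which a reading has stayed more than `threshold`
    rolling standard deviations from the rolling mean for `persist`
    consecutive samples; return None if that never happens."""
    recent = deque(maxlen=window)
    streak = 0
    for i, x in enumerate(readings):
        if len(recent) == window:
            mean = sum(recent) / window
            var = sum((r - mean) ** 2 for r in recent) / window
            std = var ** 0.5 or 1e-9  # avoid division issues on flat data
            streak = streak + 1 if abs(x - mean) > threshold * std else 0
            if streak >= persist:
                return i
        recent.append(x)
    return None

# Healthy baseline noise, then a slow drift towards failure
healthy = [50 + (i % 3) * 0.1 for i in range(40)]
drifting = [50 + 0.5 * j for j in range(1, 21)]
print(drift_alert(healthy))             # no alert on healthy data
print(drift_alert(healthy + drifting))  # alert early in the drift
```

Requiring the deviation to persist for several samples filters out one-off sensor blips, which is why such systems can flag a degrading machine days before it actually fails.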

When it comes to dollars and cents, however, consumers are less keen on AI. Only 34% of the survey respondents were comfortable with banks using AI to provide financial guidance while only 31% were comfortable with banks using AI to access credit history to make credit card recommendations.

“The survey averaged out the age groups of the respondents, so it may seem that they are collectively unhappy about banks using AI to interact with them. But the results would be quite different if the age groups were broken down,” says Tareen.

“The younger generation, I believe, is more open to sharing information compared with older generations. They are also less concerned — only about 58% of the survey respondents under 40 would worry whether their data is being stored securely, compared with 69% of those aged 40 and above.”

AI has been recognised by the banking sector as an innovative tool that can drive business growth. However, adoption is still in the early stages, with many banks still trying to understand where and how to make sense of the technology. According to a report by Accenture released on April 4, a whopping 76% of the banking C-suite agreed that adopting AI would be critical to their organisation’s ability to differentiate itself in the market, yet only 26% of their employees were ready to work with AI.

One early adopter is Southeast Asia’s largest bank, DBS Bank. In 2016, the bank introduced its first digital-only bank — Digibank — to customers in India. Opening an account is paperless and does not require signatures. Users simply need to download the app and provide their biometrics for verification purposes at any coffee chain outlet that DBS has tied up with.

As 80% of customer queries are handled by AI, Digibank needs only 10% of the staff of a conventional bank. Within 24 months of its launch, the digital bank managed to acquire 1.8 million customers — a staggering number when compared with the 40,000 customers DBS had managed to acquire in its 15 years in that country.

Meanwhile, in the retail sector, the use of AI has also not been well received by consumers. SAS’ survey revealed that only 44% of the respondents were willing to share location information to personalise their shopping experience. When asked if they were comfortable with online retailers using past purchase behaviour to recommend new items, respondents were almost evenly split: 49% were comfortable while the remaining 51% were not.