As an enormous amount of data – very often personally identifiable information – continues to flow from increasingly diverse sources, the companies that collect this data are responsible for protecting it and using it ethically. Diverse data privacy regulations across the globe, varied individual perceptions of privacy, data migration to the cloud, new legal developments such as the Schrems II ruling, and the adoption of emerging technologies all make data handling very complex. What’s more, COVID-19 has introduced new challenges with an unforeseen effect on the way sensitive data is collected and used—be it tracking people’s movements, sharing data across borders such as health records and employee travel histories, or accessing sensitive data outside secure office networks. Let us see how this landscape has changed in recent times, the various impact vectors, and the different stakeholders and their contributions.
Rising scenarios of data misuse
The rising adoption of artificial intelligence (AI), accelerated digitalization spurred by the pandemic, and ever-growing social media engagement increase the risk of data misuse, making data ethics more relevant than ever. Customers are becoming increasingly aware of their privacy rights as they understand the implications of sharing their personal data. According to a survey by PwC, 85% of consumers do not trust companies with their data and 76% have negative views about sharing personal information with firms. Unethical data usage that exploits personal attributes like gender, age, race, and location has had legal, reputational, and financial consequences for companies, and the risk is bound to increase owing to upcoming regulations across markets, such as Brazil’s General Data Protection Law (LGPD), Australia’s Privacy Act, Canada’s Digital Charter Implementation Act, 2020 (DCIA), and the US’s California Privacy Rights Act (CPRA). The Facebook-Cambridge Analytica data scandal is an example of how companies can put data to wrong use and face the consequences. Customers do not hesitate to switch to a competitor if they feel their data may be misused.
According to a report by the World Federation of Advertisers, ethics is three times more important than competence in earning trust in a company, and even a 1% increase in brand trust can drive up market value three-fold. Firms engaging in unethical practices bear the cost of huge regulatory fines and long legal battles, which take a toll on a company’s brand image, customer loyalty, and trust.
Below are some scenarios that can lead to data misuse:
Complicated, lengthy privacy policies that hide clauses breaching an individual’s personal space
Use of data for unintended or undeclared purposes, such as sharing or selling it to third parties, targeting personalized ads without consent, or analyzing private conversations from voice assistants
Biased recommendations from AI-driven decisions, caused by skewed datasets, flawed algorithms, lack of human supervision, or unethical developer practices
Lack of visibility into how data exchanged between governments and private firms is used
Oversharing of personal data by individuals
Widespread remote working that opens new security vulnerabilities
To ensure data privacy, companies currently:
Take action to meet new privacy regulations and laws but treat them as mere legal compliance
Have detailed privacy policies that are accurate by law but protect only the company
Establish and maintain data privacy and protection as IT programs
Have mandatory training programs on privacy for employees
Safeguarding against unethical data usage
While many companies pursue data privacy and protection only for legal compliance, a data ethics program goes deeper and identifies what companies need to do to gain customer trust and increase brand value. Organizations that realize this have taken a number of steps, such as appointing a chief data officer (CDO). A 2019 survey by NewVantage Partners found that the rate of CDO appointments grew from 12% in 2012 to 67.9% in 2019, and that the CDO’s role as a data ethicist is gaining prominence. Even tech giants like Google, Amazon, and Twitter are putting data privacy at the center of their business objectives.
So, what should a data ethics program comprise?
Data ethics as a core value that ensures accountability from the leadership to the individual level
User-friendly presentation of privacy policies
A responsible and explainable AI framework
Privacy frameworks that keep pace with regulations and laws
Training programs relevant to business functions and data collected
While businesses bear the major share of this responsibility, governments and individuals should do their part as well to create a win-win situation.
Governments can enforce ethics by making companies abide by the law, curbing excessive data collection justified on the premise of the pandemic, creating standard data ethics frameworks that help companies innovate without the fear of non-compliance, and raising public awareness. Individuals, for their part, should act responsibly when they share data on social networks and websites.
Customer trust is the cornerstone of a thriving business, and companies should make genuine efforts toward ethical data handling to create outcomes that benefit both themselves and their customers.