
Keep Your Alternative Money Data Safe

Policy makers face the challenge of striking the right balance between promoting the benefits of the expanded use of alternative data and ensuring adequate data protection and attention to consumer privacy across the ecosystem.

What is alternative data?

Alternative data can be loosely defined as financial information that is not typically collected by lending institutions or usually provided by customers while seeking credit. It covers a variety of sources, ranging from credit/debit card transactions, emails, consumer and product reviews and feedback, point-of-sale records, sensor logs, satellite imagery, supply-chain and logistics movements, and social-media sentiment to weather forecasts, web data, web traffic, surveys, and geolocation.

These data sets are generally large, complex and unconventional, which limits their handling and processing with traditional software. Digital lending institutions aggregate the data and load it into their systems to construct and use proprietary quantitative models for decision-making, adding value to the data collected.

What’s the deal with alternative data?

Individuals’ interactions in the digital world can result in a large output of data: their interests (the websites they browse), the places they visit (via location on their phone), the people they interact with (via contacts on their phone), and even their communication patterns (based on the timing and duration of their calls). These are the various ‘alternative’ data points used to make lending decisions.
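To make this concrete, here is a minimal sketch, in Python, of how such signals might be reduced to model features. Every field name and feature below is an invented assumption for illustration, not any actual lender’s schema.

    # Hypothetical sketch: summarising a phone's digital footprint into
    # candidate 'alternative data' features. All field names are
    # illustrative assumptions, not a real lender's schema.

    def extract_features(phone_data: dict) -> dict:
        """Reduce raw device data to the kinds of data points described above."""
        calls = phone_data.get("call_log", [])  # e.g. [{"duration_s": 120, "hour": 21}, ...]
        return {
            "num_contacts": len(phone_data.get("contacts", [])),             # people they interact with
            "num_sites_visited": len(set(phone_data.get("browsing", []))),   # interests
            "num_places_visited": len(set(phone_data.get("gps_pings", []))), # places they visit
            "avg_call_duration_s": (
                sum(c["duration_s"] for c in calls) / len(calls) if calls else 0.0
            ),
            "night_call_share": (
                sum(1 for c in calls if c["hour"] >= 22 or c["hour"] < 6) / len(calls)
                if calls else 0.0
            ),
        }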

The connection between non-financial personal data (such as location history and contacts) and creditworthiness is not obvious at first glance. However, this information, collected by lending apps, serves as a proxy for other data (such as financial history) more commonly used to assess a person’s ability and willingness to repay a loan.

When this alternative data is entered into an algorithm to determine the creditworthiness of an individual, certain data points may have higher predictive value. But this analysis is often undertaken by complex machine learning models with limited transparency, and the absence of an explanation for the rejection or pricing of a loan raises questions about lenders’ accountability. Given the nature of the data, many of the patterns and signals hidden in alternative data are likely to change or vanish over time, altering the accuracy of the model. Hence, it is essential that the accuracy of the decisions made by the algorithm be monitored and re-validated on an ongoing basis, as sketched below.
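As a minimal sketch of the monitoring this implies (the thresholds and window size below are assumptions, not regulatory or industry standards), a lender could compare the model’s rolling accuracy on matured loans against its accuracy at validation time and flag it for review when the gap grows:

    # Illustrative drift check, assuming repayment outcomes eventually
    # become known for each scored loan. Thresholds are invented.

    from collections import deque

    VALIDATION_ACCURACY = 0.80   # assumed accuracy when the model was approved
    DRIFT_TOLERANCE = 0.05       # assumed acceptable decay before human review
    WINDOW = 500                 # number of recent matured loans to evaluate

    recent_hits = deque(maxlen=WINDOW)  # True where prediction matched outcome

    def record_outcome(predicted_good: bool, actually_repaid: bool) -> None:
        recent_hits.append(predicted_good == actually_repaid)

    def model_needs_review() -> bool:
        """True once rolling accuracy falls materially below validation accuracy."""
        if len(recent_hits) < WINDOW:
            return False  # too few matured loans to judge drift
        rolling_accuracy = sum(recent_hits) / len(recent_hits)
        return rolling_accuracy < VALIDATION_ACCURACY - DRIFT_TOLERANCE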

The regulatory question here is how digital lenders can leverage alternative data

Alternative data may never replace the formal credit-sourcing system, but it can be an effective equalizer for the underserved. Alternative data will be the future of financial inclusion, and banks that leverage it will have an edge over conventional credit-rating systems.

Security

Recently, various consumer advocacy groups and the United States Government Accountability Office (GAO) have expressed concerns about data accuracy and privacy issues arising from the volume and sensitivity of the alternative data collected and used by financial technology companies. The lack of transparency in the models has been criticized.

The opportunity to extract an individual’s data from an “anonymized” set is larger than one might realize, especially in the case of mixed data sets. An MIT analysis of three months of credit card records of 1.1 million people found that any individual could be identified with more than 90% accuracy by looking at just four purchases, even after companies had “anonymized” the records [1]. “This is not surprising to those of us who spend our time doing privacy research,” said outside expert Lorrie Faith Cranor, director of the CyLab Usable Privacy and Security Laboratory at Carnegie Mellon University [2].
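A toy sketch of the re-identification logic behind that finding follows; the data is invented, and a simple set comparison stands in for the study’s far more sophisticated analysis. The log stores no names, yet four known shop-and-day purchases still single out one pseudonymous record.

    # Toy re-identification demo on invented data. The 'anonymized' log
    # keeps only pseudonymous IDs, yet four known (merchant, day) purchases
    # narrow the candidates to a single record.

    anonymized_log = {
        "user_0481": {("coffee_shop", 3), ("bookstore", 3), ("pharmacy", 9), ("bakery", 12)},
        "user_1126": {("coffee_shop", 3), ("petrol_pump", 5), ("pharmacy", 9), ("cinema", 20)},
        "user_2093": {("bookstore", 3), ("bakery", 12), ("cinema", 20), ("pharmacy", 21)},
    }

    # Four purchases an observer happens to know the target person made.
    known_purchases = {("coffee_shop", 3), ("bookstore", 3), ("pharmacy", 9), ("bakery", 12)}

    matches = [uid for uid, purchases in anonymized_log.items()
               if known_purchases <= purchases]
    print(matches)  # ['user_0481'] -- exactly one candidate: re-identified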

With the rapid expansion of digital lending institutions, it is essential for the usage of data to be regulated.

Discrimination

If consumers are not aware of what specific information credit-scoring systems use and how it affects their credit score, especially when each digital lending platform has its own algorithm, their ability to dispute unfavorable credit decisions, identify inaccuracies in the lender’s files, and take remedial steps to improve their credit score is significantly limited. Furthermore, credit-scoring models rest on class attributes and underlying assumptions built in by their developers, which remain undisclosed. Additionally, failure to use inclusive datasets for the machine-learning process may amplify and perpetuate existing social inequalities.

In order to achieve meaningful access to financial services using alternative data, industry stakeholders need more guidance and regulation on data privacy and transparency, and clarification of the scope of data that may be used for credit-scoring purposes. Currently, laws and regulations prohibit the use of certain information, such as race, religion and caste, that may result in lending discrimination. However, it is unclear to what extent these regulations capture alternative data, as discrimination may occur indirectly or even unintentionally, depending on the type of data being used and how the scoring model works. This also increases the risk of vulnerable populations, such as the elderly in financial distress, being targeted with predatory marketing pitches for financial products with unfavorable terms.

After all, without ensuring data integrity, it’s not realistic to expect alternative data to miraculously expand access to credit. In the absence of sound and uniform regulations on data privacy and laws on the fair use of alternative data, consumer protection risks outweigh benefits.

What happens without regulation?

Increased data security risks

A data protection bill currently under consideration in the Parliament of India, if passed, would mandate that companies report data breaches and take “appropriate remedial action” [3]. Such regulation could incentivize companies to invest in cybersecurity, but ultimately companies’ behavior would be driven largely by the effectiveness of enforcement.

This lack of regulation could have profound implications, however: while a data-security breach at a credit bureau might result in individuals’ financial data being leaked, a breach of an alternative lender’s database could leak far more sensitive details, including users’ location histories and phone contacts. Worse still, even when individuals are not approved for a loan, if they allowed a digital lending app to access their phone data at the time of application, their personal data may continue to be stored by the platform, leaving it at risk.

Discrimination

Alternative lending applications may afford consumers greater access to formal credit, but they provide weaker consumer rights than do traditional lending institutions, such as banks.

By using data points based on race, ethnicity, and gender, digital lending apps and other alternative lending platforms could easily discriminate against members of groups without just cause. While proponents argue that alternative lending is better than the status quo — and so should be supported in developing countries regardless of these concerns — the potential for discriminatory practices remains high.

The use of such algorithms to determine creditworthiness also raises questions. Through these apps, financial inclusion is more likely to become a group-based outcome that averages out individual and sub-group differences. Alternative lending can increase financial inclusion while simultaneously discriminating against specific groups and individuals. Because these algorithms are based on categories of data, alternative lending has the potential to widen the disparity in access to credit between groups even while making all individuals better off in absolute terms. As a trivial example, consider a lending platform that assesses creditworthiness based on the number of contacts stored on a person’s smartphone, as sketched below. Such an algorithm would likely determine men to be more creditworthy in countries like India, where men have greater social mobility (and likely more phone contacts) than women for socio-cultural reasons. Consequently, women (deemed higher-risk) would face higher interest rates on the same loan amount.
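A minimal sketch of that trivial example (the thresholds and rates below are invented): the rule never mentions gender, yet anyone who systematically has fewer contacts systematically pays more.

    # Facially neutral pricing rule from the contacts example above.
    # Thresholds and interest rates are invented for illustration.

    def annual_interest_rate(num_contacts: int) -> float:
        """Hypothetical rule: more contacts -> 'lower risk' -> cheaper credit."""
        if num_contacts >= 300:
            return 0.18
        if num_contacts >= 100:
            return 0.24
        return 0.32

    # Two applicants seeking the same loan amount:
    print(annual_interest_rate(350))  # 0.18 -- applicant with a wide social network
    print(annual_interest_rate(80))   # 0.32 -- applicant whose social mobility is constrained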

How do we find a solution?

Transparency is a tool that can help guard against discrimination. If users of alternative information disclose the sources of such data, data subjects may be able to assess the validity of that information and how it is being used. In addition, a requirement that users of alternative data articulate non-discriminatory grounds for their decisions can serve as a further check against discrimination, as illustrated below.
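One way to operationalize both ideas is for a lender to return, with every decision, the disclosed data sources and each one’s contribution to the score. The sketch below assumes a simple linear score; the feature names, sources, weights, and threshold are invented for illustration.

    # Hypothetical decision explanation for a linear score. All names,
    # weights, and the threshold are invented assumptions.

    FEATURE_SOURCES = {
        "num_contacts": "phone contact list",
        "avg_call_duration_s": "call records",
        "num_places_visited": "location history",
    }

    def explain_decision(weights: dict, features: dict, threshold: float) -> dict:
        """Return the decision plus per-source contributions to the score."""
        contributions = {name: weights[name] * features[name] for name in weights}
        score = sum(contributions.values())
        reasons = sorted(
            ((FEATURE_SOURCES[name], round(c, 2)) for name, c in contributions.items()),
            key=lambda pair: -pair[1],  # biggest drivers first
        )
        return {"approved": score >= threshold, "score": round(score, 2), "reasons": reasons}

    print(explain_decision(
        weights={"num_contacts": 0.01, "avg_call_duration_s": 0.002, "num_places_visited": 0.05},
        features={"num_contacts": 120, "avg_call_duration_s": 90.0, "num_places_visited": 8},
        threshold=1.5,
    ))
    # {'approved': True, 'score': 1.78, 'reasons': [('phone contact list', 1.2), ...]}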

While offering many benefits, the use of new types of alternative customer data for financial and other sensitive decisions also raises significant data-protection and privacy concerns, including the disclosure of confidential information to third parties, aggressive marketing practices, and security risks such as fraud and identity theft. A key component of data protection is data security. With greater reliance on electronic communications and interconnected transactions comes the risk that cybercriminals could hack into information systems, disrupting them and potentially stealing data. This increased vulnerability to fraud and disruption could negatively affect access to financial services in the future.

At the end of the day, policy makers face the challenge of striking the right balance between promoting the benefits of the expanded use of alternative data and ensuring adequate data protection and attention to consumer privacy across the ecosystem.

Disclaimer: The views expressed in the article above are those of the authors and do not necessarily represent or reflect the views of this publishing house. Unless otherwise noted, the authors are writing in their personal capacity. Their views are not intended and should not be taken to represent the official ideas, attitudes, or policies of any agency or institution.


Ajaya Kumar Sahoo

Formerly a senior officer in various supervision departments of the Reserve Bank of India; presently Chief Operating Officer of Fincfriends Private Limited and Independent Director at PC Financial Services Private Limited.

Aashrit Varma

Consultant at Bridge Policy Think Tank, regularly working on issues involving technology, data protection and privacy, and mediation.
