Potential Benefits and Risks of the Increased Use of Data in Financial Services Applications

Testimony before the Senate Committee on Banking, Housing, and Urban Affairs

Good morning, Chairman Crapo, Ranking Member Brown, and members of the committee. I thank you for inviting me to testify.

My name is Brian Knight, and I am the director of the Innovation and Governance Program and a senior research fellow at the Mercatus Center at George Mason University. My research focuses primarily on the role technological innovation plays in financial services. Any statements I make reflect only my opinion and do not necessarily reflect the opinions of the Mercatus Center or my colleagues.

I would like to begin by thanking Chairman Crapo and Ranking Member Brown for their leadership in holding this hearing. The role of financial technology (or “fintech”) in changing the market for financial services is continuing to grow, with innovations permeating all financial markets. The importance of these technological changes is reflected by the fact that the Treasury Department chose to devote almost an entire report to the topic in its series of reports on core principles in financial regulation.[1] I also appreciate your assembling witnesses with a broad array of experiences and viewpoints for what I expect will be a productive discussion. I am honored to be part of it.

Given the limited amount of time, I have focused my testimony on a handful of areas centered on the collection, aggregation, and use of data. I am happy, however, to answer any other questions you may have to the best of my ability.

I want to leave you with three main points:

  1. Fintech innovation has significant potential to improve the quality of, and access to, financial services.
  2. While there are potential risks, these risks should be judged against the status quo, not an unobtainable perfection.
  3. Existing law can mitigate risk to some degree, and changes to the law should be considered only if existing law is proven to be inadequate and the benefits of changing the law will outweigh the costs.

THE POTENTIAL FOR A BETTER FINANCIAL SERVICES MARKET

Changes in technology have the potential to improve the financial services markets. Specifically, the collection, use, and aggregation of consumer data may allow consumers to enjoy more choice, more competition, and higher-quality services. Likewise, the use of artificial intelligence, machine learning, and other advanced algorithmic techniques to process data presents the possibility of more accurate, fair, and inclusive underwriting and risk management.

While there are reasons to be excited, there are also potential risks. More granular data collection and broader access might increase the risk and harm of data breaches to consumers. There are concerns that the enhanced use of algorithms may lead to more discrimination, a lack of transparency, or diminished access to essential services like credit.[2] There are also fears that the existing legal and regulatory environment is unable to address the risks introduced by technology.

While these concerns merit consideration and the risks they describe should be monitored, it is premature to panic. First, the early data are promising, in many cases finding that financial technology and the competition and innovation it fosters are improving financial services. Second, existing law and regulation might already mitigate some of the major risks. Although this area is often presented as a lawless Wild West, it is incorrect to think that it is unregulated. As discussed below, existing regulations apply, and in general, we should see how well the existing laws and regulations work with new technology before we impose new restrictions. Indeed, we should consider the possibility that we already have too much regulation affecting these new technologies. Otherwise we risk forestalling innovations that can lead to more competitive, efficient, and inclusive financial markets, to the detriment of the American consumer.

Data Collection

As the Treasury Report notes, the ability of financial service providers to collect and utilize a broader and more diverse selection of consumer data has the potential to improve the provision of financial services, especially to consumers who are poorly served by the status quo.[3] Not only could cost-effective access to more data help established firms improve their offerings, it could also encourage competition and innovation from new entrants.

While the ability to access and utilize more data has a significant upside, it also presents risks. For example, it is possible that the more granular a dataset a financial institution collects on a consumer, the more harm a security breach could cause. Data that might be relatively harmless at one level of detail could become highly sensitive at another. What could be labeled “professional or medical services” at one level of detail could be labeled “marriage counseling” at another. While obtaining more information could allow financial services providers to offer better products, we should also be alert to the risks that could develop.

Additionally, as the Treasury Department notes, there are divergent regulations at the state level regarding data security and breach notification.[4] These differing requirements can increase compliance costs for firms and leave consumers subject to rules adopted by states in which they have no political voice.[5] Given the predominantly interstate nature of cybersecurity, there is little question that Congress could constitutionally preempt state law to create consistent national standards, and given the costs of the status quo, it may want to consider doing so.

Data Aggregation

Third-party aggregators, acting on a consumer’s behalf, can now allow consumers to see all of their accounts from different financial services providers at a glance. This convenient display of information can help consumers more effectively assess and manage their finances. Aggregation can also be used by applications, again acting at the consumer’s request, to collect the financial data needed to provide the application’s service. Such applications are gaining in popularity; a recent survey conducted by the Clearing House found that about a third of banking customers use financial technology applications.[6]

While there are real potential benefits to data aggregation, the practice is not without controversy. Banks and other financial institutions have expressed concern that data aggregators, particularly those using “screen scraping,”[7] place consumers’ data at risk and potentially expose consumers to fraud and the bank to liability.[8] As the Treasury Department’s fintech report notes, the banks’ fears are not outlandish, as there is an open question as to the scope of the banks’ liability under existing law, even if the customer willingly granted access to a third party that was responsible for the data breach.[9]

This concern is part of why section 1033 of the Dodd-Frank Act is so controversial. As the Treasury Department report notes, there is a plausible reading of the act (one that the Treasury endorses) under which financial institutions covered by Dodd-Frank must, subject to rules promulgated by the Bureau of Consumer Financial Protection (“Bureau”), make account records available in electronic form not only to consumers themselves but also to a consumer’s agent, including a fintech application.[10] Paired with potential legal liability, such a requirement would leave banks with few options to protect themselves.

Understandably, this presents some significant issues that the Bureau, and potentially Congress, should consider. Among them are the following:

  • The extent of the burden placed on covered financial institutions. Must a covered financial institution make data available to all comers, or may it place limits on the basis of safety or data security?
  • The standards for data transmission. As mentioned in the Treasury Report, there has been a shift from screen scraping to the use of application programming interfaces (APIs), which may provide a more secure method of communicating data. However, there is no mandatory standard to ensure interoperability. While there are ongoing industry efforts to bring standardization,[11] questions remain as to whether covered financial institutions must accommodate all requests and who will set the standards for data transmission methods.
  • The scope of data transmission. One of the major concerns expressed by covered financial institutions is that data aggregators can obtain data in excess of what is needed to perform the service the consumer has authorized. Conversely, data aggregators express frustration that financial service providers prevent them from accessing needed data via provider-approved APIs.[12] While the availability of more data may allow applications to offer better services, it could also increase consumer harm in the event of a breach. The scope of data that aggregators will be able to obtain from financial institutions, and what factors control that scope, will need to be determined.
  • Consumer control of data transmission. How much control consumers will have over the data aggregators obtain, and how that control must be exercised, will need to be determined. According to the same Clearing House survey, a majority of consumers would like to be required to provide explicit consent to any third party seeking data.[13] However, what that might look like in practice (e.g., when that consent must be provided or how granular the consent must be), and whether that standard is even practical, remain to be determined.
  • Liability for data breaches. As the Treasury Report discusses, there is a question regarding the scope of liability for a financial institution in the event consumer data is lost owing to a failure on the part of a data aggregator or a downstream application. Financial institutions fear that they will ultimately be forced to compensate customers, even if the financial institution was not at fault, because the aggregator or application lacks sufficient resources to make aggrieved customers whole. This concern is heightened if financial institutions are forced to make data available to aggregators, rather than choosing to enter into contracts that allow the financial institutions to perform due diligence and make demands of the aggregator.

If the Bureau adopts the Treasury Department’s view regarding section 1033, it will need to craft a rule that provides meaningful access while addressing the legitimate concerns of covered financial institutions. However, the Bureau should also leave as many of the details as possible to market participants so as not to impede innovation or risk enshrining requirements that will become outdated or suboptimal far faster than the regulatory process can adapt. Congress should monitor these developments to determine whether any subsequent adjustment is necessary.

Innovative Underwriting

As the Treasury Department notes, credit underwriting is one area where data, in conjunction with artificial intelligence, are being used to potentially great effect. There is optimism that algorithmic underwriting may increase inclusion and improve the quality of underwriting, making it more accurate and efficient. However, there are also concerns that it could exacerbate discrimination and exclusion, because the algorithms may entrench existing discrimination or be so opaque that humans lose the ability to discern what is driving the algorithm’s results, preventing them from excluding improper variables.[14] These concerns are particularly acute with regard to unintentional discrimination through the use of facially neutral variables that nonetheless have a “disparate impact” on protected classes of persons.

While these concerns should be taken seriously, there are also reasons to believe they are at least somewhat overstated. First, it must be remembered that the appropriate standard to judge innovative underwriting is not perfection. Rather, we should judge whether it is an improvement over the status quo. In this regard, there is evidence that innovative underwriting may prove to be less discriminatory than current practices. Second, there are reasons to believe that the current legal and regulatory environment for financial services may be well situated to mitigate these risks.

As Professor Anupam Chander points out, there are several reasons why algorithms may prove to be less prone to discrimination than human decision-making. To the extent that discrimination is driven by subconscious or unconscious bias, those biases are less likely to survive the process of being written down in an intentional underwriting algorithm than to shape a “gut decision” by a lending officer.[15] Additionally, to the extent there is concern that algorithms may present a “black box” that cannot be audited, they nonetheless present less of a black box than the human mind.[16] Further, to the extent human decision-making relies on inaccurate stereotypes, algorithms with access to more and better data, and without that baggage, may be able to do a better job.[17]

Early evidence of the use of innovative underwriting is promising. For example, researchers at the Federal Reserve Banks of Chicago and Philadelphia looked at a leading marketplace lender’s use of innovative underwriting and found that the lender was able to offer many borrowers better rates than they would have received from a traditional lender. These loans also seemed to age reasonably well, indicating that the underwriting did not present an undue risk of default.[18] Likewise, scholars at the University of California, Berkeley, found evidence indicating that fintech lenders using innovative underwriting for mortgages were significantly less likely to discriminate on the basis of race than traditional lenders.[19] While we are still in the early days and more research is necessary, there are good indications that innovative underwriting, as applied, may have significant benefits.

Additionally, certain existing regulatory requirements may encourage firms developing innovative underwriting tools to avoid some of the concerns expressed by pessimists. For example, while there are concerns about the opacity of algorithms, the Equal Credit Opportunity Act and the Fair Credit Reporting Act require lenders to be able to provide prospective borrowers with adverse action notifications explaining why the borrower was denied or charged a higher rate and detailing the information the lender used to make that determination.[20] Complying with this requirement will be difficult if the lender’s algorithm is truly opaque, giving lenders an incentive to maintain auditability and explainability.[21]

Further, while lenders have an economic incentive to ensure that their algorithms are accurate and not irrational, there are also existing regulatory reasons to do so. To the extent that underwriting algorithms generate lending decisions that create the “artificial, arbitrary, and unnecessary barriers” that disparate impact theory is meant to address,[22] the lender may, depending on the circumstances and the applicable statutes, face liability for decisions that rely on facially neutral criteria but have a disparate impact on protected classes of borrowers, unless those decisions serve a legitimate business purpose that cannot be accomplished through less discriminatory means. While lenders have a strong profit motive to make certain their underwriting is as accurate as possible, potential liability should also encourage lenders to actively monitor and improve their algorithms.

CONCLUSION

The advance of technology has shown significant promise for improving the market for financial services. Specifically, the collection, aggregation, and use of consumer data have significant potential to allow consumers to enjoy the benefits of a more competitive and innovative market. Of course, there is no such thing as a free lunch, and increased risks may accompany the benefits. However, at present there is no reason to panic, and rash regulatory intervention may frustrate pro-consumer innovation, leaving consumers worse off.

Congress should carefully monitor and evaluate developments in the fintech arena and intervene only when existing law and regulation, as well as market discipline, prove inadequate to address a problem and where the costs of intervening would not be worse than the problem the intervention seeks to solve. When Congress does intervene, it should do so in a technologically agnostic manner and refrain from imposing specific technical requirements on market participants, because such solutions are likely to become obsolete in short order.

A specific area Congress may want to monitor is whether concerns about potential liability are chilling innovations in underwriting that might otherwise benefit society. Congress should consider tools such as “regulatory sandboxes,” which can allow firms to experiment in a way that encourages innovation while maintaining appropriate consumer protection. While some regulators have announced their intention to undertake such activities under their existing authority, the fragmented nature of financial regulation may require Congress to provide sufficient authority to allow for meaningful experiments.

Another area for congressional consideration is whether the current allocation of regulatory authority regarding data security and breach notification is appropriate. As mentioned earlier, the laws governing data security and breach notification, especially those at the state level, may be unduly burdening market participants and forcing consumers to bear the costs of rules they had no say in adopting. Congress should therefore consider whether establishing consistent, preemptive federal standards would be appropriate.

Technology presents the opportunity for market actors to more effectively gather, aggregate, and use data to provide customers with better, cheaper, and more effective financial services. While there are potential risks that should be monitored, there is also the potential for significant benefits. Intelligent regulatory choices, including the possibility of exercising forbearance, can help create an environment where consumers enjoy the maximum benefits of innovation and competition while receiving adequate protection.

Thank you again for the invitation to testify. I look forward to your questions.


[1] Steven T. Mnuchin & Craig S. Phillips, U.S. Dep’t of the Treasury, A Financial System That Creates Economic Opportunities: Nonbank Financials, Fintech, and Innovation (2018) [hereinafter Treasury Report].

[2] See, e.g., U.S. Fed. Trade Comm’n, Big Data: A Tool for Inclusion or Exclusion 8–11 (2016) (summarizing findings of public workshop on big data regarding potential risks).

[3] Treasury Report, supra note 1, at 17.

[4] Treasury Report, supra note 1, at 39–41.

[5] For further discussion of the potential costs of state-by-state regulation on fintech, including the costs of inefficiency and political inequity among citizens of different states, please see Brian Knight, Federalism and Federalization on the Fintech Frontier, 20 Vand. J. Ent. & Tech. L. 129, 185–99 (2017).

[6] The Clearing House, Fintech Apps and Data Privacy: New Insights from Consumer Research 4 (2018).

[7] Screen scraping generally refers to an aggregator using a customer’s login credentials to log into a financial institution’s webpage on behalf of the customer and extracting data from the webpage.

[8] See, e.g., The Clearing House, Ensuring Consistent Consumer Protection for Data Security: Major Banks vs. Alternative Payment Providers (2015).

[9] Treasury Report, supra note 1, at 35–36.

[10] Treasury Report, supra note 1, at 31.

[11] See, e.g., NACHA, API Standardization - Shaping the Financial Services Industry (2018) (discussing efforts by NACHA to develop standards for financial services APIs to allow interoperability).

[12] Treasury Report, supra note 1, at 34.

[13] The Clearing House, supra note 6, at 7.

[14] Treasury Report, supra note 1, at 57–58.

[15] Anupam Chander, The Racist Algorithm?, 115 Mich. L. Rev. 1023, 1028 (2017).

[16] Id. at 1030.

[17] Id.

[18] See Julapa Jagtiani & Catharine Lemieux, Fintech Lending: Financial Inclusion, Risk Pricing, and Alternative Information (Fed. Res. Bank of Phila., Working Paper No. 17-17, 2017); Julapa Jagtiani & Catharine Lemieux, The Roles of Alternative Data and Machine Learning in Fintech Lending: Evidence from the Lending Club Consumer Platform (Fed. Res. Bank of Phila., Working Paper No. 18-15, 2018).

[19] See Robert P. Bartlett, Adair Morse, Richard Stanton & Nancy Wallace, Consumer Lending Discrimination in the Fintech Era (2018).

[20] Matthew Bruckner, The Promise and Perils of Algorithmic Lenders’ Use of Big Data, 93 Chi.-Kent L. Rev. 1, 38–39, 51 (2018).

[21] Id. at 40.

[22] Tex. Dep’t of Hous. & Cmty. Affairs v. Inclusive Cmtys. Project, Inc., 135 S. Ct. 2507, 2522 (2015).