A core purpose of a cost-benefit analysis in the regulatory process is, quite plainly, to avoid overly costly rules. But such analyses can also validate regulatory policy. That appears to be the case with proposals to toughen the leverage ratio.
Determining whether regulations truly solve a problem as intended is notoriously difficult. But when regulators are required to do a cost-benefit analysis in concert with a pending regulatory change, it can help ensure that the regulation’s benefits at least cover, if not exceed, the costs of the rule.
In a new research paper, we broadly apply the framework developed by staff at the Bank of England to measure the costs and benefits of having banks operate with relatively more equity and less debt. Specifically, we examine the costs and benefits of increasing the equity-to-asset leverage ratio for U.S. banks from 4%, which was the 2014 standard, to 15%.
We chose 15% because a recent study shows that under certain conditions, a 15% leverage ratio can be optimal. A variant of this standard was also proposed in a 2013 bill by Sen. Sherrod Brown, D-Ohio, and then-Sen. David Vitter, R-La. The leverage ratio we selected also exceeds the 10% ratio proposed in the Financial Choice Act, although the Choice Act — which is forming the basis of efforts to roll back the Dodd-Frank Act — allows banks to choose a tough leverage ratio requirement in exchange for regulatory relief.
The benefits of a higher leverage ratio include a lower probability of a banking crisis, since a better capitalized bank is more likely to withstand large shocks that lower the value of its assets. The costs arise from the fact that equity capital may be more expensive to a financial institution than debt. Banks may pass those higher funding costs on to borrowers, which would result in less capital formation and a lower real gross domestic product. However, we find that in almost all cases, the benefits of a more stable financial system outweigh the costs.
To estimate the benefits, we rely on annual data from 1892 to 2014 on U.S. banking crises and the aggregate capital-to-asset ratio for the entire U.S. banking system. Specifically, we find that increases in the capital-to-asset leverage ratio are associated with declines in the probability of a banking crisis.
Other factors also influence the estimated benefits. For instance, we find that crises have occurred frequently in U.S. history, but on average they have had a shorter duration and a lower cost per crisis, equal to about a 4.5% reduction in annual GDP. The Bank of England study, by contrast, found a roughly 10% reduction across a broader set of countries. This implies that the benefits of eliminating banking crises may be lower in the U.S. for any given increase in the capital-to-asset leverage ratio.
To estimate the costs, we use quarterly data for all U.S. bank holding companies with at least $1 billion in total assets between 1996 and 2014 to show how increasing the leverage ratio changes the banks’ weighted average cost of capital. The pass-through from a higher bank capital cost to borrowers depends, among other things, on whether a bank can deduct the interest it pays on debt from its taxes, as well as on the portion of corporate funding that comes from bank loans.
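To see the funding cost channel at work, consider a stylized weighted average cost of capital calculation. The required returns, tax rate, and balance-sheet shares below are hypothetical placeholders chosen only to show the mechanics, not estimates from our paper:

```python
def wacc(equity_share, cost_equity, cost_debt, tax_rate):
    """After-tax weighted average cost of capital.

    Interest paid on debt is tax-deductible, so its effective cost
    is reduced by the tax rate; returns paid to equity are not.
    """
    debt_share = 1.0 - equity_share
    return equity_share * cost_equity + debt_share * cost_debt * (1.0 - tax_rate)

# Hypothetical inputs: 10% required return on equity, 5% on debt, 35% tax rate.
before = wacc(0.04, 0.10, 0.05, 0.35)  # at a 4% leverage ratio
after = wacc(0.15, 0.10, 0.05, 0.35)   # at a 15% leverage ratio
print(f"WACC rises from {before:.4%} to {after:.4%}")
```

Because the tax code subsidizes debt, shifting the funding mix toward equity raises the blended cost of capital in this sketch; how much of that increase reaches borrowers then depends on the pass-through assumptions described above.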
By varying the assumptions, we get 256 different optimal capital ratios, each based on a unique combination of assumed parameter values. We find that in 97% of the cases, the benefits at least cover the costs. The only cases in which this does not hold are those combining extremely high cost assumptions with correspondingly low benefit assumptions.
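The shape of this sensitivity analysis can be sketched as follows. The parameter names, the grid values, and the simple benefit and cost formulas are all illustrative assumptions for exposition; they are not the model or the numbers from our paper:

```python
from itertools import product

# Hypothetical grid: four values for each of four parameters -> 4**4 = 256 combos.
crisis_prob_drop = [0.005, 0.010, 0.015, 0.020]     # annual fall in crisis probability
cost_per_crisis = [0.020, 0.030, 0.045, 0.060]      # GDP lost per crisis (fraction)
wacc_increase = [0.0005, 0.0010, 0.0015, 0.0020]    # rise in banks' funding cost
pass_through = [0.25, 0.50, 0.75, 1.00]             # share passed on to borrowers

combos = list(product(crisis_prob_drop, cost_per_crisis, wacc_increase, pass_through))
favorable = 0
for dp, crisis_cost, dwacc, pt in combos:
    benefit = dp * crisis_cost         # expected annual GDP saved by fewer crises
    cost = dwacc * pt * 0.10           # assumed elasticity of GDP to lending rates
    if benefit >= cost:
        favorable += 1

print(f"{len(combos)} combinations; benefits cover costs in {favorable} of them")
```

The point of the exercise is robustness: rather than defending any single parameter choice, the conclusion holds across nearly the entire grid of plausible assumptions.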
The “too big to fail” problem has been a long time in the making. Our findings suggest that replacing the current capital adequacy standards with a simpler, higher leverage ratio shows promise for ending “too big to fail.”