The Dangers Posed by Behavioral Economics

Last week, Richard Thaler won the Nobel Prize in economic sciences for his pioneering work in the field of behavioral economics. His research applies insights from a different field — psychology — by focusing on various “cognitive biases” that explain how people’s behavior deviates from that of the purely rational beings in economic models.

Dr. Thaler’s contributions to the field of economics should be celebrated. However, the value of behavioral economics in a public policy context is more nuanced. There are some clear benefits, but there are dangers as well.

An example of a cognitive bias is the “endowment effect.” After walking past a furniture shop, you might be willing to pay up to $100 for the table you see in the shop window. But after you’ve taken it home for a week, you refuse to sell the table to your neighbor for anything less than $200. In other words, ownership sometimes changes the value that people place on items. Such behavior seems to defy what traditional economic models would predict, which is that an item’s usefulness to a person is what matters, rather than who owns it.

Countless quirks like this have been identified by researchers, and governments are beginning to take advantage of them when designing policies. One policy application of behavioral economics is to make enrollment in certain programs, such as organ donation, automatic. People can always opt out if they want to, but if people are enrolled by default, more tend to participate.

In response to such findings, former President Obama issued an executive order in 2015 encouraging government agencies to look for ways to incorporate behavioral science insights into their policies and programs. He created a Social and Behavioral Sciences Team with a similar mission.

Alongside some positive policy developments, however, there are warning signs. Behavioral economists such as Dr. Thaler are quick to criticize the assumption that people act rationally. But, ironically, many of these same scholars hold up perfect rationality as an ideal against which real-life human decisions should be judged. Any deviation from perfect rationality, the logic goes, is grounds for government regulation. Since no one is perfectly rational, it would seem to follow that virtually any decision people make could justify regulation.

Consider a bachelor deciding whether to propose to his long-time girlfriend. Several cognitive biases will make it more likely that he proposes, even if it is a poor decision. The bachelor might be overly attached to his girlfriend due to the endowment effect. He might be too optimistic about the prospects of marriage, given that so many marriages end in divorce. Or, he might choose to marry his girlfriend out of sheer inertia, when in fact he should be exploring his options. 

But we should not forget about the cognitive biases that may discourage the bachelor from proposing, even if marriage is actually in his best interests. He might remember his own parents’ divorce, a salient event in his life clouding his judgment. He might place too much weight on what he loses in marriage, such as the freedom to date other women, and downplay the gains from having a life partner and perhaps also a family. Or, he might simply procrastinate and delay proposing, putting his relationship at risk and postponing the benefits of marriage.

Whatever choice the bachelor makes, one can cherry-pick from a list of more than 180 cognitive biases to argue that it was a bad decision.

Sound like a silly example? Perhaps, but regulators and academics are already criticizing the public’s decisions in a similar manner. Behavioral rationales have been put forth to justify the regulation of everything from payday loans to sugary drinks and even home appliances. 

Behavioralists often point to “present bias” — putting too much emphasis on the present relative to the future — in such cases. In the context of payday loans, borrowers might later regret agreeing to a high interest rate. With sugary drinks, Coca-Cola drinkers might downplay the health repercussions of obesity. With home appliances, consumers may balk at paying more upfront for energy-efficient models even when lower utility bills would save them money in the long run.

These arguments may have some validity. But which behavioral biases are being ignored in these cases? Perhaps some people have an irrational aversion to debt. Perhaps some are overconfident that future energy prices will be high. Cognitive bias is real, but no regulator has all of the pertinent information about people’s purchasing decisions.

Take, for example, recent regulations setting energy and fuel efficiency standards for appliances and automobiles. Energy usage is certainly a legitimate concern, but according to the federal government’s own analyses, these rules will produce negligible environmental benefits for Americans. They are justified instead primarily because they “correct” the supposed irrationality of U.S. consumers and businesses.

President Trump would be wise to set boundaries on the government’s use of behavioral economics. President Obama’s executive order failed to make the important distinction between using behavioral science insights to address real problems in the marketplace — such as a shortage of organs available for transplants — and using behavioral findings to correct supposed cognitive biases found in the public. The president could repeal this order, or he could ask the Office of Management and Budget to issue guidance that clarifies this distinction.

Behavioral economics has made valuable contributions to the social sciences, which is why Richard Thaler’s Nobel is well-earned. But this research also poses dangers, depending on how it is applied. The government should put reasonable limits on the use of behavioral economics to help prevent the abuses that will inevitably occur in the hands of overzealous regulators.