How to Promote Data Privacy While Protecting Innovation

Our digital age spawns marvels and mayhem. News stories extolling the latest gadget or online service may only slightly outnumber exposés of data privacy abuses.

Genetic testing services provide a timely case in point.

It’s hard to turn on a screen these days without seeing an advertisement for an at-home heritage and health analysis of your DNA. 23andMe, a pioneer in these services, recently fine-tuned its ethnicity estimates to more specific regions, allowing even more insight into our ancestries.

Private genetic testing services are not without their dark side. Earlier this month, FamilyTreeDNA admitted to its customers that it had been sharing DNA data with law enforcement without user permission. Police have been able to use genetic data from FamilyTreeDNA and the open database GEDmatch to nab criminals decades after their cases went cold.

These are obviously great developments for grieving families seeking justice. But such incidents also raise questions about the ethics and legality of third-party access to genetic data that can be linked to a person without their submission or consent.

Genetic testing services constitute just a small part of our digital landscape, and therefore a mere sliver of the broader data privacy debate. How can company behavior be aligned with users’ privacy preferences? What remedies do or should consumers have against data malfeasance by companies? What role should regulation play? What should we do when one person’s definition of “privacy” contradicts another’s conception of “consumer choice”?

Mercatus scholars have developed a body of research answering these questions and more.

Whose Rights Are Right?

References to our “rights to privacy” are often taken for granted in discussions of platform malfeasance. In Europe, this may be more warranted, as rights to privacy from private entities are often enshrined in law (which can create logistical problems and economic distortions, as we’ll discuss later).

In the US, the legal right to privacy is generally understood in relation to government bodies, as an outgrowth of protections against things like unreasonable search and seizure and self-incrimination. When commentators try to apply these rights to private bodies, as is common in many privacy debates, conceptual problems emerge.

The internet has dramatically decreased the cost of accessing information. This is generally regarded as a good thing, and most people are wary of “censoring” the internet. Yet government-erected barriers to what private entities can do with legitimately received information are effectively a call for censorship.

In other words, one private entity’s “right to privacy” could be another private entity’s “obligation to self-censor.”

This principle is further complicated by the fact that no one can agree on what “privacy” means. Mercatus senior research fellow Adam Thierer has pointed out that even seasoned privacy scholars struggle to agree on a single concrete definition, describing the concept instead as a “conceptual jungle” that is in “disarray.”

This is easy to observe in our everyday lives. Some people may be delighted by facial recognition software that makes it easier for them to sort and recall personal photographs. Others may be appalled at what they feel is a grave violation of privacy. A rule that preemptively bans such software would violate the first person’s rights to freely choose that service. An environment that affords affected individuals no recourse would prohibit the second person from opting out of such a service.

So the challenge of protecting privacy is necessarily multifaceted and therefore tricky. Policies that approach privacy from a one-size-fits-all perspective are inadequate to address the needs of our diverse population.

The Problem with Precautionary Privacy Regulation

The one-size-fits-all regulations usually proffered to resolve privacy matters are based on the precautionary principle. Adherents to this policy perspective advocate preemptively curtailing or banning new uses of technology until suspected harms are proven false or neutralized.

In the case of privacy regulations, this can mean banning or fining certain information uses, requiring that private entities cease the use of information on demand, or mandating that private entities seek permission before rolling out new uses of stored information.

Thierer discusses the precautionary tendency in proposed and enacted privacy legislation in a Maine Law Review article titled “Privacy Law’s Precautionary Principle Problem.” He writes:

“The problem with letting such precautionary thinking guide policy is that it poses a serious threat to technological progress, economic entrepreneurialism, social adaptation, and long-run prosperity. If public policy is guided at every turn by the precautionary principle, technological innovation is impossible because of fear of the unknown; hypothetical worst-case scenarios trump most other considerations.”

Had precautionary thinking guided the development of the internet, it probably would never have been allowed to take off at all.

After all, bad things happen on the internet. People get hacked and scammed every day, or worse. Two decades ago, tech critics could have used these same pretexts to restrict nascent networking technology to the original handful of academic and military bodies.

Fortunately, the early internet was not stifled by precautionary regulations. Rather, it was protected and fostered by the Clinton administration’s “permissionless” Framework for Global Electronic Commerce, which stipulated that “the private sector should lead [and] the internet should develop as a market-driven arena, not a regulated industry.”

Applying the precautionary principle to digital commerce seems absurd today. But this is because, with the benefit of hindsight, we can easily see the benefits of open online exchange. Many people wonder how they lived before it.

The allure of the precautionary principle is much stronger for technologies that are still young and going through growing pains. It is precisely in these situations that we must resist such knee-jerk prohibitions.

Case Study: The General Data Protection Regulation (GDPR)

Social media and search platforms, in particular, are often the objects of scorn—many times for good reason. The European Union’s expansive GDPR policies were enacted specifically to rein in what leaders saw as excesses by mostly American technology companies.

The EU may have hoped the GDPR would stand as an example of strong, straightforward pro-privacy legislation. Instead, it demonstrates the many unintended and counterproductive consequences that sweeping, poorly worded policies can produce.

The 88-page document requires that “all companies processing the personal data of subjects residing in the Union” follow specific breach notification, “right to be forgotten,” data portability, and privacy and security policies.

Yet much of the language is vague or undefined. For example, the definition of “personal data” is broad enough that it could cover any blog with comments from EU citizens. Even many of the EU bodies tasked with GDPR enforcement admit that they don’t really know what “compliance” looks like.

Despite this vagueness, the punishment for deviation is severe: fines can reach €20 million or four percent of global revenue, whichever is higher.

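To make the stakes concrete, here is a minimal sketch (in Python, with hypothetical company revenues) of the fine ceiling just described: the greater of a flat €20 million or four percent of global annual revenue.

```python
# Illustrative sketch of the GDPR's maximum-fine rule described above:
# the greater of EUR 20 million or 4 percent of global annual revenue.
# The company revenue figures below are hypothetical.

def max_gdpr_fine(global_annual_revenue_eur: float) -> float:
    """Return the statutory ceiling on a GDPR fine, in euros."""
    return max(20_000_000.0, 0.04 * global_annual_revenue_eur)

# A small startup with EUR 5 million in revenue still faces the flat floor:
print(max_gdpr_fine(5_000_000))        # 20000000.0 (EUR 20 million)

# A tech giant with EUR 100 billion in revenue faces the percentage cap:
print(max_gdpr_fine(100_000_000_000))  # 4000000000.0 (EUR 4 billion)
```

Notice that the flat €20 million floor binds hardest on small firms, for whom it can exceed total revenue many times over, while the largest companies face only a proportional cap.
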
This has effectively created a precautionary environment in which much innovation is simply too risky for entrepreneurs to undertake. They have no clear idea of what is allowed and what isn’t, and the cost of a “violation” is draconian. So innovators simply do not bother in Europe. And since the EU is such a large market, this chilling effect may spill over around the world.

It is not surprising that so many small online ventures and media outlets simply decided to shutter their doors rather than risk the GDPR hammer.

Ironically, the outlets most able to handle GDPR contortions are the tech titans that the EU wished to bring to heel. The Facebooks and Googles of the world have ample funds and lawyers to navigate regulatory complexity. The would-be Davids that could topple these Goliaths do not, and they are snuffed out before they have a chance to try.

The GDPR has thus served as a barrier to entry in technology, which means more market share for Facebook and Google. This is how the law earned its nickname: the Google Data Protection Regulation. On the first day the GDPR was in effect, Google raked in 95 percent of EU ad spending.

A Better Role for Government: Education, Investigation, and Redress

A regulation that ends up strengthening the entities it intended to discipline cannot be said to “protect privacy” in a meaningful way. This is not to say that government has no role to play, or that we will always be at the mercy of powerful corporations.

The challenge is to utilize government in a way that does not empower entrenched interests while providing useful information and redress for consumers.

In the United States, the Federal Trade Commission is the top federal cop on the privacy beat. The agency evolved into our nation’s primary data security watchdog by using its Section 5 authority to protect consumers from deceptive or unfair acts or practices.

In a public comment to the FTC, Thierer, Christopher Koopman, Jennifer Huddleston Skees, and I outline a flexible path for the FTC to promote good data practices while leaving ample space for innovation.

Rather than futilely attempting to outline “theories of everything” and proscribe specific “informational injuries” in advance, an approach that can dampen innovation, as the EU’s experience shows, the FTC and similar bodies should take a page from common law, which adapts readily to the needs of the digital economy.

Many of the data problems over which technology commentators agonize have well-developed bodies of analogous precedent in common law. For example, our comment discusses how concerns over “dataveillance” have a common law analog in the tort of intrusion upon seclusion. Adapting the wisdom of centuries of common law to new iterations of old problems is a cleaner, more flexible approach than the blunt hammer of sweeping legislation like the GDPR, and it is easier to iterate and tweak as needed.

The FTC is also empowered to investigate and punish bodies found to violate standards against deceptive or unfair acts or practices. This approach gives consumers an avenue for redress without stifling innovation through excessive punitive risk.

Government bodies also have important roles to play in educating consumers. Many of the “harms” discussed in the data privacy and security communities do not necessarily result from deception or unfair acts by firms. Rather, these harms may be the result of asymmetric information between firms and consumers, or a general lack of good data hygiene education.

This kind of knowledge gap is not addressed by legislation like the GDPR. But improving people’s understanding of general data standards and good data practices can improve behaviors and outcomes. Many agencies in the US put out educational backgrounders and alerts to help bridge this gap. These outreach efforts will only become more valuable as more of our lives are spent online.

The Bottom Line

Data privacy and security are valuable but tricky goals to attain. Yet we must take care that our policies do not quash the technologies that will make our lives better. Worse still, bad regulations can inadvertently empower the very actors they seek to bring in line.

This does not mean we must throw up our hands and live with the status quo. Rather, the challenge is to design an adaptive policy environment that addresses exploitation and deception while leaving ample space for competitors and innovation. Common law is one attractive remedy. Governments also have important roles to play in educating firms and consumers on best practices.

Privacy and innovation need not be at odds. They are essential to a future that provides both protection and growth.

Photo credit: Chip Somodevilla/Getty Images, Senate Commerce Committee Hearing on Consumer Data Privacy