States Should Think Twice About Data Privacy Policy Making

California Governor Gavin Newsom recently made waves with his State of the State address proposal for “digital dividends.” Taking the “data is the new oil” trope to its literal extreme, the policy would require companies to “share the wealth that is created” from user data, much as residents of certain resource-rich regions receive dividends from natural resources.

This is just the latest state-level proposal to regulate data use. But do these state rules actually protect their citizens, or might they create more problems than they solve?

Why Are States Addressing Data Privacy?

Data privacy has been a hot topic of debate over the last year or so due to regulatory actions like Europe’s General Data Protection Regulation (GDPR) and incidents like the Cambridge Analytica scandal. Amid the growing chorus of “something must be done,” federal, state, and even city governments have proposed a number of responses.

Some states, in particular, aren’t waiting for the federal government to act on privacy.

Last summer, California passed the California Consumer Privacy Act (CCPA), which addresses data use by certain companies and affects users both inside and outside of state borders. This GDPR-style regulation would require companies to disclose their data practices, delete information on demand, and allow users to opt out of certain data uses, with civil penalties for noncompliance.

Other states, like Mississippi and Massachusetts, are considering their own data privacy bills modeled on the restrictive approach of California and Europe. Illinois, Texas, and Washington previously enacted legislation governing user biometric data like fingerprints or iris scans.

As is generally the case with fast-moving technology, both concerns and innovation outpace policymaking. This “pacing problem” is often more pronounced at the federal level, particularly when it comes to legislation. Some government agencies have used “soft law” approaches to provide more adaptive governance frameworks for rapidly advancing technologies. But in other cases, more nimble state governments have been able to enact policy impacting these disruptive innovations—for better and for worse. In the case of data privacy, a patchwork of state privacy laws would likely be a change for the worse.

The State of State Data Privacy Proposals in 2019

Many state proposals mirror the CCPA that passed last year, but other more novel policies are on the table. Some are even bolder than California’s already-controversial rules.

For example, Hawaii’s data privacy proposal does not have a carve-out for any business based on size, meaning that a small entrepreneurial start-up would have to comply with the same set of requirements that a large, established business would. New Mexico is also considering a broad data privacy proposal modeled after the CCPA; its sponsor stated that it is intended “to throw as much spaghetti on the wall” as possible.

Other policy ideas target specific concerns.

For example, many proposals would raise the age of consent for data collection from 13 to 16. While such changes are well-intentioned, they remove choices from families and can even impact users who are now adults.

Young teenagers have a lot to gain from data-driven platforms; they can keep in touch with far-off family, watch videos on YouTube to help with class, or even help Mom and Dad with research on a home project or travel plans. It’s not surprising, then, that many parents have encouraged and helped their children to get around existing age requirements for social networking and other data-driven websites.

Does government know what’s better for children than their own parents do? Should these arrangements be legally proscribed? While raising age requirements may be well-intentioned on the part of the regulators proposing such changes, heightened requirements can make such workarounds more likely, particularly for families that rely on online services for certain products.

For example, the encrypted messaging service WhatsApp was forced to raise its age requirements in the EU because of the GDPR. As a result, families who want their teenage children to have their own account either have to lie about the child’s age or seek out costly alternatives like international calling.

Adults who may have otherwise used products at a younger age are also affected. Because of GDPR requirements, Twitter blocked the accounts of many young adult users who were suspected of being 13 or younger when they initially joined. Newer companies that listed their founding date as their “birthday” also got the axe.

While each proposal is unique and some have more clear consequences than others, any state law governing data privacy would contribute to a growing patchwork that could become a compliance quagmire for new innovators.

The Problems with State Data Privacy Law

The internet by its nature transcends traditional borders. While this can be a powerful force for bringing together ideas and people who may never have interacted, it also means that state attempts to regulate aspects of the internet can have an impact far beyond their borders.

When states adopt data privacy laws, they create additional barriers that can limit innovation and entrepreneurship. This is true not only for global tech giants but for local, small businesses for which an online platform could provide a springboard to success.

As Andrea O’Sullivan and Adam Thierer have pointed out, a single state law could impact the ecosystem of the digital economy much like a tariff. Each new state data privacy law can act as a barrier that raises the costs of expanding to that state. For small businesses, the cost of complying with numerous requirements each time a product expands to a new state can deter innovation or expansion. As a result, such regulations can often deter new entrants and further calcify the market power of existing giants.

But the consequences of state data privacy rules do not just impact business decisions; they also limit what’s available to consumers. With fewer choices available, state data privacy laws could undermine consumer welfare by foreclosing better or more innovative options.

Vague and broad requirements can erode consumer privacy. For example, as law professor Eric Goldman has written, the CCPA could be interpreted to require businesses to publicly disclose certain consumer data, to collect more consumer data, or to store such data in more easily identifiable ways in order to handle consumer requests. When 1,700 voice recordings were sent to the wrong person after Amazon received a GDPR request, neither the recipient nor the requester likely felt that their data were better protected by this set of stringent requirements.

Even more limited laws can eliminate choices for consumers in unexpected ways. For example, many people think of Illinois’s Biometric Information Privacy Act (BIPA) as a rule regulating the use of things like fingerprints and iris scans, but consumers may have been surprised and disappointed to learn that BIPA prevented them from using Google’s Arts & Culture selfie-matching feature or certain Facebook photo features.

What States Should Consider Doing Regarding Data Privacy

It is clear that states feel pressure to act on their citizens’ concerns. But many of these concerns reflect widely varied preferences, so policymakers at all levels of government should think carefully before enacting a one-size-fits-all policy that could eliminate choices for consumers and opportunities for innovators and entrepreneurs.

Rather than regulating away choices, states could educate consumers about the ways they can protect their privacy. As Andrea O’Sullivan notes, many data privacy proposals do not address the fundamental knowledge gap between data standards and individual practices.

Before passing any new regulations, states should determine whether existing regulations are adequate. All fifty states have a law regarding notification for data breaches. While the requirements and remedies associated with the laws vary, educating consumers about these notice requirements and what happens in such events could assuage many concerns behind the headlines.

Finally, we should recognize that different choices regarding data privacy do not signal a lack of awareness of the risks. Studies have shown that many consumers, particularly heavy users, understand that personal data are collected in exchange for services. In fact, one of the benefits of a permissionless approach is that it allows individuals to make choices that reflect their own preferences, rather than forcing the most restrictive preferences on everyone. So while one person may deploy ad blockers and opt out of targeted ads, another can choose to let a company use their data in exchange for products or a more personalized experience. Rather than restricting the ways in which consumers and services interact, we should let individuals customize their internet experiences to match their personal privacy preferences.

State laws governing data privacy could undermine the very freedom that has made the United States the foremost global innovator on the internet. A patchwork of state regulation would institute a more limiting, highly-regulated environment based on the policy choices of a few states. Unfortunately, when states pursue data privacy policies, they often create more problems than solutions.