The State of State Data Laws, Part 3: Biometric Privacy Laws and Facial Recognition Bans

Today, you can clock into an office with your fingerprints or unlock your phone with just your face. As the potential applications of biometric technologies expand, so do concerns about how they can be abused, particularly when it comes to privacy.

While federal policy on limiting biometric technologies like facial recognition is still at the discussion stage, some state and local governments have already passed rules governing biometric data or its applications, and others are considering it. Illinois, Washington, and Texas all have existing laws governing privacy rights for biometric information and its use by private entities. Other states, such as Montana, have considered creating similar legislation in the most recent legislative term.

Biometric information is often thought of as particularly sensitive and difficult, if not impossible, to change. Those in favor of regulating such technology often point to the ways in which it could be abused, and call for bans or restrictions on its deployment by certain actors.

However, current biometric privacy laws show that a broad approach to regulating this technology in the name of privacy may have unintended consequences and could remove beneficial uses of the technology as well.

Current state biometric privacy laws have prevented residents from accessing the benefits of certain technologies available in other states. For example, Illinois and Texas residents were unable to use Google Arts & Culture’s “art twin” match. While this may not seem like a real inconvenience, it illustrates how strict interpretation and legal challenge could remove consumers’ choice to use benign or beneficial technologies.

Similar applications could help users easily confirm their identity or improve outdated security or attendance systems—that is, unless they live in Illinois or Texas. Innovators may skip those states altogether in favor of a more welcoming regulatory environment.

Discussions of biometric and genetic data privacy can easily conjure dystopian science fiction visions like the film Minority Report. However, focusing on the fears associated with this type of data can lead one to neglect the benefits of these technologies. While critics focus on the potential abuse of this information, the technology also has promising private sector applications, such as smart door locks that spare users from fumbling for a key or quicker boarding of an airplane.

As with most privacy concerns, consumers have a wide range of preferences when it comes to balancing privacy and the potential benefits of biometrics. In private interactions, different consumers may make different choices, including which companies they are willing to provide information to and when allowing access to this data is worth the benefits.

My Mercatus colleague Andrea O’Sullivan discussed how the company Clear tries to provide speedier airport security using biometrics. She observed that some individuals may find the tradeoffs of a more rapid security clearance using biometrics beneficial while others are more privacy-sensitive about giving a private company such information.

Yet broad laws requiring increasingly formal consent can prevent consumers and businesses from having these options and making these decisions themselves, and innovators may find it easier not to offer their products in those states than to engage in costly compliance.

Recently in Illinois, a court found that it was not necessary under the biometric privacy statute to prove harm had occurred for a lawsuit to proceed against Six Flags for collecting fingerprints from a minor for an annual pass without a parent’s consent. Such private rights of action increase liability concerns and could further raise legal costs, particularly for smaller companies.

There are legitimate concerns about biometric technologies. Some fear that the state could abuse facial recognition capabilities to further surveillance goals, for instance. This threat is especially acute considering that such abuse has already occurred in some countries with totalitarian regimes.

But rather than all-out bans, we should adopt a more targeted approach to address concerns about specific applications that are more prone to abuse. This would avoid many of the unintended consequences discussed earlier.

For example, as Cato scholar Matthew Feeney suggests, governments could consider placing restrictions on the applications most likely to harm civil liberties or enable state surveillance, such as real-time facial recognition tracking. This approach would allow state and local governments to utilize beneficial applications, such as finding missing children with facial recognition, while minimizing potentially abusive applications.

While laws targeting biometrics may appeal to those envisioning a dystopian future, these policies bring adverse spillover effects onto benign applications. As a result, policymakers should carefully consider what limitations on biometric technologies are needed and tailor their policies to address specific concerns.

Photo credit: ERIC PIERMONT/AFP/Getty Images