Technological “Governance” Requires a Balanced Approach

Technological governance comes in many flavors, and some are more appetizing than others. Generally speaking, flexible and adaptive governance systems are the smart way to address concerns about fast-paced technological development without derailing entrepreneurialism and innovation.

“Hard” regulation is one form of technological governance and includes everything from calls to break up large tech firms to efforts to pass a law governing autonomous vehicles. In such instances, governance is framed as a black-or-white choice: either binding, top-down rules or nothing at all. This framing ignores the broad spectrum of governing mechanisms, sometimes referred to as “soft law,” that exists today. Soft-law approaches include multistakeholder arrangements, non-binding guidance and standards, agency workshops and reports, informal consultations, and third-party governance.

This essay focuses on the role third-party actors can and do play in the governance of emerging technologies, with an eye toward how those players and mechanisms are already being used to address concerns about artificial intelligence and robotics.

The private sector can use a range of tools and techniques in service of this goal, including certifications, best practices, ethical codes, industry standards, and a variety of other methods that can guide the behavior of innovators.

A Diverse Governance Toolkit

If you look on the back of whatever device you are reading this on, you are likely to see a UL symbol. UL stands for Underwriters Laboratories, a private company that “helps companies demonstrate safety, confirm compliance, enhance sustainability, manage transparency, deliver quality and performance, strengthen security, protect brand reputation, build workplace excellence, and advance societal wellbeing.” More than 21 billion UL Marks appear on products, and the organization maintains 1,614 current standards. UL inspects and certifies everything from toasters and microwaves to 3D printers and hoverboards, and its certification is seen as the standard of quality for technological innovations.

UL’s certification is valuable not because of a legal mandate, but because it serves as a signal of quality. Someone who looks at a device with a UL certification knows that it has been rigorously tested by a trusted organization. While that certification may not give the buyer legal recourse if the product fails to meet expectations, it greatly reduces the chance of choosing a risky product in the first place. And UL has strong incentives to employ a stringent testing process, because its signal is only valuable so long as its reputation remains associated with quality products.

Other, less well-known private certifications also serve key roles as informal governance mechanisms in many different sectors of the economy. For example, 79 percent of entertainment electrician employers encourage their employees to obtain a credential from the Entertainment Technician Certification Program. While most people do not know this certification exists, it governs many of the individuals who hang the lights over your head at a concert. Meanwhile, home technology installation and repair experts are certified through a rigorous four-part credentialing process administered by CEDIA (Custom Electronic Design & Installation Association).

Certifications sometimes take the form of transparency and labeling efforts, such as the non-GMO certifications issued by organizations like the Non-GMO Project or the kosher certification issued by the Orthodox Union. Meanwhile, parents have long relied on content ratings for movies and video games that are privately administered by the Motion Picture Association of America and the Entertainment Software Rating Board, respectively.

While industry standards and best practices are not binding legal requirements, they often affect a judge's determination of negligence when an accident occurs and a lawsuit follows. These third-party mechanisms have played an essential role in the governance of many different industries and are beginning to play an important role in other sectors.

Case Study: AI and Robotics

Many trade associations and private organizations work with government agencies to formulate best practices and codes of conduct for other high-tech sectors and issues. In addition to UL, some of the most active groups in this regard include the Association for Computing Machinery (ACM), the Institute of Electrical and Electronics Engineers (IEEE), and the International Organization for Standardization (ISO). Recently, these organizations have turned their attention to the various policy concerns surrounding AI, machine-learning, and robotic systems.

For example, the ACM developed a Code of Ethics and Professional Conduct, which has been updated over time to reflect changing needs and incorporate emerging technologies. The latest version of the ACM Code “affirms an obligation of computing professionals, both individually and collectively, to use their skills for the benefit of society, its members, and the environment surrounding them,” and insists that computing professionals “should consider whether the results of their efforts will respect diversity, will be used in socially responsible ways, will meet social needs, and will be broadly accessible.” The document also stresses:

“An essential aim of computing professionals is to minimize negative consequences of computing, including threats to health, safety, personal security, and privacy. When the interests of multiple groups conflict, the needs of those less advantaged should be given increased attention and priority.”

Others have worked on developing industry standards, codes of conduct, and best practices that innovators can use when developing new technologies. These are some of the efforts already underway in the field of artificial intelligence and machine learning:

  • The IEEE’s “Ethically Aligned Design” project is an effort to craft “A Vision for Prioritizing Human Wellbeing with Artificial Intelligence and Autonomous Systems.” IEEE already has over 420,000 members spanning more than 160 countries. Its goal is to ensure that innovators keep five principles in mind when developing AI: human rights, better well-being metrics, designer accountability, systems transparency, and efforts to minimize misuse of these technologies. IEEE’s most recent report runs 263 pages and offers recommendations on best practices for satisfying each of those objectives.
  • The Partnership on AI began as an effort by innovators from Apple, Amazon, Google, Facebook, IBM, and Microsoft, and now includes over 80 members from the private, non-profit, and public sectors. It is a multistakeholder organization that works “to study and formulate best practices on AI, to advance the public’s understanding of AI, and to provide a platform for open collaboration between all those involved in, and affected by, the development and deployment of AI technologies.”
  • OpenAI is a non-profit research organization that disseminates information with the goal of ensuring that AI “is used for the benefit of all, and to avoid enabling uses of AI or (artificial general intelligence) that harm humanity” and to ensure it does not become “a competitive race without time for adequate safety precautions.” OpenAI works closely with the Partnership on AI.
  • The British Standards Institute created a “Guide to the Ethical Design and Application of Robots and Robotic Systems.” This document “recognizes that potential ethical hazards arise from the growing number of robots and autonomous systems being used in everyday life” and its goal is to “eliminate or reduce the risks associated with these ethical hazards to an acceptable level.” It mainly focuses on best practices that innovators can use to create protective measures.
  • Another important player is the International Organization for Standardization (ISO), which aims to create international norms for emerging technologies. The ISO “is an independent, non-governmental international organization with a membership of 163 national standards bodies” that brings various individuals and groups together to establish global consensus on specific standards, drawing on a range of experts from different sectors. One of ISO’s main projects was the production of guidelines “for the inherently safe design, protective measures, and information for use of personal care robots.”

These third party efforts can work hand in hand with government to achieve the optimal balance between formal rules and informal norms. Efforts like these have also become more prevalent in other sectors such as drones and biotechnology.

Advantages of Private Governance Mechanisms

Codes of conduct, voluntary standards, professional ethical codes, and private certifications will not solve all of the problems that stem from the development of new and emerging technologies. Each situation will need to be addressed on a case-by-case basis with solutions ranging from “soft law” governance mechanisms all the way to binding rules in some circumstances.  

However, private efforts like these can play an important role in encouraging accountability and responsibility among innovators developing new technologies. There are times when they can even be more effective than formal rules at changing what individuals actually do; we should not underestimate the power of reputation to shape behavior. There are also reasons to believe that affected parties may be more receptive to these softer-touch efforts. Another major benefit is the ability of these informal governance mechanisms to adapt quickly, potentially overcoming, or at least mitigating, the “pacing problem,” the tendency of technological change to outpace the laws and regulations meant to govern it.

Finally, these governing mechanisms have a potentially global reach. Creating enforceable international standards is a difficult task; however, these multistakeholder processes at least have the ability to build consensus through negotiations and the continuous development of codes, standards, and best practices.

Because technological innovation is the fundamental driver of economic growth, social prosperity, and human well-being, it is important that society strikes the right balance in how new technologies are governed. All-or-nothing approaches are not usually optimal; some rules are always needed. But top-down laws and micro-managing regulatory regimes need not be our first governance option. A diverse mosaic of options exists, including many important private governance tools and methods that deserve a chance before we resort to more heavy-handed approaches.
