Mark Zuckerberg Confirms: Regulation Would Be Good for Facebook
After a year of scandal and criticism, Facebook is now calling for regulations on internet companies. CEO Mark Zuckerberg penned an op-ed in the Washington Post outlining four areas for lawmakers to target: speech controls, online political activity, data privacy, and data portability.
Has Facebook suddenly seen the light, as some observers hope, or is this pivot a classic case of regulatory capture?
I’ve pointed out before why regulation would be in Facebook’s interest. Although the company opposed measures like the EU’s General Data Protection Regulation (GDPR), it ended up benefiting from the rules because smaller competitors can’t shoulder compliance costs like well-lawyered Facebook can.
New regulations would likely also redound to Facebook’s benefit, especially if lawmakers take up the framing that Zuckerberg outlines. Let’s look at how Facebook could stand to gain from these proposed new rules.
New Rules on Content Moderation
When it comes to user-submitted content, Facebook just can’t seem to win.
If the company is too quick to take down content that some users find insensitive or dangerous, other users will call Facebook censorious. If it is too slow to remove violent content, like videos of the recent shooting in Christchurch, New Zealand, it will be accused of aiding and abetting crime.
The company has tried to square this circle by building automated tools to immediately pull down flagged content and by hiring an army of contracted moderators. Zuckerberg’s piece describes Facebook’s “independent [appeal] body,” a kind of private Supreme Court where users can petition for a content decision to be overturned. None of these remedies has resolved the deeper tension between free speech and content control.
It’s not an enviable position. It’s also one that Facebook may not want to reconcile on its own. Zuckerberg now calls for regulation to “set baselines for what’s prohibited and require companies to build systems for keeping harmful content to a bare minimum.” Once governments have outlined this “standardized approach,” then “third-party bodies” could “set standards governing the distribution of harmful content and to measure companies against those standards.”
Of course, outsourcing content moderation standards to governments and nonprofits does not solve the problem of scale. It just transfers responsibility to someone else. Facebook and other giants could publish “transparency reports” that measure their progress against these standards, as Zuckerberg suggests. But the standards that are set could still be too onerous or too lax, which does nothing to solve the underlying problem.
Compare this approach to one where technology companies build tools to make the user experience more granular. Take the problem of harassment. Which is more likely to cut down on brigading and stalking: publishing compliance reports on the number of posts a platform removes, or building better tools for blocking bad users? One approach limits liability; the other enhances usability.
By the way, global content regulations could prove better for Facebook than many national policies in effect. My colleague Michael Kotrous pointed out to me that it could override a German law that holds online platforms liable for failing to take down certain content within 24 hours. The EU is considering a similar law on terrorist content.
There is a possible anti-competitive element here as well.
Facebook and existing tech giants already have pretty good content moderation tools. It would be fairly easy for them to comply with content standards, and their transparency reports would likely yield good numbers. This could be good for business: Zuckerberg compared these transparency reports to company financial reports in his article.
Competitors might not have the tools and experience with content moderation that companies like Facebook have, so their reports and reputations could preemptively suffer. This could make it harder for upstarts to attract talent and capital. Content moderation regulation could, therefore, have the unintended consequence of cementing current market leaders.
Regulations on Political Activities
In the United States, many have criticized the platform’s open posting and advertising policies for enabling foreign meddling in the 2016 presidential election. Leaders in other countries, like France, also worry that uncontrolled political posting could sway election results.
The fear is that undesirable parties could sway people’s opinions in ways that targets may not discern. Even small efforts elicit great concern.
For instance, Russian-linked operatives are believed to have spent around $100,000 on sponsored posts with “divisive messages” on topics like police brutality and gun control. Facebook has reported that Russia’s Internet Research Agency operated non-ad accounts as well, and their posts were followed by almost two million people. The amount of money and followers may seem minuscule, but some worry that these modest efforts paid off handsomely in the form of impressions. At the very least, it could indicate a potential social vulnerability.
Facebook has tried to address these vulnerabilities by requiring advertisers to verify their identities in some countries and building tools for users to see who pays for what ads. But as with general content moderation, it ultimately falls on Facebook to determine what political speech should be scrutinized or permitted.
Again, the company wants to pass the buck.
Zuckerberg writes, “our systems would be more effective if regulation created common standards for verifying political actors.” This would give the company an external standard to measure up to. Then, if unforeseen problems arise despite strict compliance with the letter of the law, it would not be Facebook’s fault. Rather, the laws would be the problem.
It may also handicap competitors. In 2011, Facebook attorneys argued that requiring such disclaimers for ads would be “inconvenient and impracticable,” and so should be exempt from Federal Election Commission (FEC) regulations. Perhaps this is no longer the case for the company, but such burdens may still be insurmountable for smaller companies.
Governments have an interest in maintaining the integrity of their elections. Leaders may want to consider how to shore up their electoral policies to keep up in the digital age. (The situation is more complicated for non-ad political speech since FEC rules generally target campaign finance.) But it’s worth noting that Zuckerberg’s call for government-mandated data transparency could be at odds with his next proposal: global data privacy.
Global Privacy Regulation
Some may be surprised to see Facebook, widely regarded as an especially bad actor on privacy matters, come out in support of global privacy regulation. The company publicly opposed the EU’s General Data Protection Regulation. It is also courting federal lawmakers to preempt California’s similar law, the California Consumer Privacy Act (CCPA), before it takes effect next year.
Has Facebook had a change of heart? Zuckerberg writes, “it would be good for the Internet if more countries adopted regulation such as GDPR as a common framework.”
Well, it might be good for Facebook at least.
It’s not that Facebook was thrilled about the GDPR. Companies generally prefer to avoid curtailing profitable activities and hiring armies of lawyers to stay true to the letter of the law.
But regulations can have the incidental effect of handicapping competition as well. Consider that after the GDPR took effect, the market share of advertising giants like Facebook and Google shot up overnight.
Data Portability Guarantees
The last plank of Zuckerberg’s plan for a new internet was either a snoozer or a head-scratcher for many. He writes: “If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate and compete.” He believes that regulations must guarantee this ability.
This goal is not objectionable. But you may wonder why regulation is required to achieve it, and why Facebook of all companies is getting behind it.
Shortly after calling for state intervention, Zuckerberg mentions his company’s support for the Data Transfer Project, which is an open source effort to standardize data transfer formats so that users can easily move from platform to platform. If private efforts are underway, why is the state needed? And why does Facebook care?
Few would have called Facebook a champion of data portability before this. In fact, critics worry that its (perhaps poor) stewardship of “our data” gives it a “lock in” advantage over competitors.
While Facebook does offer tools to export some data to other platforms, it closely guards each person’s “social graph” from other services. This network of connections between people is part of what makes Facebook so effective and profitable. It’s not surprising that it would want to keep social graph data under wraps, but it seems at odds with the company’s new pivot to embracing data portability.
Some believe this is a defensive move. Facebook also owns Instagram and WhatsApp, and recently drew antitrust criticism for more tightly integrating messaging services across those platforms. But if users are sharing data across all social network platforms, then maybe Facebook could escape serious scrutiny.
There is a more straightforward rationale. Facebook already operates the largest social network ecosystem in the world. A global data portability requirement would fall on its competitors as well. This means they would have to build tools for their users to migrate to Facebook. Because Facebook already has the biggest network, it is more likely that people will migrate to one of Facebook’s platforms than the other way around. And because of the complex way that “data” is considered in law—even the GDPR leaves social graphs alone—Facebook would not be required to share its competitive edge with others.
More Regulation, Less Competition?
Zuckerberg’s op-ed was short and seemingly straightforward. But there is a lot to unpack in the CEO’s four simple regulatory requests.
In each case, the company benefits from the glow of good press for seeming to selflessly call on the government to restrain it. Yet as we have seen, Facebook may not have wholly altruistic reasons for calling for government regulations. These proposals could weaken or kill competitors, strengthen Facebook’s own market dominance, and make it more difficult for the next Facebook to get out of a dorm room.
Photo credit: Lawrence Jackson / White House