Here's How to Regulate Facebook Productively!

Hint: don't draw up rules that solidify the market power of tech titans.

Privacy and social media have always had a complicated relationship. On the one hand, users of sites like Facebook often voice concerns when they find out how much of their personal information ends up with advertisers. On the other hand, the roughly 2 billion active users of Facebook continue to provide that data voluntarily in exchange for free use of the platform.

Add political advertising to that mix, and you have the ingredients for intense Congressional hearings featuring Facebook CEO Mark Zuckerberg, and a national conversation about whether or not the federal government should take a larger role in social media regulation.

Here to address some of those challenges are three experts in the field: Mercatus Center senior research fellow Adam Thierer, TechFreedom legal fellow Ashkhen Kazaryan, and American Action Forum director of technology and innovation policy Will Rinehart.

Download this episode and subscribe to the Mercatus Policy Download on Apple Podcasts or wherever you listen to your favorite podcasts.

Looking to connect with a scholar you heard on the Download? Email Kate De Lanoy at [email protected].

Note: While transcripts are lightly edited, they are not rigorously proofed for accuracy. If you notice an error, please reach out to [email protected].

REESE: Welcome to the Mercatus Center Policy Download. I'm your host, Chad Reese. Privacy and social media have always had a complicated relationship. On the one hand, users of sites like Facebook often voice concerns when they find out how much of their personal information ends up with advertisers.

On the other hand, the roughly two billion active users of Facebook continue to provide that data voluntarily in exchange for free use of the platform. Add political advertising to the mix, and you have the ingredients for an intense Congressional hearing featuring Facebook CEO Mark Zuckerberg and a national conversation about whether or not the federal government should take a larger role in social media regulation.

Here to address some of those challenges are three experts in the field. First, we have Mercatus Center senior research fellow Adam Thierer. Welcome to the show, Adam.

THIERER: Thanks for having me.

REESE: Next, legal fellow Ashkhen Kazaryan is here with us from TechFreedom. Thanks for stopping by.

KAZARYAN: Thank you for having me.

REESE: Third, we have American Action Forum's director of technology and innovation policy, Will Rinehart. Good to see you, Will.

RINEHART: Thanks for having me.

REESE: I wanted to start with the basics. Online advertising has been around for a long time. Advertisers basically always want more information about their customers, but we're talking about this now. Can someone briefly walk me through maybe the timeline of events or what got us to the point that Mark Zuckerberg is testifying before half of the US Senate?

KAZARYAN: 2008 is when Sheryl Sandberg joined Facebook and started monetizing the platform through advertising. It's also around that time that they allowed third‑party apps on the platform. Around 2011, Cambridge Analytica, the company that does data analytics and advises political campaigns across the world, had been purchasing data, including data from this app that was...

RINEHART: It was run by this privacy researcher named Kogan, who was marginally connected to the University of Cambridge. It seems like he had set up an app to effectively collect information.

It seems like not a lot of people actually downloaded it. How it got into the ether was that Kogan basically used money that seemed to come from Cambridge Analytica to pay people on Mechanical Turk to download the app and connect to it ‑‑ they got paid just for downloading the app.

The way the API worked at the time ‑‑ and this was pretty well known; I was actually doing research on this point in grad school ‑‑ was that if you connected to the app, the app could effectively get information about your entire network.

It could get information about your friends through the network, plus a whole bunch of demographic information. That is what Kogan seems to have accessed, and that is how Cambridge Analytica got this data.

That handoff, I think, is pretty important. It wasn't uncommon for something like this to happen, but what was really of concern was the handoff from Kogan, who had used this app for one specific purpose ‑‑ or it seems he was using it for one specific purpose ‑‑ to Cambridge Analytica, which used it for quite another purpose.
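(For readers who want to see the mechanism Will is describing, here is a minimal, hypothetical sketch of the kind of request a connected third‑party app could make under Facebook's old Graph API v1.0, which has since been retired. The endpoint path, field names, and token are illustrative assumptions rather than details from the episode.)

```python
import requests

# Illustrative sketch only: roughly how a third-party app could pull friend
# data under Facebook's old (pre-2015, now-retired) Graph API v1.0 permission
# model. The token, endpoint, and field names below are placeholders.

ACCESS_TOKEN = "USER_ACCESS_TOKEN"  # token the app received when one user connected

# Under the old friends_* permissions, a single consenting user's token could
# expose profile fields for that user's entire friend list.
resp = requests.get(
    "https://graph.facebook.com/v1.0/me/friends",
    params={
        "fields": "id,name,location,likes",  # demographic/interest fields
        "access_token": ACCESS_TOKEN,
    },
    timeout=10,
)
resp.raise_for_status()

for friend in resp.json().get("data", []):
    # Each record here describes someone who never installed the app themselves.
    print(friend.get("name"), friend.get("location"))
```

The point of the sketch is simply that, under that older permission model, one user's consent could expose data about their whole friend network ‑‑ the consent gap Ashkhen describes next.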

KAZARYAN: The main problem with this was also that, OK, you download the app ‑‑ you consent, even if you didn't read the privacy agreement. However, it also collects your friends' information, and they have not consented to it.

That was the big catch. That was why data from roughly 300,000 app users spread so far, and I believe Mark Zuckerberg, during the hearings in Congress a few weeks ago, admitted that his data was affected too in this whole ordeal.

REESE: Mine was not, by the way.

KAZARYAN: Did you check?

REESE: Oh yes, I did check. Oh, I've got good friends.

[laughter]

REESE: Apparently, I keep good company. If I can just try to summarize real quick to make sure we're on the same page. It sounds like Facebook has lots of third‑party developers that have their own apps. One of these apps was generated for one purpose.

The person who was running that app took the data not only from the people who used that particular app, but also from anyone connected to them as Facebook friends, and then sold that off to other folks. Is that the story?

RINEHART: It seems so. Whether or not he was specifically paid for that particular data, I think, is actually still kind of an open question.

KAZARYAN: Either way, though, we have data brokers. It's not something that is banned or morally wrong. I think a lot of people are just not aware of it or don't even know how it operates. Data brokers sell the information that we provide.

I believe there was a study where they analyzed a bunch of different free apps that we download and the value that we get out of them. We value our private information at about $1.49.

REESE: Really, that cheaply.

KAZARYAN: Yeah.

REESE: That's actually where I'd like us to go here, because whether or not people want to blame Facebook or third‑party app developers, the real question is what the policy response should be. In my own reading, I've seen maybe two general approaches.

One, maybe we review the disclaimers and we make sure people know what they're getting into. The other approach is to regulate social media platforms like Facebook. Where do you all see the policy conversation heading, and where should the policy conversation be heading?

THIERER: I think we're embarking on a pretty ambitious new regulatory regime for social media platforms, both in this country and abroad, as data handlers and social media platforms come under greater scrutiny by regulators across the globe.

Facebook's at the center of this because of their size and scope, but of course, you have Google and many other players that are involved. I think what's going to happen is that you're going to see Congress start to get more aggressive about investigating exactly how Facebook and other social media platforms collect and share data. The question really becomes, how specific do those regulations get?

Then, more problematic from my perspective is not the impact they might have on Facebook or Google, but the impact they might have on the marketplace, competition, and innovation more generally.

KAZARYAN: The regulation of Big Tech that everyone keeps talking about actually wouldn't hurt Big Tech at all. That's exactly what Adam was saying. It's going to affect the small companies that exist right now, or the ones that are going to come up.

Same thing with the GDPR, the privacy regulation from the EU that is going to take effect on May 25th. If you've noticed your inbox filling up with emails about updated privacy policies from every major platform you use, every account you have, it's because of the GDPR.

Facebook can easily comply with these privacy rules. Almost every other major platform, like Google, will comply with them with some difficulty, but they can still do it. Whereas smaller companies in the US, if they are obliged to fit into that new mold of privacy regulation, will be the ones who are going to be financially hurt.

RINEHART: Yeah, clearly there's a cost to compliance. Whenever you raise the fixed costs, it's just going to make it even...reiterating what everyone else has said. The policy responses, I think, have been kind of interesting.

To go back to the original question, the policy responses have been all over the board. You have the GDPR type of approach ‑‑ regulating the use of data. There's also a general conversation of, "Maybe we should just break up these firms. We should just hatchet them to death, effectively."

There's also a series of other sorts of regulations. Maybe we should have some privacy rules akin to what has been discussed, and some of the other stuff that I know you've worked on, which is these extra added responsibilities for large tech firms.

Everyone is trying to figure out right now unique ways of dealing with companies and potential problems and nefarious behavior. Personally, I see that, at least in the near term, the market for reputation is probably going to be the biggest check on some of these firms.

Facebook has changed quite dramatically, even in the last two weeks. They announced ‑‑ it was yesterday ‑‑ that you are now able to effectively delete a lot of your history. They have given users a whole new slew of controls over their privacy and history.

I'm positive, at least in...The first order of business for me is really that this market for reputation is a pretty important competitive check ‑‑ that companies clearly are responsive to how individuals perceive them in the marketplace.

KAZARYAN: I'd like to highlight, though, that Facebook is a very specific example. They're in the middle of a storm. We keep thinking about Facebook and thinking that other tech platforms are like Facebook. Facebook is very special in many ways, including the fact that they see themselves not just as a company that is there to make a profit. They see that they have a mission.

A lot of the time, even during the hearings, the CEO, Zuckerberg, would say, "You know what? We're not against regulation," or, "You know what? We think there should be some way to rein us in." It's because they see themselves as, I don't know, doves of social good, bringing peace and prosperity to the world, connecting the world, and having this higher mission, which is not a bad thing.

It's a great thing. The fact that they are making all of these admissions and submitting to Capitol Hill as the teacher, saying, "Aye, aye, aye, aye," is the reason the whole tech industry is going to have to follow suit. That's the way I see it.

RINEHART: This is something that I, not to reference another podcast...

REESE: By all means.

RINEHART: [laughs]

REESE: As long as it's a good one. That's my only rule.

RINEHART: Yes. "Exponent Podcast," which is probably one of my favorite podcasts, it's generally tech. They talk a lot about this internal organization of companies. This is actually one of the big things they've talked about with Facebook.

Facebook sees itself as a platform more than anything else. That at least gives the company a certain direction. In some cases, it has created some of these problems. The original purpose of opening the API was to create a platform. It actually worked pretty well. This is how Facebook became Facebook.

Before the Apple iTunes Store, it's important to remember, Facebook was the first one to open up and effectively create a platform in the way that we think of platforms today, with the open application...

REESE: You mean where these third‑party developers...

RINEHART: Exactly.

REESE: can create their own apps...

[crosstalk]

RINEHART: The iPhone App Store came out after Facebook's own effective app store ‑‑ their application layer ‑‑ came out. You can see it very clearly in some really interesting interviews with Steve Jobs.

He actually recognized very clearly what Facebook had created. Facebook, in this sense ‑‑ and we mentioned Sheryl Sandberg earlier in the podcast ‑‑ helped to develop this in a very clear way.

Going back to this Cambridge Analytica problem, this is something that I don't really know how to place, because now we're in 2018. The conversation in 2009, 2010, even into 2014, was just very, very different. The kinds of privacy conversations we had back then ‑‑ I wouldn't want to say they're nonexistent, but it was still a very niche area.

Not a lot of people...Adam was working on a lot of this stuff, but for the most part, it wasn't a constant conversation point in the way that it is now. Understanding that context from 2009 to 2014 is key: the conversation back then was very different than it is now.

REESE: We can spend a little bit of time here, too, because, Adam, you brought Google up earlier. You mentioned what a lot of people are talking about, which is this fear of tech giants in general and concern about, "Should we break them up?" Ash, you mentioned that Facebook even talks about itself as something more than just a regular company.

I'm interested in how you all would respond to this concern. Maybe this is broader than Facebook or social media: some tech firms have just gotten too large, have too much control over the advertising space, and have too much say over the kinds of information we see.

Is there something to that argument that maybe Facebook didn't do too much wrong in a sense? These problems are just going to happen when you only have a couple of big tech giants, and the policy solution is maybe to try to find a way to break them up.

THIERER: In every generation of media technology, there are large companies and platforms. We are blessed today to have more large companies and platforms than ever before in the history of humanity.

We have more choices as consumers, as speakers, and everything else, such that even if Facebook and Google are some of the biggest dogs in the ecosystem today, we still have other alternatives to choose from.

That does not mean we shouldn't be worried about companies when they grow too large, or that we shouldn't continue to keep an eye on some of their behavior. It's just to say that the better way to address those concerns is to encourage still more competition and innovation.

A lot of people today, especially in the academy, seem to have given up on the idea that the old waves of creative destruction could ever blow through this sector again and decimate Facebook and Google the way they decimated the first and second generations of online platforms and players.

We no longer take seriously companies from the past like MySpace, AOL, Time Warner, and so on, which at one point in time were the big tech titans of their day. That was just 15 to 20 years ago.

It used to take many, many, many decades to actually undo the market power of large "tech" or "media titans." Now these things seem to happen every decade or so. I would say we should be a little bit more patient, see if we can build some new platforms, and get some competition to Facebook.

My biggest fear is that all the regulation that's being contemplated today, in the name of going after Facebook and Google and others, is ultimately just going to only solidify their market power because they'll be the only companies large enough and with enough resources to actually go about complying with heavy regulatory costs that would be imposed by those laws.

RINEHART: To peel back on that, one of the questions I always come back to is, "What's the goal? What's the goal of your preferred regulation?" If the goal is just to get more competitors into the space, what does that necessarily mean? If we're specifically talking about Google and Facebook, individuals that are using these services don't effectively pay anything.

If we're talking about a reduction in price in the way that we typically talk about monopoly problems, then we're not really in the same space when we're talking about these kinds of online platforms. I think that actually highlights the complex nature of online platforms, which have to take into consideration two sides of the market.

You have consumers, or users, but you also have advertisers who are coming to the platform in order to connect to users. Also, just to have a little bit of perspective, it seems odd to me that what we're really talking about at the end of the day is a competitive ad space.

At the end of the day, that's really what we're getting down to: whether Facebook and Google face competition when it comes to ads. Now, we are seeing quite dramatically over the next couple of years ‑‑ and it's well known within the industry ‑‑ that Amazon's getting pretty big into this space.

They're quickly becoming the third‑largest player in the ad space. Again, we're talking about balancing. Each one of these businesses balances between two different kinds of users, or two different groups ‑‑ one group being advertisers, and the other group being the consumers.

To me ‑‑ I see a lot of...I'm actively working on a lot of research in this space, primarily because we still don't know what to do. We don't know what competition should look like. We don't really know. We always say we need to have more, so the best number of firms that should be in the marketplace is always "n plus one."

That doesn't really seem to be...It shouldn't necessarily be the end goal. I still think that we need to define what we mean by competition and really define what we actually want at the end of the day ‑‑ what the end goal should be.

I honestly think that a lot of people who would reflexively want regulation don't necessarily know what that regulation is going to bear. I think that's actually a very, very big problem here.

KAZARYAN: To piggyback on what Will said, during the hearings, one of my favorite moments was when one of the senators, quite advanced in age, asked Zuckerberg, "Who's your competitor?" Then you see Zuckerberg, who was well prepared ‑‑ obviously a bunch of people prepared him for days ‑‑ just get lost and be like, "What?"

[laughs] "What do you mean? There are a lot of competitors. The whole Internet is our competitor. All the other platforms, Twitter." He kept bringing this up. He kept saying, "Depending on what exactly the user is coming to Facebook for, we have a different competitor."

That answer didn't really satisfy the person who was asking. The truth is, Facebook has all this data about my behavior online. I recently looked it up ‑‑ you know the categories that they put you under?

RINEHART: Oh, yeah.

KAZARYAN: It knew that I live with roommates, that I'm an expat, and that I speak more than three languages. All of those things are really creepy, but if you think about it, it makes sense given the number of things I click on and the amount of time I spend on Facebook, even when I'm not really on Facebook...

I use Facebook mostly to keep up with relatives. I don't really use it anymore to keep up with friends; there are other platforms for that. Altogether, I don't really want Facebook to be broken up. I don't want multiple platforms to have all that information on me.

I'd rather have one big company that I trust, more or less. I know they have a huge team of lawyers who are very good at what they do and are trying to comply with every single law. They, again, have this philosophy about the reason they are here.

I'd rather have that than multiple platforms ‑‑ going to Platform A to look up reviews of different restaurants, Platform B to communicate with your friends, and Platform C to tell your mom you're OK. It makes no sense to me.

If this were some other kind of industry ‑‑ railroads, or, I don't know, producing toys, or whatever ‑‑ I might have been more open to listening to arguments about breaking it up. Then it creates, as you said, different pricing. It creates different options.

Here, no one is stopping other platforms from coming along. As Adam was saying, MySpace and AOL were a thing. I don't remember, because I was too young for that.

[laughter]

KAZARYAN: I'm sure they were a thing. In 10 years, I'll be too old for that new platform, and I'll be asking my interns to help me figure it out.

REESE: That's a really interesting component, because a lot of what we've been talking about so far is how companies steward data and where they put it intentionally. Ash, you bring up data security as a cybersecurity issue.

Do you all agree? Is it fair to think about it this way ‑‑ that there are economies of scale here, and to a certain extent, having larger firms in the space is better for consumers, just because those companies can afford better cybersecurity?

THIERER: There are always, again, going to be large players in every tech sector and media era. You can have smaller players that specialize in privacy and security and offer even greater types of security and privacy.

Certainly, that would be easy compared to Facebook, which is essentially a digital nudist colony. It exists for you to share. It does not exist for you to be private. By definition, when you go to Facebook, I believe there should be a digital caveat emptor kind of thing, where you just know what you're getting into.

I have very little sympathy for people who cry about not having a lot of privacy and security on Facebook. That's not what you're getting there. What you are getting is a wonderful platform to share your experiences with the world, with your family and friends, and to make a lot of great connections, and you're getting it for free.

This is a remarkable consumer welfare victory. When we hear this talk about regulating Facebook on either privacy or antitrust grounds, I usually respond by saying A, if you're looking for privacy, Facebook ain't your site. Go find another.

B, if you're worried about market power, let's keep in mind that when we talk about consumer welfare, we have always regulated to achieve lower prices, higher quality, and greater quantity. We're getting all of that and more. What do you want to do with regulation?

Ironically, what a lot of people want to do is raise the price of Facebook. They want to make people pay for those services. That doesn't seem like a consumer welfare victory to me. That seems like a loss.

If they want to increase quantity, fine, but you don't do it by regulating the entire sector into the ground. You certainly aren't going to get higher quality if you start imposing those controls, either. We need to think hard about how we create alternative platforms, not how we optimize yesterday's old ones, and squeeze more juice out of an old lemon.

KAZARYAN: A few things on that. First of all, for example, Google tracks a lot of things you put in, even when you're in do‑not‑track mode, but there are alternatives to that. There's DuckDuckGo. There are a bunch of other things you can do if you research a little bit.

All of these suggested regulations that we've talked about in the past few months just assume that consumers are stupid. If we look at the Cambridge Analytica scandal, so many people have been deleting Facebook.

They've been making a conscious choice, and that's exactly what happens in a free market. You make a choice: are you OK with this information being out there or not? Is the trade‑off worth being able to see the cute puppy videos that your friend from college posts? I think yes, but everyone can make that choice.

RINEHART: Going back to this pricing issue, which Adam brought up, there's Caleb Fuller's recent piece ‑‑ I think he has worked with Mercatus? I know he was working with you on a paper.

He actually tried to suss out what the value is: if we were to effectively make Google a paid‑for platform, how much would people actually pay for it? He estimated, I think, that about $14 million or $15 million per year is how much they could get.

Market capitalization currently is at, like, $215 billion. People are not exactly willing to pay a lot for these services, it seems, but they clearly are getting a huge amount of consumer value. To sum up some of these really big problems within privacy, I think Adam's exactly right.

Which is, you don't go to Twitter, and you don't go to Facebook, to be private individuals. You go there to engage in a semi‑public series of engagements with other individuals.

This has been a huge, huge benefit for individuals. On the privacy question writ large, though, what would we get if we were to have either more competition in this space or different sorts of other platforms existing?

I don't necessarily see why privacy would be the thing that would be your competitive advantage over a competitor, because again, we already have DuckDuckGo. And Ello, for example, was one of these social networking sites that advertised that it would be more privacy‑sensitive.

Again, that isn't necessarily what individuals want. They do want the content. At the end of the day, even though people do seem to harangue against the News Feed, they do like the content that comes through there.

These are, I think, really, really tough questions about the difference between revealed preference and stated preference: people say they want privacy, but when they actually engage online, they seem not to care all that much.

It's maybe just because it's a really, really minuscule thing. At the end of the day, what they really value are these services. To really put a fine point on this, I think it's probably instructive to at least talk about trust.

Again, this wasn't necessarily an issue almost 10 years ago. This wasn't an issue that people really talked about all that much, because they seemed to trust these sites more than they do now. I think that's probably what we're seeing ‑‑ a changeover not necessarily in privacy concerns, but a changeover in trust.

They can't trust these sites anymore. That is at least one way to interpret it.

THIERER: Just building on what Will said there, what people value with an experience like Facebook, and with a lot of the rest of the modern Internet ecosystem, is cost, convenience, and connectivity. Those are clearly things they put a great deal of value on, probably ahead of online privacy and security.

Now, that isn't necessarily all a good‑news story. There's probably too much sharing. There are probably too many security vulnerabilities on today's Internet that need to be addressed. That being said, as Will also suggested, people are not completely ignorant sheep.

They understand there is a quid pro quo, and they're willing to make some of these trade‑offs. That might include giving away a certain amount of information, or having sites that are a little bit vulnerable at the margins.

I think the problem with a lot of people in the academy and a lot of policymakers is that they have essentially an old false consciousness narrative, that people are just ignorant sheep being led to the cyber slaughter, and we've got to save them from their own devices.

REESE: Literally, in this case.

THIERER: Therefore, we should intervene. We've got to basically intervene and make sure they don't use Facebook in this way, or that Facebook's reconfigured, has different defaults, or is a much different site than we're used to.

That will change the cost, convenience, and connectivity part of the equation. The easiest way to deal with this or to survey this is just to go to any room, as I often do when I'm at law schools or philosophy programs and I'm debating these things.

People like free. They like easy connectivity. They love convenience. These are values that should be discussed as important alongside privacy, security, safety, and so on.

It's really hard to weigh them. We should always try to do our best to optimize for all of them. We shouldn't try to force that upon consumers on the assumption that they're idiots.

KAZARYAN: That also brings us to a point that some of the critics of the current model make. They say, "People don't want to leave these platforms. However, the trust that they have given to these platforms is so great that we should change the model we operate under."

Jack Balkin, a professor at Yale Law School, has been talking about information fiduciaries for quite a few years. Now this concept has been picked up by some legislators.

It basically equates your relationship with an online platform to the one you have with your doctor, your lawyer, or your bank, where they hold such a bulk of sensitive data, and you come to them so often, that they have a responsibility to protect it and always put your interests first ‑‑ not act like a business, but act as your fiduciary first.

Some people think that this is the future of regulating online platforms: just making sure that platforms put consumers' interests first and their own, as a business, second.

THIERER: I'm going to let Will get to that in a second. He's done a lot of great work on this. I'll say this: the primary problem I have with the idea of an information fiduciary is not with the idea that Facebook and Google should take a special duty of care to handle data carefully. I like that idea, generally.

The idea of mandating that by law will have certain consequences we have to think through. We've got to start by realizing that my relationship with Facebook or Google is very different than my one‑on‑one relationship with my doctor, my lawyer, or my financial planner.

These are people who have a great deal more power over me and a much more direct relationship with me, one that could result in financial, material, or health ruin. That is not Facebook, where I go voluntarily, put up information voluntarily, and can walk away at any moment voluntarily.

Ultimately, no matter what the harm is, it ain't death.

[laughter]

THIERER: It isn't bankruptcy. It isn't legal ruin or whatever else. There is a qualitative difference in what we're talking about when we talk about the idea of fiduciary responsibilities in this context versus doctors, lawyers, financial planners.

RINEHART: I think Adam really summarized the clear problems with this idea, among others. One of the big things I also worry about is that there are other sorts of responsibilities that are clearly imputed within corporate contracts anyway. If you're Google or Facebook, you actually have a fiduciary responsibility within the corporate structure to your shareholders.

How do those two things conflict? Again, I like the entrepreneurial thinking of Balkin on this. I just think, in practice, I don't really know what it looks like. I don't know what it looks like to have more responsibility over information that you're sending to individuals. What exactly is the harm of Facebook sending you content?

This really does open up a huge-

[crosstalk]

REESE: -of podcasting.

THIERER: It does.

KAZARYAN: What he likes to say is that Google Maps shouldn't send you from point A to point B via a route that passes IHOP just because IHOP paid them $12 for that, or whatever IHOP pays. I don't know. I've never been...

That's the idea ‑‑ that they shouldn't be putting that first. However, I don't understand it, in the sense that that's their business. Not Google Maps specifically, but that's just in general the way ads work. That's the way they make money. That's the way they stay afloat.

RINEHART: I guess the general point is that fiduciary responsibilities have usually been applied to these potentially problematic relationships. As Adam said, death ‑‑ that's a pretty bad potential problem to have. Or bankruptcy ‑‑ again, a really big problem to have.

Being served specific content? I don't know. There are other, less egregious ways to deal with this. It just seems to me that it creates a lot more problems than it actually solves at the end of the day.

REESE: If there's some hesitation about this idea of an information fiduciary, let's clear the slate for a second and go with one last lightning round before we wrap up here. Say each of you gets a call from a regulator or a member of Congress.

They're well‑intentioned. They say, "My constituents are concerned about privacy and about these other issues that we've been talking about. What's an avenue that I should be exploring?" What would you tell them? What should they be spending their time on?

THIERER: I think it's useful to think about how we can help incentivize the creation of all new platforms. I'm not opposed to some creative thinking about how different types of parties can come together, including non‑profits and maybe even government organizations to try to incentivize that or fund it.

It's always better to build new platforms than to try to regulate old ones. In this case, we're just trying to figure out how to solve problems of the past, which ultimately isn't going to help us get to that goal of creating the diverse solutions that we need for a diverse citizenry.

If we want to take a specific stab at dealing with some of the particular problems at Facebook, we already have existing legal remedies and organizations that do a lot of this. We have the US Federal Trade Commission, which enforces against unfair and deceptive practices and is almost certainly looking into what happened with Facebook right now. We have state attorneys general that deal with this sort of thing, and of course international authorities.

The ongoing oversight part of it, I have no problem with. When you get down to the hard question of what specific regulation you want to put in place, that's where the well‑intentioned rules that some envision to enforce more privacy or security could ultimately backfire on us and leave us in a situation where we have less competition and fewer choices as a result.

RINEHART: I personally think this was a real missed opportunity for some of the biggest platforms to come together and bind themselves in a different way. I don't want to say that it's self‑regulation, but it sort of is. It was an opportunity to effectively create an organization, or a set of practices, that these platforms or firms would abide by.

I still think that's probably going to be the best way of dealing with these problems, which are pretty endemic within the Internet. We have the Internet Engineering Task Force, which deals with setting protocols. We have 3GPP, which deals with wireless protocols.

There's a whole bunch of different organizations that have come together through the Internet. Something very similar happened with behavioral advertising in 2009, 2010. This could have been an opportunity for the largest companies to come together and figure this out.

"We're going to try to deal with this privacy question in a very meaningful way. We're going to work to create almost an ombudsman for our own benefit. We're going to take the first step."

Now, that obviously opens up the space for potentially more regulation, which could be especially heavy‑handed ‑‑ the government effectively creating something akin to that ‑‑ which I think is a very worrying potential end result.

I still think there are opportunities, at least, for the organizations themselves to come together and show that they're willing to be good actors in this space and willing to deal with problems, which they have done in the past.

Google, Facebook, Twitter, Reddit ‑‑ a lot of these social media platforms have actually done a pretty good job: when problems have been presented, they deal with them in a meaningful way. If they were to show a more unified face on this, that could actually go a long way toward helping.

At the same time, there's really only one company currently in the crosshairs right now, and that really just seems to be Facebook. Given that it's only one company, you have to have a certain amount of policy leadership there. I don't necessarily see that happening, because Zuckerberg is pretty OK with being regulated.

KAZARYAN: Since I'm going last and Adam and Will already covered what I was going to say...

[laughter]

KAZARYAN: Another very important part of what our representatives should be doing ‑‑ but really aren't doing ‑‑ is educating their constituents, talking to the constituents, and understanding the issue before they do that.

I would say, "Well, thank you for calling me. Let me walk you through all these regulations and break down the problems that we have with Facebook." One thing is election interference. Cambridge Analytica political scandal had reached the level of the storm that it did because also of the Trump campaign used their services.

Other campaigns across the world that were not so popular with the liberal crowd have done the same. I would say, "Let's separate that into one bucket and talk about the regulation and the checks and balances we have in place for that."

Then let's talk about privacy, and how the privacy that is protected through the Fourth Amendment is really not the same thing as your consumer privacy. Privacy in the US exists in these verticals, not as one horizontal right.

Then let's talk about all the ads and advertising, what you see, and how you are able to control it. Maybe there should be some better mechanism for platforms to create so that you are in charge of that. Those are different things.

Every person who calls up their congressman or is worried about this should be able to fully understand these issues and explain them back before they make a decision and vote for a certain representative in the next midterm elections.

What I heard during the hearings this spring was that the Honest Ads Act, the bill that would regulate political advertising on platforms, is basically something that lawmakers want to push through before the midterms.

That's going to be something we should all keep an eye on, because that's the next thing for platforms to be afraid of. It's also going to affect free speech and a bunch of other legal matters that are part of the constitutional structure that we have.

REESE: Thank you all for those recommendations. Hopefully we've done some educating ourselves today. We like to wrap up by letting our guests tell our listeners where they can find your work online.

If you have a Twitter handle or a bio page that you'd like to share with our listeners, now would be the time to do it. I'll just quickly go around the table starting with Adam.

THIERER: I'm on Twitter at @adamthierer. I'm at the Mercatus Center website if they want to find more papers and information.

REESE: Will?

RINEHART: I'm @willrinehart on Twitter. Pretty simple. No underscores or anything like that. You can find all of my work through the American Action Forum. Just go up to the top. It's americanactionforum.org. Click on experts. You can find me right there.

REESE: Ash?

KAZARYAN: Twitter, @ashkhen. Also, techfreedom.org has most of the work I've done on tech policy. I also have a tech policy podcast that we host and release every Monday. We'd be happy if you guys take a listen.

REESE: Sounds good. As always, you can reach me with your questions, comments, or complaints at [email protected] or find me on Twitter @chadmreese.

Thank you all for joining us.

THIERER: Thank you.

KAZARYAN: Thanks for having us.

RINEHART: Thanks.

(Photo courtesy of Anthony Quintano)

About Mercatus Policy Download

Looking for smart policy ideas for a growing world? Subscribe to the Mercatus Policy Download for all policy, no punditry, and a path forward.