Everyone talks about the need to provide affordable broadband to all Americans. This means not only finding ways to get networks deployed in rural areas on par with those in urban areas, but also tackling affordability: as a recent study showed, more urban folks are locked out of home broadband by factors such as price than go without broadband because there is no local access network at all. The simplest answer would be to include broadband (both residential and commercial) in the existing Universal Service Fund. Indeed, Rep. Doris Matsui has been trying to do this for about a decade. But, of course, no one wants to impose a (gasp!) tax on broadband, so this goes nowhere.
Following the Washington maxim "don't tax you, don't tax me, tax that fellow behind the tree," lots of people come up with ideas for taxing folks they hate or compete against. This usually includes streaming services such as Netflix, but these days is more likely to include social media, particularly Facebook. The theory being "we want to tax our competitors," or "we hates Facebook, precious!" Um, I mean "these services consume more bandwidth or otherwise disproportionately benefit from the Internet." This particular idea is both highly ridiculous (we all benefit from the Internet, and things like cloud storage take up more bandwidth than streaming services like Netflix) and somewhat difficult, if not impossible, to implement in any way actually related to network usage (which is the justification). But it did get me thinking about what sort of a tax on Silicon Valley (and others) might make sense from a social policy perspective.
What about a tax on the sale of personal information, including the use of personal information for ad placement? To be clear, I'm not talking about a tax on collecting information or on using the information collected. I'm talking about a tax on two types of commercial transactions: selling information about individuals to third parties, and indirectly selling information to third parties via targeted advertising. It would be sort of a carbon tax for privacy pollution. We could even give "credits" to companies that reduce the amount of personal information they collect (although I'm not sure we want to allow firms to trade them). We could have additional fines for data breaches, the way we do for other toxic waste spills that require clean up.
Update: I'm apparently not the first person to think of something like this, although I've expanded it a bit to address privacy generally and not just targeted advertising. As Tim Karr pointed out in the comments, Free Press got here ahead of me back in February, although with a more limited proposed tax on targeted advertising. Also, Paul Romer wrote an op-ed on this in the NYT last May. I have some real problems with the Romer piece, since he seems to think that an even more limited tax on targeted advertising is enough to address all the social problems, and that we should therefore forget about either regulation or antitrust. Sorry, but just as no one serious about global climate change thinks a carbon tax alone will do the trick, no one serious about consumer protection and competition should imagine that a privacy pollution tax alone is going to force these companies to change their business models. This is a push in the right direction, not a silver bullet.
I elaborate below. . . .
First, as always when I am floating some new idea on my own, I stress that this is my personal blog that I had long before I came to Public Knowledge, I don’t run this stuff past my employer first, and ideas expressed here are my own. Public Knowledge has no position on any of this stuff.
How Would This "Privacy Pollution Tax" Work?
First, go read this paper from Professor Omri Ben-Shahar called “Data Pollution.” Finished? Good.
You may recall that a while ago I wrote about Professor Jack Balkin and his "Information Fiduciaries" framework for privacy. As I explained in that post, frameworks are very important for how we create law and develop policy. Human beings generally think in terms of analogies and narratives. We tend to learn through experience, and we usually try to put new facts into familiar storage buckets and solve problems based on the way we or others solved similar problems. This is why bad analogies make so much bad policy: they tend to gloss over critical differences. (For example, spectrum is not "property," even if we find market-based mechanisms similar to the way we buy and sell other sorts of intangible rights useful. When we treat spectrum as if it were actual physical property, we run into trouble.) On the other hand, good analogies are not merely helpful for understanding; they can guide us to valuable insights and solutions.
In 2017, Ben-Shahar proposed thinking of privacy as a pollution problem. He reasoned thusly. The big problem with pollution is that it is a byproduct of lawful activity that has extremely negative impacts on individuals (e.g., increased risk of lung cancer from air pollution) and degrades public goods (e.g., release of greenhouse gases causes global warming). For a long time, we struggled to control these negative consequences, encourage the development of processes that create less pollution, and discourage unnecessary production of more pollution. But traditional tools such as contract law or tort law had problems. The impacts are often generalized and hard to prove on an individual basis, so tort actions like nuisance are very hard to win. Nor does anything require the polluting companies to enter into any kind of contract with impacted parties. The inability of individuals to locate most sources of unwanted pollution, the high cost of identifying and trying to pressure polluters to stop polluting, and the necessity of services that create pollution (such as power generation) made pollution super hard to reduce via informal means.
The problem of the data-driven economy is similar. Unrestrained collection and use of personal information can have significant negative consequences for individuals and for society as a whole. These range from unwanted targeted advertising to enhanced risk of identity theft to the ability to manipulate elections and public debate with targeted disinformation. But the same problems of restraining the "data polluters" with traditional causes of action arise. These problems are exacerbated if we think of data as something like property that the individual can sell or otherwise dispose of. Setting aside whether consumer consent even means anything real in the existing environment, or whether we should place the burden on consumers to try to protect their own data when this is effectively impossible, my consenting harms people who are in no position to do anything about it, in a wide variety of ways. On an individual basis, lots of "my" information isn't just about me, and my consenting to its collection and use harms others who also have a stake in the relevant information. Additionally, the aggregate of all information released, even if we limit it to personal information that was "genuinely" consented to, creates public harms like the ability to manipulate elections. Finally, even if I willingly consent, I have little recourse if you violate the terms of our agreement, especially if that violation is through negligence that permits a data breach. Indeed, we can think of data breaches as the toxic spills of the information environment.
OK, So We Think About It As Pollution. How Does That Help?
When we start thinking of it as pollution, we both recognize the limitations of certain types of solutions (e.g., requiring affirmative opt-in consent) and gain a new set of tools to consider based on our experience trying to control pollution. Sometimes this involves banning certain types of activities, or phasing them out over time. There is also a political economy question: the companies that benefit from information extraction and processing, like those that benefit from the extraction and processing of carbon (such as coal or petroleum) and those that benefit from using the end products of that extraction and processing, lobby hard against any change in the regulatory regime.
Which brings us to everyone's favorite market-based mechanism for controlling carbon emissions, the carbon tax. This idea goes back to an economist named Arthur Pigou in the 1920s. Pigou developed the idea that you could control negative externalities (costs not included in the market price) by increasing the cost to the generator of the negative externality via targeted taxes. We call this, no surprise, a Pigovian tax. The idea is to force the creator of the negative externality to internalize the actual cost to society caused by their activity. If the activity is still valuable at the tax-increased price, there will still be a market for the good, but the potential profit will reflect the true cost not merely to the producer, but to society as a whole. The tax revenue, in turn, can be targeted to remedying the remaining social harm (or not). This is related to the idea of a "sin tax," where we directly try to discourage activities we consider "bad" by artificially raising their price. For example, we discourage smoking with cigarette taxes that artificially raise the price of cigarettes. Econ 101 tells us that, depending on the elasticity of demand (and other factors), raising the price discourages consumption. Raise the price of a pack of cigarettes by a huge amount (like $2-3/pack) and at least some people will try to reduce their smoking or quit altogether. Not everyone, of course, but the impact is significant.
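To make the Econ 101 point concrete, here is a minimal back-of-the-envelope sketch in Python. The constant-elasticity demand curve and every number in it (the baseline price, the size of the tax, the elasticity value) are assumptions for illustration, not estimates of any real market:

```python
# Toy Pigovian tax model using a constant-elasticity demand curve.
# All numbers below are illustrative assumptions, not real market data.

def quantity_after_price_change(q0, p0, p1, elasticity):
    """Quantity demanded at new price p1, given baseline (q0, p0)
    and a constant price elasticity of demand (a negative number)."""
    return q0 * (p1 / p0) ** elasticity

p0, q0 = 6.00, 100.0      # assumed baseline: $6.00/pack, demand indexed to 100
tax = 2.50                # a $2-3/pack excise tax
elasticity = -0.4         # assumed: demand for cigarettes is fairly inelastic

q1 = quantity_after_price_change(q0, p0, p0 + tax, elasticity)
print(f"New price: ${p0 + tax:.2f}")
print(f"Demand falls from {q0:.0f} to {q1:.1f} (about {100 * (1 - q1 / q0):.0f}% less)")
```

Even with fairly inelastic demand, the toy model shows consumption dropping by a noticeable chunk, which is the whole point of the exercise.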
This is the basic idea of a carbon tax, without the additional Coasian gloss of "cap and trade." Make it more expensive to extract carbon-producing materials, or to directly produce carbon emissions, and people will have an incentive to find lower-carbon or non-carbon-emitting alternatives. We can apply a similar approach to data pollution.
Wouldn’t It Be Better To Just Ban Data Collection or Other Bad Data Practices Altogether?
As with pollution, direct bans on the collection of personal information or on the use of personal information (including things like targeted advertising) face problems. For one thing, certain types of data collection and data processing have positive impacts, or are necessary for broad swaths of the economy. To fundamentally change these complex systems, we need to use a combination of tools. I certainly think we could ban some activities, such as targeted advertising, without destroying things like free Facebook or Google. But we need to acknowledge that straight-up bans that go after the bad stuff but permit the good stuff quickly become very complicated to write in legislative language and to implement, unless they are narrowly targeted. Yes, it's often worth the effort (depending on the nature of the bad conduct), but we need to acknowledge it's not so easy that it should be our only solution.
Additionally, we have the very real political problem. I like to think of myself as pragmatic in the good way. That means slowly but steadily making progress, not just settling for crumbs in the long slow march to total deregulation. It also means I don't think whatever we manage to get done today is the only thing we will ever do. Pigovian taxes, and their cousins, sin taxes, are a lot more politically feasible than outright bans. And by reducing demand and creating alternatives, these taxes reduce dependency on existing bad practices and make it easier to restrict them later.
So How Would You Do A Pigovian Tax To Protect Privacy?
Because this is an experiment, let’s keep it simple and target the most clearly identifiable and damaging problem.
I propose a 30% tax on the direct sale of individual information (or individually identifiable information, or any of the dozens of other ways we could define the type of information at issue; while these details matter for implementation, they are not relevant to the general theory) to third parties, and on the indirect sale of individual information through targeted advertising.
To be clear, this is not a tax on the collection of information. If you are about to object because "I want to collect information for research, to serve you better, good stuff!", relax: you are not impacted by this. This proposal does not touch the ability to collect information. It simply taxes certain kinds of commercial transactions.
For the same reason, all objections that begin with "but I use the information to do product research/monitor for bad stuff/identify emerging trends/make aggregate predictions, etc." leave you unaffected as well. For better or worse, you get to do all of that, because none of it involves selling the information to a third party, either directly or through targeted advertising.
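To make the dividing line concrete, here is a minimal sketch of which transactions would and would not be taxed under this proposal. The category names, the helper function, and the per-transaction application of the 30% rate are my own illustrations of the idea, not language from any actual bill:

```python
# Sketch of the proposed tax's scope (toy categories of my own invention).
# Only two kinds of transactions are taxed; collection and internal use are not.

TAX_RATE = 0.30

TAXABLE = {"sale_to_third_party", "targeted_ad_placement"}
UNTAXED = {"collection", "internal_research", "aggregate_analytics", "fraud_monitoring"}

def tax_owed(transaction_type: str, revenue: float) -> float:
    """Return the privacy pollution tax due on a single transaction."""
    if transaction_type in TAXABLE:
        return TAX_RATE * revenue
    if transaction_type in UNTAXED:
        return 0.0
    raise ValueError(f"unclassified transaction type: {transaction_type}")

# A data broker selling a profile for $5.00 owes $1.50; in-house analytics owe nothing.
print(tax_owed("sale_to_third_party", 5.00))   # 1.5
print(tax_owed("internal_research", 5.00))     # 0.0
```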
Administration is fairly simple. It’s like any other sales tax. Enforcement may be more difficult, but tax evasion is a standard problem we deal with regularly.
The primary impacts will be on data brokers and on targeted personal advertising. Please note that even a 30% tax is unlikely to totally eliminate these practices. But because these companies will pass the additional cost on to purchasers, it will discourage consumption of these services. Advertisers that perceive a lower marginal benefit to targeted advertising will still advertise online, but they will shift to more traditional types of advertising unless they feel the marginal benefit of targeted advertising exceeds the new price. As with so many other things in the digital platform world, the squeeze falls on the non-consumer-facing side of the platform. If Apple can extract a 30% tax on all app store sales and in-app purchases, I expect We The People can extract a 30% tax on the sale of information to third parties and on targeted advertising without totally killing the market, upending the Internet economy, etc.
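Here is a crude worked example of that shift. The CPMs and expected returns below are pure assumptions, and I am assuming the full 30% gets passed through to the advertiser:

```python
# Toy pass-through example: an advertiser choosing between targeted and
# contextual ads once the 30% tax is passed on. All numbers are invented.

TAX_RATE = 0.30

targeted_cpm = 10.00       # pre-tax price per thousand targeted impressions
targeted_value = 12.00     # expected return per thousand targeted impressions
contextual_cpm = 6.00      # untaxed, less precisely targeted alternative
contextual_value = 7.50

targeted_cpm_taxed = targeted_cpm * (1 + TAX_RATE)        # 13.00 if fully passed through

targeted_margin = targeted_value - targeted_cpm_taxed     # -1.00 per thousand
contextual_margin = contextual_value - contextual_cpm     # +1.50 per thousand

better = "targeted" if targeted_margin > contextual_margin else "contextual"
print(f"Targeted margin: {targeted_margin:+.2f}, contextual margin: {contextual_margin:+.2f}")
print(f"This advertiser shifts to {better} advertising.")
```

Change the assumed numbers and some advertisers will still find targeted ads worth the higher price; the point is only that the tax moves the margin.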
Most importantly, it is likely to discourage the rash of personal information collection by just about every product you purchase or service to which you subscribe. Right now, collecting information is ridiculously easy for every single connected device, which is just about every device these days. Because collecting this information is both dirt cheap and easy to do, and carries zero negative consequences, it has become an attractive way to make some extra coin for everyone from your car manufacturer to your smart sex-toy company. Once you've collected it, why not just sell it?
This is the classic case for a Pigovian tax. I don't want my devices spying on me and selling my personal info. As I like to say, "exploitation is not a 'business model.'" If we jack up the price of selling this info by 30%, Econ 101 tells us that fewer people will buy it, so there will be less incentive to sell it. If you really need to collect information from my sex toys "to serve me better," you can still do that. But at least we will discourage you from selling it just because "hey, why not?"
This is also why I want to take this a step further than just targeted ads and include any sale of personal information to third parties. Data brokers, and the whole industry that has developed just for collecting, buying, and processing information, go well beyond targeted ads. (I think at this point I'm obligated by statute to hat tip Shoshana Zuboff's The Age of Surveillance Capitalism, the book everyone likes to pretend they read all the way through but really just skimmed, because in our current ADHD Age no one is gonna read anything that big, though we all wish we did and eventually we will all believe we did. Kind of like the way everyone pretends they read Wealth of Nations.) In many ways, I find the practices of these data brokers, and the seriously unsavory uses of personal information they facilitate, far more pernicious than targeted advertising.
Anything Else?
A tax on the sale of personal information in this manner (which I will henceforth call the "Privacy Pollution Tax") has additional advantages. One problem in enforcement is agency capture and political pressure from Congress, at the behest of powerful industries, not to enforce. While I regard this problem as highly overstated (see above re: Paul Romer's op-ed), it is certainly real and should not be ignored. For example, FCC Chairman Ajit Pai has simply refused to enforce existing privacy law against mobile phone carriers, even when they do clearly illegal things like sell enhanced 911 geolocation data to bounty hunters. If we had an additional tax on such sales, then carriers would have to (a) raise the price, discouraging bounty hunters and stalkers from using this data; and (b) report what percentage of their revenue comes from the sale of personal information, which would make the practice a lot harder to deny and would allow Congress to check on whether carriers are keeping their promises to stop selling this information. But best of all, a tax would totally bypass the problem of "what happens when the agency chairman goes from industry watchdog to industry lap dog" by shifting enforcement to the IRS, which is much less likely to be captured by the industry and much more eager to enforce laws that generate revenue.
And, of course, we will actually create a new revenue stream for the government. This is one of the most attractive things about sin taxes from a political perspective. It makes it easier to do something government finds hard (raising taxes) by targeting something people generally don’t like. It is the government version of doing well by doing good. We could certainly link this revenue to funding broadband deployment and subsidizing broadband access for the poor. This would make more sense than any other “tax Silicon Valley to solve the digital divide” proposal I’ve heard. There is precedent for this. A lot of states use lottery money or taxes from legal gambling to finance education, for example.
On the other hand, Free Press' proposal to use the revenue to fund independent journalism also makes a good deal of sense. I've written extensively about how public subsidies, properly structured to avoid government influence over the content, are a proven traditional way of supporting independent journalism. But even if the money just goes to the general Treasury, it is likely to generate a boatload of revenue in a fairly simple and straightforward manner.
Finally, there’s the First Amendment stuff. Taxes generally don’t raise First Amendment issues unless they are clearly content based or punitive to certain types of speech. I can tax the sale of books, the sale and rental of video tapes or DVDs, even the sale of services such as cable or satellite TV without triggering a First Amendment issue. Why? Because I’m not taxing the speech, I’m taxing an economic transaction of selling goods and/or services. See Leathers v. Medlock. The fact that the good in question is speech related is irrelevant.
But even if we found a First Amendment interest here, that is not a showstopper. As I explain at considerable length in Chapter 5 of my book "The Case for the Digital Platform Act" (available for free download here), and for reasons I talked about in this blog post, not everything that raises a First Amendment interest means game over. Here, I expect we would clearly apply the commercial speech test, and would find that the tax meets the burden of serving a compelling government interest in a content-neutral way. This gets us out of the whole Sorrell v. IMS Health problem, because I'm not planning on having any content-based distinctions. You sell, you pay. Simple.
Conclusion
At this stage, I'm just tossing the idea out for discussion. The idea of using taxes to discourage behaviors we don't like is well established. Even if we end up with a narrowly defined class of activities for the Privacy Pollution Tax, it will still be a positive step forward. At the same time, I recognize that a bunch of details, like how exactly to draft such a law, need to be worked out. And I do not believe I can cover every argument or potential complication in a mere 3,000-word blog post.
Still, I think the framework Ben-Shahar suggests, viewing privacy violations like pollution, works, and that makes a Pigovian tax approach a natural fit. Again, I'm not saying this is THE answer, in the same way no one thinks a carbon tax alone is THE answer to our carbon emissions problem. But it seems to me worth trying. If nothing else, perhaps people will stop making dumb proposals about somehow taxing services we like and that have positive social uses, such as streaming video, and switch to taxing things we hate, like selling our personal information to data brokers.
Stay tuned . . .