A substantially similar version of this post was published on the blog of my employer, Public Knowledge.
Last year, Public Knowledge and the Roosevelt Institute published my book, The Case for the Digital Platform Act. I argued there that we can define digital platforms as a distinct sector of the economy, and that the structure of these businesses and the nature of the sector combine to encourage behaviors that create challenges for existing antitrust enforcement. In the absence of new laws and policies, the digital platform sector gives rise to “tipping points” where a single platform or a small oligopoly of platforms can exercise control over a highly lucrative, difficult-to-replicate set of online businesses. For example, despite starting as an online bookseller with almost no customers in 1994, Amazon has grown into an e-commerce behemoth controlling approximately 40% of all online sales in the United States and enjoying a market capitalization of $1.52 trillion. Google has grown from a scrappy little search engine in 1998 to dominate online search and online advertising, while also creating the most popular mobile operating system (Android) and web browser (Chrome).
Today, Public Knowledge released my new paper on digital platform regulation: Mind Your Own Business: Protecting Proprietary Third-Party Information from Digital Platforms. Briefly, this paper provides a solution to a specific competition problem that keeps coming up in the digital platform space. Accusations keep surfacing that Amazon, Google, and other digital platforms that connect third-party vendors with customers appropriate proprietary data (such as sales information, customer demographics, or whether the vendor uses associated affiliate services such as Google Ads or Amazon Fulfillment Centers) and use this data, collected for one purpose, to privilege themselves at the expense of the vendor.
While I’ve blogged about this problem previously, the new paper provides a detailed analysis of the problem, explains why the market will not find a solution without policy intervention, and offers a model statute to solve it. Congress has only to pass the draft statute included in the paper’s Appendix to take a significant step forward in promoting competition in the digital marketplace. For the benefit of folks just tuning in, here is a brief refresher and a summary of the new material.
A side note. One of the things I’ve done in the paper and in the draft statute in Appendix A (Feld’s First Principle of Advocacy: always make it as easy as possible for people to do what you want them to do) is to actually define, in statutory terms, a “digital platform.” Whatever happens with this specific regulatory proposal, I hope people will pick up on this definition and recycle it. One of the challenges in regulating a specific sector is defining that sector in the first place. Most legislative efforts, however, think primarily in terms of “Google, Facebook, Amazon, maybe Apple, and whoever else.” But the digital platform sector includes not just the biggest providers but also the smallest and everything in between. With all due respect to Justice Potter Stewart, you can’t write legislation that defines the actors it covers as “I know it when I see it.”
More below . . .
Why Does This Problem Keep Happening?
In the physical world, intermediaries like supermarkets or retail stores are severely limited in how much proprietary information they can collect and in how they can exploit their position as intermediary between the vendor and the customer. Take a supermarket chain that decides to create its own white-label version of a popular cereal. First, the supermarket has to actually buy and store the inventory, so it has an incentive to at least sell off the stock of the “Name Brand” cereal it already owns. The supermarket must also provide some visible shelf space for the Name Brand product, which it uses to attract customers. Yes, it can give better shelf space to its own product, but a buyer can still see the Name Brand product on the bottom shelf. Finally, although a supermarket chain knows how much product it purchases and sells through its own stores, it has no insight into the vast majority of sales of the same product that occur in other stores, and no means to control the Name Brand’s channels of distribution.
But digital platforms behave very differently. They often do not buy the product at all, and therefore bear no risk of unsold inventory if they undercut themselves or if the product proves unpopular. Most of the costs of storage and logistics are pushed onto the vendor. For the platform to work, it must record every sale and track every delivery. The platform therefore knows the details of customer behavior even better than the vendor does, such as what related products customers are likely to buy, at what times, and in response to what sort of recommendations or advertising. And while a supermarket can place a competing product on a lower, less convenient shelf, the buyer can still find it. By contrast, if the digital platform rigs the “buy window” or search results to favor affiliated products, it can render the competing product practically invisible to the ordinary shopper.
This is not a question of prohibiting data collection, because without giving the platform access to the proprietary information, the platform cannot provide the functionalities that bring the vendor and the buyer together. While third parties can try to protect themselves with contracts, this is easier said than done. Once a platform becomes large enough so that the “cost of exclusion” from the platform becomes too much for a vendor to bear, the vendor has no choice but to agree to any alteration of the contract that gives the platform permission to take advantage of the proprietary information.
But even without dominance, platforms have incentives to harvest, for their own advantage, the proprietary information that vendors must disclose. In the absence of a law, unfair exploitation of this proprietary information is exceedingly difficult to detect, so vendors will not discover the harm until much too late to matter. The platform controls all the relevant information. Without a law, a suspicious vendor has no recourse to even demand discovery, let alone demand any kind of restitution.
The Solution: A Law That Provides Protection for Proprietary Information
The first instinct is to wholly prohibit platforms from collecting third-party proprietary information. The problem is that unless the vendor exposes the information to the platform, the service doesn’t work. A vendor needs to let the platform see who buys the product using the platform, when they buy the product, and other valuable information relating to sales. The platform needs to know when the product is delivered and if the product is returned. There must be some mechanism by which the platform passes payment from the buyer to the vendor. All of this requires exposing proprietary information to the platform. Furthermore, as platforms offer new services, the kind of information they may need to collect simply to provide the service may change.
Rather than trying to directly regulate what information platforms collect, the law should regulate how platforms can use the information they have collected. The paper proposes a law that imposes on digital platforms an affirmative duty to protect third-party proprietary information and sharply limits platforms’ ability to use that information for any reason other than to provide the service for which it was disclosed. So if a vendor gives a platform access to customer and sales data so that the platform can handle customer billing and product shipping, the platform cannot use the information for any other purpose, such as developing a rival product. The statute also prohibits the platform from using buyer information to which it has access to reverse engineer the vendor’s proprietary information.
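To make the use-versus-collection distinction concrete, here is a rough sketch, in Python, of the purpose-limitation idea at the heart of the proposal. The categories, names, and prohibited-use list below are illustrative assumptions of mine, not language or mechanics from the draft statute: the platform may hold the vendor’s data, but each proposed use must match a purpose for which the vendor disclosed it.

```python
# Illustrative sketch only: the categories and names below are my own,
# not language from the draft statute or the paper.

from dataclasses import dataclass

@dataclass
class VendorRecord:
    """A piece of third-party proprietary information held by a platform."""
    vendor: str
    description: str
    disclosed_for: set[str]  # purposes for which the vendor disclosed the data

# Uses that never count as "providing the service the data was disclosed for."
PROHIBITED_USES = {"develop_rival_product", "target_vendor_customers", "set_competing_price"}

def use_permitted(record: VendorRecord, proposed_use: str) -> bool:
    """A use is permitted only if it serves a purpose the vendor disclosed the data for."""
    if proposed_use in PROHIBITED_USES:
        return False
    return proposed_use in record.disclosed_for

# Example: sales data disclosed so the platform can handle billing and shipping.
sales_data = VendorRecord(
    vendor="Example Cereal Co.",
    description="per-order sales and shipping records",
    disclosed_for={"billing", "shipping", "returns"},
)

print(use_permitted(sales_data, "billing"))                # True
print(use_permitted(sales_data, "develop_rival_product"))  # False
```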
This isn’t the first time regulatory authorities have faced the problem that would-be competitors must go through a rival platform to reach would-be customers. The Federal Communications Commission faced a similar problem in the 1970s and 1980s. Sellers of “enhanced services” through the phone system, such as alarm providers and voice mail services, needed access to the customer’s phone network to provide service. This meant providing detailed proprietary information about their customers, billing practices, and technology to the phone company, which could then promptly develop a rival product with the added advantage of a pre-existing relationship with the phone subscriber. The FCC therefore imposed a series of safeguards, known as the Customer Proprietary Network Information (CPNI) rules. Congress ultimately incorporated these into the Telecommunications Act of 1996 at 47 U.S.C. §222. You can see this blog post if you want background.
In my new paper, “Mind Your Own Business: Protecting Proprietary Third-Party Information from Digital Platforms,” I adapt the existing telecom CPNI framework for digital platforms. The key word here is “adapt.” I’ve written often enough that while the history of regulating physical communications networks has a lot to teach us about regulating virtual communications networks such as digital platforms, these virtual networks are different enough from physical networks to make a huge difference in how we apply the same basic values and protections. Or, as I like to say, the proposed platform CPNI has the same relationship to existing telecom CPNI as West Side Story has to Romeo and Juliet. Yes, it’s the same story, but no one is going to mistake the Broadway musical for the Elizabethan original. As a result, the draft “platform CPNI” statute (thoughtfully included as an Appendix for all you legislative drafters) has several significant differences from the existing telecom CPNI framework.
First, I have not attempted here to include protections for personal privacy. This is a throwback to the original CPNI rule developed by the FCC in the 80s, which focused exclusively on promoting competition and used other sources of statutory authority to protect consumers (you can read this history in my 2016 Report prepared for the FCC’s broadband privacy rulemaking). PK continues to advocate for vigorous privacy protections, and we continue to believe that strong personal privacy protections (when done right) also promote competition. But the purpose of this statute is solely to promote competition, and to move the ball in a rather controlled, incremental way as a first step. So we are not complicating the proposal by adding in consumer privacy.
Importantly, this is compatible with adopting strong consumer privacy protections. Opponents of strong privacy protection often argue that it will entrench incumbents and create huge new barriers to entry; this argument gets made quite a bit about the EU’s General Data Protection Regulation (GDPR), for example. Conversely, competition-enhancing regulations are often opposed on the grounds that they will create significant privacy harms (an argument made frequently with regard to data portability). The reality is that while trade-offs sometimes exist, careful drafting can eliminate or minimize them. Platform CPNI is an example of a competition-enhancing regulation that is perfectly compatible with strong personal privacy protection.
Second, I have placed far greater limitations on a platform’s ability to use a third-party vendor’s platform CPNI than I impose on the consumer/buyer side. This reflects the different relationships and expectations of members of the public who use digital platforms such as Amazon or eBay and of vendors seeking to reach customers. Vendors would not expose their proprietary information but for the fact that they have no choice if they want to use the service. The proposal therefore prohibits the platform from using the information collected for any purpose other than facilitating the desired transaction. By contrast, members of the public have an ongoing relationship with the platform outside the scope of the specific transaction with the specific vendor. To prevent anticompetitive conduct while allowing the platform to reap the appropriate rewards of its investment and customer service, the restrictions on the use of “buyer” platform CPNI are primarily designed to prevent the platform from reverse engineering the vendor’s CPNI.
Third, I have again passed over the question of who should enforce the statute, or whether the statute should simply be self-enforcing through private rights of action. As I discuss in Chapter 8 of The Case for the Digital Platform Act, there are reasons to prefer locating the authority in an existing agency and reasons to prefer creating a new, comprehensive agency. Although we at Public Knowledge have generally advocated for the creation of a new, sector-specific agency, this statute is designed as a first step by Congress in sector-specific regulation of digital platforms. It is therefore drafted to be compatible with whichever path Congress chooses.
Finally, in the definition section of the proposed statute, I have included a working definition of digital platforms in legislative language. I draw on the definition of “digital platforms” I used last year in my book, which focuses on those features that make digital platforms behave very differently from other types of businesses (including other businesses accessed via the Internet). I include it here so others may use it.
Digital Platform means a service that—
(i) is accessed via the internet;
(ii) provides a two-sided or multi-sided market where at least one side is open to the general public and allows the public to produce and interact with content; and
(iii) permits users to:
(1) simultaneously engage in multiple activities on the platform;
(2) interact directly, or in a generally unmoderated manner, with other users of the platform;
(3) self-organize into open or closed groups where users have the freedom to share information, goods, or services with each other.
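For readers who like to see how the elements fit together, here is a rough sketch of the definition as a checklist, written in Python purely as an illustration. The field names are mine, and reading every element (including all three sub-elements of (iii)) as required is my own simplifying assumption for the sketch, not a gloss on the statutory text.

```python
# Illustrative checklist only; the field names and the assumption that every
# element must be satisfied are mine, not a gloss on the draft statute.

from dataclasses import dataclass

@dataclass
class Service:
    accessed_via_internet: bool                           # (i)
    multi_sided_market_open_to_public: bool               # (ii)
    public_can_produce_and_interact_with_content: bool    # (ii)
    simultaneous_activities: bool                         # (iii)(1)
    direct_or_unmoderated_user_interaction: bool          # (iii)(2)
    user_self_organized_groups: bool                      # (iii)(3)

def is_digital_platform(s: Service) -> bool:
    """Apply the proposed definition, reading all elements conjunctively (my assumption)."""
    return all([
        s.accessed_via_internet,
        s.multi_sided_market_open_to_public,
        s.public_can_produce_and_interact_with_content,
        s.simultaneous_activities,
        s.direct_or_unmoderated_user_interaction,
        s.user_self_organized_groups,
    ])

# A storefront-only website fails (ii) and (iii); a large marketplace with
# reviews, seller storefronts, and user groups would satisfy every element.
storefront = Service(True, False, False, False, False, False)
marketplace = Service(True, True, True, True, True, True)
print(is_digital_platform(storefront))   # False
print(is_digital_platform(marketplace))  # True
```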
Conclusion
In Europe as in the United States, we increasingly recognize that antitrust alone cannot curb the market power of dominant digital platforms or prevent the rise of new dominant platforms after the breakup of the old. The proposed “platform CPNI” regulation does not, on its own, solve the problem of introducing competition into the digital platform space. But it does offer a targeted and incremental approach to solving a pressing problem, and an excellent first step for Congress as it begins the lengthy process of promoting competition in one of the largest and most important sectors of the economy.