This is the fourth blog post in a series on regulating digital platforms. A substantially similar version of this was published by my employer Public Knowledge. You can view the full series here. You can find the previous post in this series on Wetmachine here.
“Customer proprietary network information,” usually abbreviated as “CPNI,” refers to a very specific set of privacy regulations governing telecommunications providers (codified at 47 U.S.C. §222) and enforced by the Federal Communications Commission (FCC). But while CPNI provides some of the strongest consumer privacy protections in federal law, it does much more than that. CPNI plays an important role in promoting competition for telecommunications services and for services that require access to the underlying telecommunications network, such as alarm services. To be clear, CPNI is neither a replacement for general privacy law nor a substitute for competition policy. Rather, these rules prohibit telecommunications providers from taking advantage of their position as a two-sided platform. As explained below, CPNI prevents telecommunications carriers from exploiting the data that customers and competitors must disclose to the carrier for the system to work.
All of which brings us to our first concrete regulatory proposal for digital platforms. As I discuss below, the same concerns that prompted the FCC to invent CPNI rules in the 1980s and Congress to expand them in the 1990s apply to digital platforms today. First, because providers of potentially competing services must expose proprietary information to the platform for the service to work, platform operators can use their rivals’ proprietary information to offer competing services. If someone sells novelty toothbrushes through Amazon, Amazon can track whether the product is selling well and use that information to make its own competing toothbrushes.
Second, the platform operator can compromise consumer privacy without access to the content of the communication by harvesting all sorts of information about the communication and the customer generally. For example, if I’m a mobile phone platform or service, I can tell if you are calling your mother every day like a good child should, or if you are letting her sit all alone in the dark, and whether you are having a long conversation or just blowing her off with a 30-second call. Because while I know you are so busy up in college with all your important gaming and fraternity business, would it kill you to call the woman who carried you for nine months and nearly died giving birth to you? And no, a text does not count. What, you can’t actually take the time to call and have a real conversation? I can see by tracking your iPhone that you clearly have time to hang out at your fraternity with your friends and go see Teen Titans Go To The Movies five times this week, but you don’t have time to call your mother?
As you can see, both to protect consumer privacy and to promote competition and innovation, we should adopt a version of CPNI for digital platforms. And call your mother more often. I’m just saying.
Once again, before I dig into the substance, I warn readers that I do not intend to address either whether the regulation should apply exclusively to dominant platforms or what federal agency (if any) should enforce these regulations. Instead, in an utterly unheard-of approach for Policyland, I want to delve into the substance of why we need real CPNI for digital platforms and what that would look like.
A History of CPNI and How It Differs From Traditional FTC Consumer Privacy Protection.
For anyone interested in the super-duper insanely long history of CPNI, a somewhat shorter history of the evolution of privacy regulation in the United States, and how CPNI differs from anything the Federal Trade Commission (FTC) does with regard to privacy, I refer you to my 2016 100-page white paper on CPNI and my much smaller 2017 paper on principles for effective privacy regulation. Anyone interested in supporting footnotes and arguments should refer to these sources, which will save me enormous amounts of time digging up all the links.
A Much Shorter Summary Relevant to Why We Should Come Up With a Version of CPNI for Digital Platforms.
Consumer privacy in electronic communications predates the Communications Act of 1934. Specific provisions such as 47 U.S.C. §605 (prohibition on publishing or intercepting any electronic communication) and the Cable Privacy Act, codified at 47 U.S.C. §551, reflect traditional concerns that people and businesses need to have confidence that their private communications will remain genuinely private. Indeed, in some situations the very existence of a communication, let alone the address information or information about contents, can be either personally or commercially sensitive.
How to Compete With a Platform that Must Know Your Proprietary Information and Must Cooperate With the Competitor to Make Competition Happen?
In the 1970s and 1980s, in what seemed an utterly unrelated development, the FCC began opening up the traditional telephone network to competition on multiple levels. This included requiring incumbent local carriers to interconnect with rival carriers, deliver calls from a competing network to the incumbent’s customers (and vice versa), and generally let these competitors access customers on a carrier’s own physical network. In addition, in a set of orders called the “Computer Inquiries,” the FCC required the telephone companies to provide wholesale access to their networks for providers of “enhanced services.”
Whether or not the telephone company offered a competing “enhanced service,” nothing prevented the carrier from learning everything about the enhanced services offered over its networks and then offering its own competing services (with the additional ability to favor its own affiliated offering over that of the unaffiliated enhanced service provider). To take an example, suppose I want to start an alarm service that will send a signal to an alarm center and call the police or fire department if a burglar alarm or smoke alarm is triggered in the customer’s house. To do that, I have to have access to the customer’s phone wiring. I need to plug my system into the phone network, and have the phone network send the call to the alarm center when the alarm goes off. That is impossible without the cooperation of the phone company. Furthermore, in order to make the system work with the phone system, I not only have to reveal to the phone company that this telephone subscriber is an alarm service subscriber, but I also have to reveal to the telephone company all kinds of details about how my alarm technology works.
As an alarm company, I regard all this information as proprietary — and with good reason. The phone company can add up how many customers I (and any rival alarm companies) have, and determine whether or not there is sufficient demand to start its own alarm service. By learning the details about how my technology and network routing work, the phone company can easily replicate this for its own rival service. It can then use its knowledge of which customers are my customers to offer them special deals to leave my alarm service and sign up for the phone company’s service. Indeed, the phone company doesn’t even have to wait for me to start serving the customer. Once I tell the phone company, “I need to connect a customer at this address using this phone number,” the phone company knows that this subscriber is interested in the service and can market directly to the customer even before I start providing service. Alternatively, if I am getting customer information to pull a customer away from the carrier (say, to transfer their phone service to my competing voice service), the carrier can reach out to the customer to try to prevent the customer from switching. (Search for “Verizon Retention Marketing” and “FCC order” to see some examples.)
This isn’t a question of dominant platforms exercising market power to force information out of rivals. If I refuse to reveal the proprietary network information, then the service will not work. On the other hand, a blanket “don’t reveal to anyone” rule doesn’t work either. As the FCC discovered, carriers are quite happy to refuse rivals the information those rivals need to serve customers, all in the name of protecting customer privacy.
The FCC created the precursor to the CPNI rules to address this issue. The rules prohibited a carrier from using the information revealed to it by another carrier or enhanced service provider if the new competitor had revealed that information in order to provide service to a carrier’s customer in the first place. Basically, the rules prevented a carrier like Verizon from acting on information provided by a competing carrier, like AT&T, or another business, like ADT Security, that relies on the phone carrier’s network. Additionally, the rules required a carrier to provide information to a competitor when so directed by the customer. In other words, if I tell the phone company, “I’ve decided to go with a competing alarm company, so give them my phone information so they can provide me with service,” the FCC required the phone company to honor my request.
This was only one part of a comprehensive regime of structural separation and other safeguards (ultimately including the breakup of AT&T) that helped to create a massive new industry in products and services that relied on connection to the phone system. (This included something called the “dial-up modem” that directly led to the modern internet, but that’s a story for another time.) It tends to get overlooked in importance compared to the Computer Inquiries, but it remains a critically important contributor to enabling competition through telecommunications networks.
The Consumer Privacy Aspect of CPNI.
In the early 1990s, Congress was busy working on what would ultimately become the Telecommunications Act of 1996, one of the landmark revisions of the Communications Act. A major focus of the Telecom Act was to codify (and in some cases significantly modify) the various pro-competitive regulations adopted by the FCC as part of the general introduction of competition into the phone network and associated markets, which began in the late 1960s/early 1970s. This included incorporating the basic rules for governing proprietary network information discussed above.
Then-Representative (now-Senator) Ed Markey, who combines amazing tech savvy with a deep commitment to consumer protection, added a very substantive consumer privacy piece to CPNI in the House of Representatives. The generic privacy piece in the Communications Act provided broad protection to the actual data transmitted (including the existence of the communication itself). But it did not stop telephone providers from collecting and exploiting all sorts of useful information about the customer or about the nature of the communication. While what we would now call “big data” was still in its primitive stages, Markey recognized that phone companies could use or sell lots of data relating to customers that customers would have no choice but to reveal to their phone company. Since telephone service is considered an essential service necessary to participate in society (i.e., a utility), Markey decided it was unfair to force people to allow the phone company to exploit their personal information as a requirement for getting telephone service. The House version therefore added new privacy protections covering specific types of information such as equipment, network configuration, and other types of “metadata.”
The provision governing CPNI underwent some modification in conference between the House and Senate to produce the current 47 U.S.C. §222 rule, “Privacy of Customer Information.” Notably, the final version expanded the consumer privacy piece to include all telecommunications (not merely telephone) and made other tweaks to create a broader and more unified statutory scheme governing all customer proprietary network information. The statute strikes a balance between consumer control of information, protecting proprietary information of competitors or potential competitors, and creating exceptions for the beneficial use of data for provision of the service, network upgrades, and certain types of research.
Many of the same issues that prompted the creation of CPNI exist in the platform space as well. In addition to the consumer privacy piece, many platforms rely on third-party providers of content or services that must provide the platform with proprietary information to reach the customer. YouTube must “know” who views any specific content. Amazon must “know” who buys third-party products sold through its platform. If and when the platform decides to expand into content or product production, it can treat all the proprietary information it has gathered from its potential rivals on the platform as market research.
Because these problems parallel the same problems faced in the telecommunications world, it is useful to break down the existing CPNI statute in greater detail and consider how we might apply these rules to digital platforms.
How Existing Telecom CPNI Protections Work.
Section 222 of the Communications Act of 1934 (as amended) begins with a general statement of responsibility of a carrier to protect “the confidentiality of any information” disclosed by customers, other telecommunications carriers, and equipment manufacturers. (Contrary to what carriers subject to Section 222 have argued in the last few years, this provision imposes a general duty and is not merely some sort of meaningless introductory section.) Note that “confidential information” is a broader category than either “proprietary information” or even the more specific “customer proprietary network information.”
Section 222(b) limits how a carrier can use “proprietary information” revealed to it by another carrier if the rival carrier revealed that information in order to provide a telecommunications service. The provision includes a very specific prohibition on the receiving carrier using the information “for its own marketing purposes.”
Section 222(c) focuses on consumer privacy, notably “customer proprietary network information.” The statute limits a carrier to using any CPNI collected from a customer for (a) provision of the telecom service to which the customer subscribes; and (b) services necessary to, or used in, providing that service. Section 222(h) defines CPNI as information that “relates to the quantity, technical configuration, type, destination, location, and amount of use of a telecommunications service subscribed to by any customer of a telecommunications carrier, and that is made available to the carrier by the customer solely by virtue of the carrier-customer relationship.” What precisely this covers, for example whether it includes things that folks in the privacy community call “personally identifiable information” (PII), was a matter of considerable debate during the 2016 broadband privacy rulemaking. It’s sufficient for our purposes to observe that it covers lots of technical information related to the use of the communications service which can be quite revealing and which does not generally fall into the category of PII.
The statute also includes a bunch of exceptions that answer most of the objections people tend to raise, like “how can you run a network without collecting information on the customer?” Because of course nobody could possibly have thought of that question before, which is why our telephone system hasn’t worked since 1997. Oh wait, it has. Because the statute allows carriers to use the information for billing purposes, for provision of the service (including compliance with any terms of service), to protect the network (which covers all your objections about how this would make cybersecurity or spam management impossible), and pretty much anything else you are likely to raise as somehow making the rules impossible to apply.
Additionally, the statute allows carriers to aggregate personal information and use or disclose the “aggregate information.” Again, we had a fine debate in 2016 over whether the concept of “aggregate information” includes “anonymized information,” but let’s pass over that for now. What everyone agrees on is that what Section 222(h) defines as “aggregate information” can be collected and used by the carrier and divulged to others in aggregate form.
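To make the distinction concrete, here is a minimal sketch in Python of the difference between individual records and “aggregate information.” It is entirely my own construction: the minimum-group-size threshold is an illustrative safeguard against re-identifying individuals (a nod to the aggregation-versus-anonymization debate), not anything Section 222 actually specifies.

```python
# Collapse per-customer usage records into group-level totals, releasing
# only collective data with individual identities stripped out.
# MIN_GROUP_SIZE is a hypothetical safeguard: groups too small to hide
# an individual are withheld entirely.
from collections import defaultdict

MIN_GROUP_SIZE = 50

def aggregate_usage(records, group_key):
    """records: dicts like {"customer_id": ..., "region": ..., "minutes": ...}
    group_key: the field to aggregate by (e.g., "region"), never "customer_id"."""
    totals = defaultdict(lambda: {"customers": set(), "minutes": 0})
    for r in records:
        bucket = totals[r[group_key]]
        bucket["customers"].add(r["customer_id"])  # counted below, then discarded
        bucket["minutes"] += r["minutes"]
    # Only group-level numbers leave this function; no identifier survives.
    return {
        group: {"subscriber_count": len(b["customers"]), "total_minutes": b["minutes"]}
        for group, b in totals.items()
        if len(b["customers"]) >= MIN_GROUP_SIZE
    }

records = [{"customer_id": f"c{i}", "region": "north", "minutes": 100} for i in range(60)]
print(aggregate_usage(records, "region"))
# {'north': {'subscriber_count': 60, 'total_minutes': 6000}}
```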
Finally, Section 222(c)(2) requires that the carrier disclose a customer’s CPNI to “any person designated by the customer” if the customer submits a written request to the carrier ordering the disclosure of the information. This serves two purposes. First, as noted above, it requires a carrier to cooperate with a potential or actual competitor when requested by the customer. Second, it allows the customer to find out what CPNI the carrier has in its possession. If I request the carrier provide me with all CPNI about me in its possession, the carrier is required by law to comply.
Applying These Principles to Digital Platforms.
While it is clear that application of both the pro-competitive and strong consumer privacy rules of CPNI to platforms is a Good Idea™, this does not mean a simple cut and paste. As always, communications law provides useful history and concepts for digital platform regulation, but we need to be mindful of the very real differences between what carriers do and what platforms do — and how they do it. Telecommunications services are common carrier point-to-point transmission of data, without any change in the information by the carrier (see 47 U.S.C. §153(53)), shared solely by the parties to the transmission. That’s the whole “expectation of privacy” thing the Supreme Court recognized in Katz v. United States. Platforms obviously work very differently in everything from the nature of the services provided and the relationship with the customers to the technology used and the types of information collected, along with a host of other factors. Additionally, given 20 years’ experience with CPNI, we could do a lot to clarify issues that have emerged, such as anonymization v. aggregation.
Accordingly, at this stage, rather than try to draft legislation on some foolish theory of false equivalency or meaningless slogan such as “leveling the playing field” (this is real life, not a sports event), let us consider what principles from CPNI we would want to apply to a hypothetical digital platform CPNI statute.
Distinguish between what the platform needs to know to provide service and what the platform “knows”
Human beings tend to anthropomorphize a lot. In plain English, we tend to project human traits onto things (including inanimate objects and even abstractions) and respond to these projections emotionally. That’s fine when dealing with your cat or with your car. In Policyland, this tendency to anthropomorphize frequently leads to really poor policy choices because people tend to think of companies such as Amazon or Google as people they either like or hate and decide policy issues accordingly. It also means that people tend to think of companies as having human-like “thinking” and decisionmaking processes. It permeates our language (including mine), and influences our thinking.
I raise this because one of the key insights of CPNI is to regulate the use of information rather than the collection of information. This recognizes that a network operator is not an individual human being that either “knows” something or doesn’t “know” it. It is quite possible to build systems that do not share information with each other, limiting access to individual information to those systems and purposes permitted by law and consistent with privacy. An example is the way Apple encrypts its iPhones so that Apple itself cannot break the encryption without massively hacking the phone. Apple has set up the phone encryption so that the iPhone “knows” the user password in the sense that when the user enters the password, the phone makes the appropriate functions or information accessible. But Apple doesn’t “know” the password, because the iPhone is designed to prevent Apple from having any way to access that information. Mind you, as the security establishment keeps reminding Apple, the company could also design the iPhone to give itself access to this information regardless of whether the iPhone’s owner wants that. But once Apple makes and implements the design choice, it doesn’t matter what Apple “wants” going forward. No one at Apple can access the password without the customer sharing it or totally hacking the phone.
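To make this “use versus collection” point concrete, here is a minimal sketch, entirely my own construction rather than any statute’s or company’s actual design, of use limitation enforced in code. Every record is tagged at collection with the purposes for which it may be used, and every read must declare a purpose; the class and method names are hypothetical:

```python
# Every record carries the purposes declared when it was collected.
# Any attempt to read it for another purpose fails, no matter who asks.

PERMITTED = {"billing", "service_provision", "network_protection"}  # hypothetical allow-list

class PurposeLimitedStore:
    def __init__(self):
        self._records = {}  # customer_id -> (data, allowed_purposes)

    def collect(self, customer_id, data, allowed_purposes):
        bad = set(allowed_purposes) - PERMITTED
        if bad:
            raise ValueError(f"cannot collect for non-permitted purposes: {bad}")
        self._records[customer_id] = (data, frozenset(allowed_purposes))

    def access(self, customer_id, purpose):
        data, allowed = self._records[customer_id]
        if purpose not in allowed:
            # The system "has" the data, but nobody can "know" it outside
            # the purposes for which it was collected.
            raise PermissionError(f"{purpose!r} is not a permitted use")
        return data

store = PurposeLimitedStore()
store.collect("cust-42", {"minutes_used": 310}, {"billing"})
print(store.access("cust-42", "billing"))   # works: {'minutes_used': 310}
# store.access("cust-42", "marketing")      # raises PermissionError
```

The design point is the same as Apple’s: once the restriction is built and deployed, the data sits in the system, but no one can “know” it for a non-permitted purpose without re-engineering the system itself.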
Similarly, we can require companies to structure their networks so that they collect information necessary to provide the service (or provide other functions, such as targeted advertising) without the ability to share this information with other systems or with any actual persons. In fact, if you ask Facebook, Google, and most other companies that provide targeted advertising, they will tell you that’s what they do now (to some degree at least). Advertisers say, “I want you to target single mothers between the ages of 15 and 20 who are left-handed,” and these companies go, “no problem.” The ads appear, but the advertiser never learns the names of the people who saw the ad.
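Here is a similarly hedged sketch of that advertising model, with invented user records standing in for the platform’s internal data. The advertiser hands over targeting criteria; the platform does the matching against data only it can see; and only aggregate results come back out:

```python
# Hypothetical internal records the advertiser never sees.
USERS = [
    {"id": "u1", "age": 17, "parent": True, "handedness": "left"},
    {"id": "u2", "age": 34, "parent": False, "handedness": "right"},
]

def serve_campaign(criteria, users=USERS):
    """Match targeting criteria internally; report only what the advertiser may learn."""
    matched = [u for u in users if all(u.get(k) == v for k, v in criteria.items())]
    # The platform delivers the ads to the matched users itself (not modeled here);
    # identities stay inside the platform. Only a count leaves.
    return {"impressions": len(matched)}

print(serve_campaign({"parent": True, "handedness": "left"}))  # {'impressions': 1}
```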
What worries some about the information collected is: (a) the purpose, if you don’t like targeted advertising; (b) the level of information that is available to the advertisers, such as geographic location; but, most importantly, (c) that it is entirely up to whoever is collecting the information to decide how to use it. Yes, it is understood that companies are ‘bound by terms of service’ and the FTC can enforce adherence to those terms of service… blah blah blah. Companies routinely create privacy policies that give them virtually unfettered discretion, coupled with the ability to change the terms of service at any time. CPNI imposes enforceable limits on how companies use the information they collect, limiting these uses to those we decide as a country are consistent with the public interest. Put more simply, CPNI doesn’t simply prevent “disclosure.” It prevents those subject to the CPNI rules from “knowing” the information collected, except for the express and permissible purposes for which it was collected.
Limit the ability to collect information from third-party providers of content or services that use the platform to reach customers
As I stressed above, one of the most critical aspects of CPNI is also one of the most overlooked. CPNI affirmatively promotes competition by protecting the information that potential competitors must provide to the carrier in order to reach the carrier’s subscriber and provide the service. This does not prevent the carrier from “knowing” the information for acceptable purposes. For example, if the carrier is collecting the money for the third party by putting the fee on the subscriber’s bill, the carrier certainly “knows” all the information necessary to collect from the customer and remit to the third-party provider. But it cannot use that information for any other purpose.
Recently, a spate of articles has observed that Amazon uses the information it collects as part of its third-party vendor program to develop its own line of competing products. Application of a CPNI regime would prevent Amazon (or any other digital platform) from using the information collected to promote its own products or unfairly compete with third parties using its own platform. And, to anticipate the usual objections, nothing about this would prevent the digital platform from policing third-party content, products, or services to prevent anything dangerous, nasty, inappropriate, or contrary to the policy of the platform. Again, the critical distinction is between the information the platform merely collects and may use only for limited purposes, and the information the platform fully “knows” because it enjoys unlimited discretion to use whatever it collects.
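As another hypothetical sketch, my own invention and emphatically not Amazon’s actual architecture, structural separation can be written directly into a system’s interfaces. The marketplace ledger below holds third-party vendors’ sales data but exposes only the one function needed to remit payment, so no private-label product team can query sales trends out of it:

```python
class MarketplaceLedger:
    """Holds third-party vendors' proprietary sales data internally."""

    def __init__(self):
        self._sales = []  # list of (vendor, product, price) tuples

    def record_sale(self, vendor, product, price):
        self._sales.append((vendor, product, price))

    def remittance_due(self, vendor):
        # The one permitted use: totaling what the platform owes the vendor.
        return sum(price for v, _, price in self._sales if v == vendor)

    # Deliberately absent: best_sellers(), sales_by_product(), demand_trends().
    # What the interface does not expose, the rest of the company cannot "know".

ledger = MarketplaceLedger()
ledger.record_sale("BrushCo", "novelty toothbrush", 9.99)
ledger.record_sale("BrushCo", "novelty toothbrush", 9.99)
print(ledger.remittance_due("BrushCo"))  # 19.98, all the outside world can learn
```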
Give consumers control over their personal information by allowing them to control disclosure
CPNI requires companies to disclose information to anyone when directed in writing by the customer to do so. This both protects consumers and promotes competition. In the digital platform arena, two ideas have long had currency. First, the idea of allowing consumers to do an “information audit” or “privacy audit” of companies to determine what information about themselves the company stores. Second, requiring digital platforms to make the social graph of individual users portable to competing platforms as a means of helping competitors overcome the advantages of the dominant digital platform’s mammoth customer base. (In theory, allowing customers to freely move their social graph from one platform to another, in an easily usable form, lowers the switching cost and provides necessary data for the rival platform so it can connect you in the same way as the existing platform.)
CPNI potentially enables both these concepts. In doing so, we must recognize that calling it CPNI does not magically solve the problem of implementation. Traditional CPNI primarily involves information about myself. By definition, a social graph includes information about others — who may not consent to my sharing that information. Additionally, the common carrier/telecommunications universe made it relatively easy to create a common set of standards that made porting information from one provider to another a straightforward process. Digital platforms, which vary enormously in what they do and how they do it, have no comparable common standards.
Nevertheless, we should embrace the idea that strong CPNI for digital platforms should include the ability to compel disclosure of the data collected about me, both to myself and to any third party I explicitly designate. As with CPNI in the telecom universe, this will both protect user privacy and promote competition.
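As a sketch of how such customer-directed disclosure might look in practice, with the data layout, function name, and recipient all invented for illustration, the platform verifies the customer’s request and then exports everything it holds about that customer in a machine-readable form:

```python
import json

# Hypothetical stored data for one user. Note that the social graph
# implicates other people, which is exactly the implementation wrinkle
# discussed above.
PLATFORM_DATA = {
    "user-7": {
        "profile": {"name": "Alex", "joined": "2015-03-02"},
        "social_graph": ["user-3", "user-19", "user-44"],
        "activity": [{"item": "novelty toothbrush", "action": "purchased"}],
    }
}

def export_cpni(user_id, designated_recipient, request_verified):
    """Honor a written, verified customer request to disclose the user's data."""
    if not request_verified:
        raise PermissionError("disclosure requires a verified customer request")
    payload = json.dumps(PLATFORM_DATA[user_id], indent=2)
    # A real system would transmit this securely to the recipient;
    # here we just label where it is going.
    return {"recipient": designated_recipient, "export": payload}

result = export_cpni("user-7", "rival-platform.example", request_verified=True)
print(result["export"])
```

A standardized, machine-readable export of this kind is what would make social graph portability workable in practice, though, as noted above, exporting a social graph also implicates the privacy of everyone else in it.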
Create a general expectation of privacy with a narrow list of exceptions, rather than a general expectation of use with a narrow list of privacy protections
Finally, the CPNI statute largely avoids the confusing and artificial distinction made between “sensitive” and “non-sensitive” information. As I explain at great length in my Privacy Principles white paper, this is an entirely artificial distinction the FTC made up due to the limits of its authority under the Federal Trade Commission Act. Companies, of course, love it, because it limits even further the privacy protections enjoyed by consumers. Only some artificial class of “sensitive” information is protected by an opt-in requirement, while the vastly larger class of “non-sensitive” information is collectible and usable by default unless the consumer affirmatively opts out.
CPNI reverses the FTC presumption that personal information is generally collectable and usable so long as the company provides some sort of disclosure and an opt-out option, with only narrow exceptions requiring affirmative opt-in. Instead (for information covered by 47 U.S.C. §222, as well as information covered by the general privacy protection of 47 U.S.C. §605), CPNI imposes a general presumption that information may not be used, even for internal purposes, without the express opt-in of the subscriber, subject to very specific and narrow exceptions specified in the statute. The result is a far more rigorous and robust privacy regime that is oriented toward protecting the privacy of the consumer rather than maximizing the flexibility of the platform.
Conclusion
CPNI is much more than privacy for telecom providers. CPNI is a detailed regulatory regime designed to promote competition and to prioritize protection of consumers rather than maximize the flexibility of platforms. It does not cover all aspects of privacy protection, nor is it the sole means of promoting competition. But just as CPNI proved a very effective complement to the FCC’s overall efforts to promote competition and the other privacy provisions in the Communications Act, application of CPNI to digital platforms will significantly advance both the cause of personal privacy and the ability of platform competitors to reach customers on the platform.
The history of CPNI demonstrates that digital networks, for all their technological sophistication and complexity, can be designed to foster and protect privacy as well as to enable information collection and predictive analytics. We can, as a society, choose how to limit the ways that platforms use the data they collect from us. Certainly, we should be mindful of the implications of whatever policies we choose. But the 20+ year history of CPNI demonstrates that we can create a vigorous set of privacy rules that simultaneously protects personal privacy and promotes competition, without imposing any undue burden on innovation or provision of service.
Stay tuned . . .