Those interested in a great eyewitness account of what happened at the FCC hearing in Boston on February 25 should read fellow Wetmachiner John Sundman’s piece on the part he saw (including the reception afterwards). But after listening to the FCC’s video archive, reading the statements, and reading the coverage, I’m willing to read the Boston Tea Leaves and see where we are so far and how I think this ends up.
Speculation below . . . .
Short Version:
Even Copps and Adelstein remain cautious about going to full rules at this early stage (and, I would add, under this Administration). They would rather expand the policy statement to include a non-discrimination principle, build a stronger case, and wait for what they hope will be better days in 2009. I must hasten to add this is not a criticism. I think they are both, in their own way, being as aggressive as possible in moving the ball forward (as demonstrated by their standing firm on the AT&T/BellSouth merger conditions in ’06).
McDowell, true to form, fears government intervention more than the exercise of market power. Tellingly, however, even he gave a nod to consumer protection principles and did not question the FCC’s authority to act if necessary — contenting himself with questioning the wisdom of government intervention and citing the Congressional statement of policy in Section 230 of the Telecommunications Act of 1996 as if non-interference were the Prime Directive given by the United Federation of Planets to the FCC. (See, I told you this policy stuff matters.) His questions about developers labeling applications (more on this below) demonstrate a desire to equitably apportion the burden of consumer protection as McDowell apparently sees it. Should developers have responsibilities (and if so, can the FCC impose them), or does the responsibility for handling any potential problems caused by subscribers using an application fall squarely on the ISP? And if the responsibility falls solely on the ISP, how much should the FCC “tie the hands” of the broadband access provider? As it happens, for reasons I will explain below, I think McDowell has the wrong framework and, as a consequence, is going in the wrong direction.
Tate, meanwhile, introduced a whole bunch of new stuff that shows the AT&T folks have been bending her ear about how ISP filtering can improve everyone’s lives. I don’t have much to say about that, other than I think it would have catastrophic impact on just about every aspect of the internet economy, ultimately rebounding to hurt the content companies themselves. But that is such a lengthy discussion that it needs a post of its own, and others (such as the folks at Public Knowledge) have made the argument before.
Finally, we come to Kevin Martin, who is very slowly revealing his hand on this. As usual, Martin measures his words carefully. But his statement and questions show him heading toward a broad assertion of authority and a narrow application to the facts. Specifically, he gives every indication of wanting to address transparency and ensure that subscribers ‘get what they pay for’ and ‘know the rules.’ At the same time, he does not appear to regard this as a widespread problem — yet. Certainly he invited Marvin Ammori and Tim Wu to elaborate on why disclosure is not enough. But he also heard from Yoo on the perils of regulating — notably Yoo’s argument that network neutrality will raise the cost of providing service. And, for all that I believe Martin differs from true neo-cons like Powell and McDowell in recognizing the possibility of market power and that limited competition may not be enough, Martin still has the Republican reluctance to regulate — particularly prophylactically.
While it is still too early to tell, my gut feeling is that Martin is leaning toward a broad assertion of FCC authority, sanctioning (or at least censuring) Comcast for its deceptive conduct in trying to hide the truth from its customers, and giving a stern warning to all other broadband access providers that the FCC will take a dim view of targeting particular applications. Martin might even propose something stronger in the way of guidance on what constitutes “reasonable network management,” in the same way the FCC provided general guidance but few specific rules for the “open access/wireless Carterfone” C Block condition. But I expect rejection of Copps’ fifth principle and Adelstein’s Declaration of Independence as too soon/not supported by the facts/record too mixed to say anything for certain.
Depending on the language and the fine/censure to Comcast, such an approach would probably find a middle ground between the Rs and the Ds. The Rs would concur in the reluctance to fine Comcast when (they would say) notice was ambiguous, while the Ds would concur in the result but say the Order did not go nearly far enough and that the FCC will now be mired in endless case-by-case adjudications when it is obvious we need a clear rule.
Long Version:
Yes, of course there’s a longer version. You think I can limit myself when there are so many subtle nuances to discuss here? But for those in a rush, the summary above will do.
Panel I
I hesitate to start throwing around compliments for fear of offending by omission. So I will simply content myself with saying that I thought Copps, Adelstein and Markey were all brilliant in their own ways. Likewise, I must commend Marvin Ammori — lead counsel on our complaint against Comcast — for his brilliant presentation and his ability to think on his feet. And do I really need to explain what an incredible asset to our team Tim Wu is? I didn’t think so. Nor should I omit Yochai Benkler, who rightly pointed out that the real problem was our lack of competition and that we needed to return to policies that actually facilitated competition rather than rely on the “second best” policy of network neutrality and the hideous enforcement issues that come with it.
That said, I also applaud Chris Yoo for grace under fire and for scoring a number of good points. Yoo’s basic criticism (as I understand it from his presentation) is that efforts to impose a uniform rule on network neutrality will impose costs on systems, and that this may prove fatal to broadband deployment where the expense of deployment already makes returns marginal (such as in rural areas). God willing, if I have time, I will try to respond to Yoo’s arguments in a separate post. Suffice it to say that while I believe the balance for public policy comes out in favor of network neutrality (or — as Yochai Benkler pointed out — real industrial policy that creates more broadband capacity and more competition), Yoo’s arguments are serious, substantive, and not to be dismissed with a wave of the hand. At the end of the day, however, public policy is about trade-offs. We may simply have to balance the potential benefits and harms of net neutrality against giving broadband access providers free rein, and accept that we have costs either way.
I should observe that Yoo also expressed concern about the lack of transparency of Comcast’s behavior. So while Yoo is hardly out to regulate, he appears to recognize that allowing Comcast to act deceptively creates real issues.
I wish the MA Rep and the fellow from Vuze had more of a chance to talk about the practicalities. That’s always the problem with these hearings — there are a lot of people to hear from and not enough time. I think three panels, rather than two, might have been a better way to go, but it was a long day.
This brings us to Tom Tauke of Verizon and David Cohen of Comcast. Tauke did his job well under trying circumstances. He sought to distance himself from Comcast and the whole purpose of the hearing, arguing that the business with short codes and our Petition on text messaging raises very different issues from the Comcast/BitTorrent fight. I would say he was generally successful, and thus avoided drawing any fire.
I also want to unpack his very lawyerly response at the end, when Martin asked if the FCC had the authority or if “Markey was right and we need legislation.” [Note how Martin framed this to give Tauke and Cohen the choice between FCC authority or legislation.] Tauke said “we see no reason to challenge the FCC’s assertion of authority.” Mind you, that is neither a yes nor a no. What Tauke said was that they saw no need to challenge the FCC’s authority to enforce the policy statement, because the Petition involving his issues doesn’t touch on the policy statement — it’s a straightforward question of how to classify text messages and short codes under the Communications Act. So while he sounded like he was agreeing with Martin that the FCC has authority to enforce the policy statement, he didn’t actually concede the point.
Which brings us to David Cohen of Comcast. He was in a hard place and knew it. His response was the usual effort of a lawyer in a hard place — obfuscate the issue and hope for allies. But even Yoo, who ideologically agrees with the broader proposition that the FCC should not regulate in this area, could not be described as a friend of Comcast willing to bail Cohen out when the FCC panelists pressed. Nor did any of the Commissioners seem inclined to rescue Cohen on the critical question of disclosure.
Cohen tried to cloud the issue by holding up Comcast’s recent change in its terms of service as proof of disclosure. But Martin made clear he was having none of it, inviting Marvin to explain that, at the time of the offense, Comcast not only failed to provide even this minimal level of disclosure but actively tried to deceive users as to whether it engaged in the practices identified by the AP. Nor did Cohen find much sympathy for his position that Comcast was merely “delaying” delivery of BitTorrent uploads.
The closest Cohen came to a friend was McDowell’s probe as to whether application providers had some kind of shared responsibility to disclose potential impacts of the applications on the network, and Tate’s hope that everyone can get together and work it all out. Both of these are — in my opinion — utterly wrong approaches to the issue.
Let’s take McDowell’s formulation. McDowell’s question assumes a model of application development that simply doesn’t exist. For McDowell’s plan to work, you’d need to (a) have applications developed like products available in stores or sold via network providers, and (b) have developers who understand their applications’ impact on networks. Neither reflects reality. The line between applications, services, and individuals tweaking their own software and systems is extremely blurry. Especially in the case of free, open source software, the traditional concept of a developer and distributor who can place a label on the package just doesn’t hold. Nor can the people working on these projects adequately predict how their work will interact with networks, given that network operators keep their traffic management operations and other necessary data secret. Worse, unlike a piece of software I can design for either the Apple or MS operating system, I often have no idea what these programs or applications will do in the aggregate once unleashed on the world. That’s what makes this so innovative.
Consider a few examples. Network Address Translation (NAT) is an application (more like a bunch of different applications) that grew up at the edges of the network to handle a very real problem: IP address exhaustion. It allows you to take a single IP address and use it for a whole bunch of machines behind the NAT box. You can find a lot of people who dislike NAT because of the perceived impact it has on the end-to-end principle and on network security. It makes use of some protocols difficult, but people have various workarounds.
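For readers who want a feel for what NAT actually does, here is a toy sketch of the translation-table idea (my own illustration, not anything presented at the hearing). Real NAT lives inside routers and handles timeouts, multiple protocols, and countless edge cases this ignores.

```python
# Toy sketch of the core NAT idea: many private (host, port) pairs share
# one public IP; the NAT box rewrites outgoing source addresses and
# remembers the mapping so replies can find their way back.
PUBLIC_IP = "203.0.113.7"  # example address from the RFC 5737 documentation range

class ToyNat:
    def __init__(self):
        self.out_map = {}    # (private_ip, private_port) -> public_port
        self.in_map = {}     # public_port -> (private_ip, private_port)
        self.next_port = 40000

    def outbound(self, private_ip, private_port):
        """Rewrite an outgoing packet's source to the shared public IP."""
        key = (private_ip, private_port)
        if key not in self.out_map:
            self.out_map[key] = self.next_port
            self.in_map[self.next_port] = key
            self.next_port += 1
        return PUBLIC_IP, self.out_map[key]

    def inbound(self, public_port):
        """Map a reply arriving at the public IP back to the private host.
        Returns None for unsolicited traffic, which is exactly why NAT
        frustrates protocols that expect to reach a host directly."""
        return self.in_map.get(public_port)

nat = ToyNat()
print(nat.outbound("192.168.1.10", 5000))  # ('203.0.113.7', 40000)
print(nat.outbound("192.168.1.11", 5000))  # ('203.0.113.7', 40001)
print(nat.inbound(40001))                  # ('192.168.1.11', 5000)
```

Note the last method: an inbound packet with no existing mapping simply has nowhere to go. That is the end-to-end complaint in miniature.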
But there is no central developer or distributor who could somehow label NAT or describe its impact on broader networks and on users. For one thing, people couldn’t even begin to figure that out until NAT boxes started deploying at the edge. Only once NAT deployment spread could you start seeing the impacts. Network administrators share this information around, but trying to explain it to users would be difficult, if not impossible. Nor could network operators employing NAT do so in a consistent manner.
Example 2: BitTorrent itself. BitTorrent was developed as an open source solution to overcome bottlenecks at the edge. Then businesses came along and started to tweak it and use it. And so did end users. But no one could predict its impact on networks and network congestion until it became more broadly available and the impact in the aggregate began to be felt. Even now, no one can say what the real impact is, because the broadband access providers keep all this information secret and unverifiable. As the folks on the second panel noted, statements like “5% of users create 95% of traffic, and all because of that evil BitTorrent” are unproven, unverifiable in the absence of additional disclosure or permission to conduct testing, and should therefore be treated as highly suspect. But even if you believe folks like Comcast when they say such things, it is not possible to force application providers to “give notice” about it, because there is no way the application or service providers can know it.
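The arithmetic behind such soundbites is trivially easy to produce and nearly impossible for outsiders to check. A toy calculation with invented numbers (mine, not Comcast’s) shows how a skewed usage distribution yields a dramatic-sounding headline; verifying the real figure would require per-subscriber data that only the access provider holds.

```python
# Toy arithmetic with invented numbers: a skewed usage distribution
# produces a "few users generate most traffic" soundbite with ease.
# Checking any real-world version of the claim requires per-subscriber
# traffic data that only the access provider possesses.
usage_gb = [200] * 5 + [2] * 95   # hypothetical: 5 heavy users, 95 light ones

total = sum(usage_gb)
top_five = sum(sorted(usage_gb, reverse=True)[:5])
print(f"top 5% of users -> {top_five / total:.0%} of traffic")  # 84% here
```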
So that’s why I think McDowell is heading down the wrong track here, even if it seems reasonable to place “equal burdens” on application providers and network operators. What about Tate’s hope for commercial business deals and private sector efforts to resolve the issues? That’d be great, and when the internet universe was a lot smaller (like, say, 20 years ago), that’s how it worked. But ever since companies took these decisions out of the hands of the engineers and placed them in the hands of the business guys, you have people negotiating based on their economic position rather than on engineering merit. And the incentive of the network operators is to say “we own the customers you want to reach, why should we help you reach them when we can extort money from you instead?” It is, after all, the cable way. And Tate, after all, manages to miss this exact point in cable-land, where it is even more blatant.
Final takeaway from the first panel: Pretty much everyone except Comcast thought that outright lying to your customers is wrong. Heck, we don’t let used car salesmen tell customers something is true when it is false. We don’t allow supermarkets to change the expiration date on food products, even if that helps them “manage inventory.” There is not a single line of business where we protect false commercial speech and allow a business, when asked point blank by its customers, “do you do X?” to respond with “no, we never do X, and anyone who says we do X is spreading malicious rumors” when in fact the company does X. And despite Comcast’s best efforts to pretend that’s not what happened, it is pretty well documented that it is what happened. As the Virginia Supreme Court just reminded us, false or misleading commercial speech is not protected by the First Amendment. I suspect the FCC will also find that it is not a “reasonable network management” technique.
Panel II
The Second Panel was the technical panel, and it had many valuable takeaways. First, for all you whiny techno-libertarians who keep complaining that technical expertise is ignored — shut up. The idea that there is one true engineering way that only engineers understand and all right-thinking engineers agree with is just bull crap, as demonstrated by the FCC’s Panel II, in which a number of high-caliber technical people displayed a remarkable diversity of opinion.
For me, the most interesting testimony came from David Clark. Clark has opposed network neutrality regulation because he fears government imposing a one-size-fits-all approach on network management, which he believes would be fatal to innovation. At the same time, however, Clark was clearly deeply troubled by Comcast’s actions, which he likened to those of a hacker trying to sabotage the network. Worse, Clark recognizes that if all the major networks decide to stop playing together and go their own way, the internet will pretty much cease functioning in the manner we have come to expect. That doesn’t mean Clark likes net neutrality now. He still hates a uniform mandate, as far as I can judge. What he would really like is to roll this back to a time when these decisions were made by engineers based on their genuine perceptions of merit. But he also recognizes that ain’t gonna happen.
So for Clark, this appears to be an insoluble dilemma. Still, I think it was he who made the point that wherever one draws the line between acceptable and unacceptable network management, Comcast clearly crossed it. This attitude was shared by most of the other panelists, with the exception of Richard Bennett. Bennett’s position was consistent with the one taken on his blog: Comcast (and other providers) need to be able to experiment freely with network management techniques, and cannot be forced to disclose them without rendering them vulnerable.
Bennett also tried to argue that the tests showing Comcast acted badly were deficient because the folks conducting the tests (the AP and EFF) failed to include information on network congestion. That earned a strong rebuke from Clark, David Reed, and others, who pointed to the abysmal lack of data on internet traffic and how networks operate, and the utter unwillingness of companies to share that data even for research purposes. It is therefore absurd, observed Clark, to insist that those conducting the experiments provide data that was unobtainable and under the control of Comcast. (Bennett insisted that the relevant data on network congestion could have been obtained. Reed insisted, from his own efforts to replicate the experiments, that without the cooperation of Comcast it could not.)
Several FCC commissioners pursued this point, with an eye both to what data they should collect and to whether they had enough data to render a decision in this case. Clark gave a huge shout-out to kc claffy at CAIDA as one of the true research giants in this area, now driven to despair by the lack of available data.
Reed’s testimony provided a great deal of useful information, to me at least, on the technical aspects of this — including the nature of the testing and how it confirms Comcast’s bad conduct. Eric Klinker, for his part, talked of the “cat and mouse” game between networks and BitTorrent and the problems of trying to resolve this through negotiation. I regret that sufficient time has passed that I no longer remember in detail what Daniel Weitzner and Scott Smyers said.
The bottom line from this panel is that I think the Commissioners came away with plenty of support for the proposition that what Comcast did was wrong and bad for the internet overall, because Comcast acted unilaterally in a way that interfered with protocols everyone else relied upon. At the same time, they still don’t have a consensus around the idea of “reasonable network management.” Anyone familiar with common carriage regulation recognizes the difference between “reasonable” discrimination and “unreasonable” discrimination. The Commissioners want to prohibit “unreasonable” discrimination while permitting “reasonable” discrimination. But figuring out what the heck that means turns out to be more than an engineering problem.
Another takeaway is that the operators will instruct their engineers to comply with whatever rules the FCC sets down. The question is predicting the impact of those rules. There will clearly be some impact, but it is equally clear that the “internet big juju, no touch internet or packet gods will get angry” line of argument is simply not going to fly. Worse for those who wish engineers could resolve all this, decisions about what constitutes reasonable and unreasonable discrimination in the context of Title II common carriage were never strictly engineering decisions. These decisions always present complex questions of policy and economics and law, informed by engineering realities. To pretend that it is only an engineering question, or should be, is nonsense. ’Twas never thus, and never will be thus. Such claims reflect either nostalgia for a mythical earlier time or a deliberate effort to control the debate and force a particular outcome. Engineering matters, but so do other aspects of policy. Deal with it.
Final Takeaways and Guesses on the Timeline
Well, we have no clue what will happen with the text messaging and short codes case. As for the Comcast case, as I said above, I think the FCC will assert broad authority but ultimately resolve it on narrow grounds. (Comcast will, of course, appeal — preferably, from its perspective, to those deregulatory activists on the DC Circuit.) My personal guess, based on the FCC’s refusal to grant an extension for reply comments, coupled with the general desire to get major work done before the election shuts things down, is that we are looking at the June-August time frame. OTOH, if it is contentious enough, it may wait until the flurry of activity after the election. I suspect that Martin would like to handle this one himself rather than roll it over to his successor — especially if that successor turns out to be a Democrat.
But there are no guarantees in tea leaf reading, and a lot can happen between now and when the Commission renders a final decision. As we have seen over and over again, Comcast is not without its political allies, and is already back on its favorite tactic of trying to make this about Kevin Martin rather than about Comcast lying to its customers. It would not do to underestimate their tenacity and resources.
Of course, I can say the same thing about us in the public interest community as well.
Stay tuned . . . .
Harold,
I think your analysis is insightful, and consistent with what I saw and heard.
Let’s be clear, however, about David Reed’s point that to do what Comcast did, you have to go deep into packets. I believe it was he who said that packets could be encrypted without violating the protocol. So where would we be then, if BitTorrent users decide to start encrypting their packets? The “integrity of the protocol” is an important concept, your points notwithstanding, and once we open up that can of worms, who knows where we may wind up. A balkanized Internet is a real possibility.
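To make that concrete, here is a toy illustration (my own sketch, nothing presented at the hearing) of why encryption moots naive deep packet inspection. A plaintext BitTorrent connection opens with a fixed, recognizable handshake; once the stream is encrypted, that signature simply isn’t there to match.

```python
# Toy illustration of why encryption defeats naive deep packet inspection.
# Per the BitTorrent spec, a plaintext connection opens with the byte 19
# followed by "BitTorrent protocol". A naive classifier keys on that
# signature; an encrypted stream presents none of it.
import os

BT_SIGNATURE = b"\x13BitTorrent protocol"

def looks_like_bittorrent(payload: bytes) -> bool:
    """Naive DPI check: match the plaintext handshake prefix."""
    return payload.startswith(BT_SIGNATURE)

# 48 bytes follow the signature in a real handshake (reserved bytes,
# info hash, peer id); random bytes stand in for them here.
plaintext_handshake = BT_SIGNATURE + os.urandom(48)
encrypted_stream = os.urandom(68)   # stand-in for an encrypted connection

print(looks_like_bittorrent(plaintext_handshake))  # True  -> gets throttled
print(looks_like_bittorrent(encrypted_stream))     # False -> sails through
```

And then the operators escalate to guessing from traffic patterns, the clients randomize those patterns, and we are off to exactly the “cat and mouse” arms race Eric Klinker described.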
From my personal discussion with Martin, I get the sense that he is concerned with preserving the Internet as an engine of democracy. It has become vital to our discourse, and so there is a larger public interest at issue here than there is, say, in the matter of whether the NFL channel is available a la carte or in “basic cable” packages. When you step back and look at it that way, it seems to me, these issues assume a different shape, and the inherent conflict between Comcast (and other content companies) and me becomes apparent. They want to sell me content. I don’t want their damn content. I don’t even watch TV. I want to use the internet to communicate with people, largely through websites, including my own Wetmachine. Under the “Whitacre tiering” plan that many such companies favor, they want to restrict my use of the Internet, and relegate me and you, and our readers, and the entire class of people like us, to second- or third-class net citizenship in the name of bringing me a better NFL-watching experience. And I’m telling ya, I DON’T CARE ABOUT A BETTER VIDEO-OVER-NET EXPERIENCE, EVEN IF IT FEATURES NAKED ALYSON HANNIGAN! I’ll admit that naked Lisa De Leeuw videos present a more challenging case, but even there I come down on the side of net neutrality.
Finally, two nits:
1) Was Markey there? As noted, I missed most of the first panel, so I may have missed him. But he wasn’t listed on the program.
2) I edited your first sentence a little. I think that you, like Zoot in the Muppet Movie, “skipped a groove” in posting it.
WRT David Reed. Yes, I’d forgotten about that point. It is well taken.
As for the broader First Amendment issue: I assume you are saying this for the benefit of our readers, as you know I have preached that sermon many times. And yes, I agree with you that Martin gets this point. But I’m not sure it fits his frame of reference for regulation.
Excellent summary, as usual.
Correct me if I’m wrong, but there seems to be a much wider consensus on what behavior is unacceptable (Comcast, Madison River) than on what rules would work prospectively.
Just to play devil’s advocate: are there advantages to the current limbo? In other words, is there some benefit in always dangling the regulatory knife without actually enacting new (and necessarily unpredictable) regulation? Comcast and others may continue “behaving” because they fear a reaction.
All that said, I favor ex ante legislation/regulation (the AT&T merger language is fine by me). But still, despite my general disagreements with Yoo (and, frankly, with his NCTA subsidies), I think the uncertainty argument has always been the strongest one. We honestly don’t know how these rules will interact with the network. Accordingly, maybe limbo is not so bad – particularly if coupled with more politically acceptable policy statements.
On the other hand, there is (I hope) a brief window approaching of a Democratic President with strong majorities. Now would be a good time to institutionalize these protections. Republicans are like the Terminator – they, um, return.
Publius, thanks.
I agree that the strongest argument for doing this by adjudication is that “it is so fluid and we are still so uncertain that we shouldn’t create a solid rule.” Indeed, in my FTC testimony last year and elsewhere, I have readily conceded how much we don’t know, and that which way you come out here depends largely on whom you fear more (corporations or government) given that uncertainty.
But I think the arguments for ex ante regulation are stronger than those for post hoc enforcement. For one thing, the uncertainty cuts both ways. Industry will continue to push the envelope until the FCC tells them they have crossed the line, at which point they will complain (as Comcast is complaining) that the FCC is being unfair by “surprising” them in such a fashion. Further, post hoc rules depend on discovering the conduct far more than ex ante rules do, because the company can always claim with some legitimacy that it did not know the conduct it engaged in was prohibited. Please note that even here, where the system “worked,” it relied on the Associated Press confirming research after the initial concerns were publicized. We’ve had reason to believe the conduct was going on since August, certainty since October, and it is now March. The FCC may not decide until April or May. That’s a long time to wait for an ex post solution.
On the factual dispute over data on network congestion, here’s where Reed misses the point: the tool that captures TCP RST packets, Wireshark, also captures all the rest of the downstream packets on the Comcast network. This data is sufficient to show how heavily loaded the network segment was at the time of the BitTorrent throttling.
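For concreteness, here is the sort of first-pass analysis I mean, sketched in Python with scapy against a hypothetical capture file (a rough outline, not the AP’s or EFF’s actual tooling): count the resets and tally bytes per second on the captured segment.

```python
# Rough sketch: from a packet capture of the local segment, count TCP
# resets and tally bytes per second as a crude load signal. Assumes a
# hypothetical file "capture.pcap"; requires scapy (pip install scapy).
from collections import Counter
from scapy.all import rdpcap, TCP

packets = rdpcap("capture.pcap")

rst_count = 0
bytes_per_second = Counter()
for pkt in packets:
    if TCP in pkt:
        if pkt[TCP].flags & 0x04:           # RST bit set
            rst_count += 1
        bytes_per_second[int(pkt.time)] += len(pkt)

print(f"TCP RSTs observed: {rst_count}")
print(f"peak segment load: {max(bytes_per_second.values(), default=0)} bytes/sec")
```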
Any network engineer knows this, and for Reed and Clark to assert otherwise shows they haven’t actually studied the Comcast network at even the most superficial level.
Other than that, your summary is pretty fair and balanced.
I need to clarify my comment. It is a fact that Wireshark doesn’t show any congestion that may occur anywhere beyond the immediate segment without a great deal of interpretation. But Comcast is mainly concerned about first-hop congestion, so the data applies in this case. Reed is one of those guys who thinks the song is about the Internet per se, and therefore misses the problems that arise in managing the first mile.
I appreciate your comment and clarification on the technical issue.