There are those still clinging to the desperate hope that somehow critics of Comcast’s “network management” policy of “delaying” BitTorrent packets “only during peak congestion periods” will be discredited. These folks have therefore wasted much time and energy calling those of us who filed the Comcast complaint all manner of nasty names, made sneering and condescending comments about Robert Topolski’s qualifications and the accuracy of his tests, and generally behaved like total obnoxious gits. So you will forgive me if I once again channel my “inner Cartman” and provide these folks with some bad news.
The Max Planck Institute for Software Systems (MPI) has just released a major study showing that Comcast and Cox Cable engage in major blocking of BitTorrent traffic regardless of network congestion levels, that Robert Topolski and Jon Peha are right, and that George Ou needs to shut the [bleep] up with his pathetic whining. Oh yes, and Ou also needs to get over his belief, expressed at the Stanford FCC hearing, that the reset packets could be coming from some mysterious source other than Comcast. Unless George is going to express a belief in the “reset packet faerie,” who sprinkles forged reset packets over good little networks to keep them safe from bandwidth hogs (which would explain why this constant “leakage” only happens to Comcast and Cox), it’s time to face the reality that Comcast (and apparently Cox as well) really are using forged reset packets, deliberately, and all the time, just like we said they were.
Knowing, however, that folks like Ou (and paid flacks such as my friend and sparring partner Scott Cleland) are as incapable of admitting error as a certain Decider-In-Chief, I eagerly await the wacky weasel words that will inevitably follow. Will it be hand-waving technobabble? Ad hominem attacks, cheap rhetorical tricks, and endless hair-splitting about definitions or ‘what I actually said was blah blah blah’? An effort to brush past this by proclaiming “this was never really about whether there were WMBs (weapons of mass BitTorrent blockage), this was about freeing the good customers of Comcast from the oppression of Al Qaeda bandwidth hogs that use 90% of the capacity?” Another “expert study” that tries to cast doubt on the Max Planck Institute study? Or perhaps some delightful combination of all of these? The heat will be on!
A bit more analysis and a lot more snarkiness below . . . .
The rather prestigious, internationally renowned, and extremely well-qualified Max Planck Institute for Software Systems has just released a devastating study of BitTorrent blocking via reset packets (the method identified by Topolski as the one used by Comcast). The Institute has also released an open source tool so that individuals can run the test for themselves, replicate the results, and evaluate the study and methodology overall.
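For the technically curious, one heuristic a reset-packet detector might use can be sketched in a few lines. This is a hedged illustration of the general technique, not the Institute's actual code or packet format: forged RSTs injected by equipment in the middle of the path often arrive with an IP TTL that differs from the TTL seen on the rest of the flow, because they originate a different number of hops away than the real peer.

```python
# Hedged sketch of a TTL-based forged-RST heuristic -- an assumption about
# the approach, NOT the MPI tool's implementation. Packet records here are
# hypothetical dicts, not a real capture format.
def flag_suspect_rsts(packets, tolerance=2):
    """packets: list of {'flow': str, 'ttl': int, 'rst': bool} in arrival order.
    Returns the RST packets whose TTL deviates from the flow's baseline TTL."""
    baseline = {}   # flow id -> TTL of the first non-RST packet seen
    suspects = []
    for p in packets:
        if not p['rst']:
            # Establish the expected hop distance from the genuine peer.
            baseline.setdefault(p['flow'], p['ttl'])
        elif p['flow'] in baseline and abs(p['ttl'] - baseline[p['flow']]) > tolerance:
            # An RST arriving from a noticeably different hop distance.
            suspects.append(p)
    return suspects
```

A real tool would combine several such signals (sequence-number anomalies, both endpoints denying they sent the reset), since TTLs alone can vary for legitimate reasons.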
The conclusion? BitTorrent blocking via reset packets is only practiced on a measurable scale by three ISPs in the entire world! Those three ISPs are (drum roll puhlee-aze…): Comcast, Cox, and Singapore’s StarHub.
The study notes that no DSL providers engage in this practice, probably because their networks are not as crappy as cable networks. No doubt Verizon and AT&T are quietly gloating and sending emails to Vuze demanding an apology.
The study also found that blocking rates remained high and reasonably constant regardless of the time of day. This certainly refutes the oft-repeated claim by Comcast that they only “occasionally delay” BitTorrent uploads during periods of “peak congestion.” It is simply the consistent policy of these three cable broadband providers (and no one else) to block/degrade (excuse me, “delay”) BitTorrent traffic in this fashion. (To be fair, I suppose Comcast and Cox could have such crappy broadband networks that they are always suffering “peak congestion,” but that is not what we usually mean by “peak.”)
Let me stress that this does not affect the debate about policy, i.e., whether it is a good idea to allow ISPs to do this (I think no), or whether the Federal government (either through the FCC or in some other way) should make a rule about this (I think yes). Folks have advanced numerous policy reasons why trying to prevent behavior like this is a bad idea that does more harm than good, and these policy arguments stand or fall on their merits. But the MPI study should definitively end the debate about whether blocking BitTorrent traffic through the use of forged reset packets is happening (it is), whether this practice is widely accepted in the Internet community as a “reasonable network management” technique (it’s not — score one for Jon Peha), and whether the practice is really limited to times of peak congestion (it’s not).
Of course, when I say “should,” I mean: “should in a rational world where people care about evidence and stuff and do not reject studies by highly reputable institutions with no obvious ax to grind without some kind of evidence to back up their objections.” As we all know, we do not live in such a world. And in DC policy-land, where people are paid big bucks to be obstinate pricks impervious to reason, the notion that some silly real world study by neutral experts should even be seriously considered as “evidence” is frequently greeted with either condescending chuckles or the sort of explosive outrage usually displayed by B Movie villains advised by their henchman that the hero has once again miraculously survived the overly complex death trap.
So I expect the good folks at Comcast, their paid flacks, and their pack ‘o useful idiots to start cranking up the old noise machine. We can anticipate the following lines of attack:
1) Ignore the inconsistencies with previous statements, fall back on broad generalities, and try to blow by the whole thing. E.g., “What really matters here is that Comcast is doing its best to protect its customers from bandwidth hogs, and is working with other companies to find ways to minimize any interference with the legitimate uses of the network.”
2) Ad hominem attacks and attempts to undermine credibility. Happily, there is no obligation for these attacks to be credible, or even coherent. The point is simply to generate enough noise so that people unfamiliar with the issue will decide there must be something wrong with the other side, and to hopefully goad the opposition into wasting time in a mud-wrestling match they should ignore. E.g., “Max Planck was a physicist. It is obvious that an institute named for a physicist could not possibly have the computer science expertise to do these experiments. Furthermore, Germany has every incentive to try to mess up our capitalist system — don’t think they’ve forgotten WWII! — and therefore the data of this so-called ‘institute’ should be regarded as highly suspect and a socialist plot. Especially because anti-business groups that give support to Al Qaeda, like Free Press and MoveOn, obviously were involved somehow in a way I can’t prove but will continue to assert until you believe me.”
3) Blame it all on Kevin Martin and his “vendetta” against cable. Perhaps some of Comcast’s pet Republicans in Congress will write another letter! Given that a lot of them are likely to be looking for work next year, showing Comcast (and the rest of the cable industry) that you stay bought no matter what the evidence could be a real career booster.
4) Expert “studies,” hair-splitting and other technobabble. A tried and true method that relies on the public’s short attention span, general inability to understand technology, and inability to actually assess the validity of most studies. This is, after all, why industry folks invest so much money in maintaining coin-operated think tanks. Churn out a study or two that looks impressive and has an executive summary making ludicrous claims, get into arguments about what specific statements ‘really’ meant and try to bog things down in endless debates over minutiae, and have an expert or two babble something reassuring that makes no sense, and the worries of willing believers will be soothed and members of the public approaching this for the first time will feel their eyes glazing over. (Remember, if you are a megacorp, you do not need to convince the public you are right, you only have to convince them to go away.)
In any event, I look forward to the upcoming fun and games. It is always a pleasure to watch skilled professionals at work, and Comcast and the cable industry hire some of the smartest and most talented folks I know.
Stay tuned . . . .
Harold, the ad hominem attacks you’re launching above are beneath you. Comcast, Cox, or any ISP that blocks BitTorrent is 100% justified in doing so, so long as a prohibition against bandwidth hogging, operation of servers, or P2P is disclosed in its Terms of Service.
I love it when Harold gets on his high-horse and does all the whining and name-calling about other people’s alleged whining and name-calling. It’s Freudian projection on a scale that’s impossible to ignore.
The complaint about “hand-waving technobabble” is especially devastating. God forbid we would try to base government regulatory policy on a factual foundation when there are so many emotional buttons to press.
At the risk of descending into the technical, let me suggest that responsible critics should do a study of the BitTorrent throughput over the networks that use the RST technique vs. those that don’t, with some token effort to measure network congestion on some of the paths in question, using the technical thing called “ping tests” rather than assumptions about traffic and time of day. P2P typically runs unattended, so it doesn’t really care what time it is.
I think these tests will show that pruning of excess TCP connections with RST packets has the effect of improving P2P download speeds on the affected networks. The reasons for this are “technical” so I won’t try to explain them here.
Just one reasonable test, that’s all I ask. The Germans haven’t done it, and Max Planck was not actually a programmer.
Brett, isn’t that a policy argument which, as I observed, is still very much a live question to be debated on its merits?
And Richard, at some point is it not incumbent on the critics of these tests to actually describe what would be a satisfactory test? If you have a study design, why not put it out there?
It didn’t take long for your predictions to come true, did it?
1. There is a world of difference between “we sell you up to 6 Megabits download/s” and “we sell a theoretical maximum that you can’t reach, and if you try we will quietly disable it”.
2. There is a world of difference between “we didn’t do it” and “we do it because it is a good idea and an overall benefit”.
3. If some customers are bandwidth hogs, at least tell them. Terminate their service, even – but tell them why.
4. I offer the following observation: the management and staff are neither stupid, nor are they inexpert in their own networks. It stands to reason that, whatever their goals, their network management policies advance them. And that they would not just “try it: what the hell” with a major portion of their business model. They MUST have done internal studies and made measurements and observations of these changes, and weighted them against their goals, revenue and desires.
In re that last point: show us the money. Show us what you did and said at the time you planned the network management changes that you denied-yet-do. Not some glossy retrospective.
A business does these things with planning. What was the goal, what was the observed effect?
Fair enough, Harold, I need to describe the test I’d like to see. Before I get to that, it’s interesting that the Germans are making wildly different claims about TCP pruning than Vuze has made, based on the somewhat larger sample that Vuze has tested. Vuze found RSTs all over the place, but the Germans only found them in a dozen or so networks. I’d like to see some explanation for the discrepancy.
The main question on the table is whether the traffic shaping of P2P is dependent or independent of load. The side questions deal with the extent of the traffic shaping, and the effect of traffic shaping on other applications and users of the Internet. Bear in mind that the legitimate reason for traffic shaping on the Internet is to bring about per-user fairness, a feature that is sadly lacking from the Internet’s official design and therefore must be implemented by the ISPs.
The Vuze plug-in is a better tool, because it doesn’t just take a brief snapshot but collects data continuously. It’s certainly possible that there is enough unattended P2P running on a cable segment at 3:00 AM to trigger shaping; dedicated seeding is a 24/7 task, after all, and the vast majority of upstream traffic on cable Internet is P2P.
So run for at least 24 hours, and provide me with RST rates. Plot these over the 24 hour period, and calibrate peaks and valleys with ping times.
Then show me how much actual P2P throughput the consumer can expect on the networks in question, at the peaks and valleys.
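The test proposed above boils down to one question: does the RST rate rise and fall with congestion, or is it flat around the clock? That analysis can be sketched in a few lines of stdlib Python. The hourly numbers below are made up purely for illustration, not measurements from any network.

```python
# Hedged sketch of a congestion-correlation check (illustrative data only).
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical 24 hourly buckets: ping RTT peaks in the evening hours,
# while the observed RST rate barely moves.
rtt_ms   = [20] * 8 + [25] * 8 + [60] * 8     # congestion peaks hours 16-23
rst_rate = [0.30, 0.31] * 12                  # RSTs per connection, roughly flat

r = pearson(rst_rate, rtt_ms)
```

If blocking were congestion-driven, `r` should come out near 1; a near-zero `r`, as in this synthetic example, would suggest the shaping is applied regardless of load.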
The ironic thing about this RST method of TCP pruning is that it benefits those who aren’t dedicated P2P seeds, which is to say, most people. And it even benefits P2P downloaders to reduce the B/W allocated to dedicated seeders.
So the statistics we want to see are throughput numbers, not RSTs or any other TCP statistics. P2P uses numerous TCP connections, and its performance is largely insulated from the loss of a small number of them.
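That insulation claim can be put in rough numbers with a toy model. All figures here are illustrative assumptions, not measured data, and the model ignores TCP dynamics entirely; it only captures the bookkeeping of losing and replacing connections.

```python
# Toy model (illustrative assumptions, not measurements): a transfer over
# n_conns parallel TCP connections loses roughly n_reset/n_conns of its
# aggregate rate when n_reset connections are reset, and only until the
# client opens replacements after reconnect_s seconds.
def avg_throughput_kbps(per_conn_kbps, n_conns, n_reset, reconnect_s, window_s):
    """Average aggregate rate over a window_s-second window when n_reset
    connections drop at t=0 and are replaced after reconnect_s seconds."""
    full = per_conn_kbps * n_conns
    degraded = per_conn_kbps * (n_conns - n_reset)
    t_degraded = min(reconnect_s, window_s)
    return (degraded * t_degraded + full * (window_s - t_degraded)) / window_s
```

On these assumptions, a downloader losing 2 of 20 connections for 30 seconds out of a 5-minute window loses about 1% of throughput, while a dedicated seeder whose every connection is reset gets nothing until peers reconnect; that asymmetry is the crux of the disagreement.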
At the end of the day, the issue that P2P poses for the Internet is a classic example of innovation’s effect on the incumbent infrastructure: BitTorrent and Gnutella are not IETF-defined RFC protocols, they’re “innovative” new examples of Internet generativity, to coin a buzzword. So doesn’t it stand to reason they would need to be managed by non-standard means?
Seems reasonable to me.
Incidentally, this is the second set of charges regarding Cox; the first charge said Cox blocked Craigslist, and that had to be withdrawn when the facts came out. I think Cox Block 2.0 will end the same way.
Richard:
This appears to me to be a reasonable test. Ideally, it should involve many testers unknown to each other, in numerous markets, over multiple 24-hour periods. I will recommend to people that we try to organize such a test. I do not think this invalidates other studies, and since user experience is a rather important part of the issue, studies that demonstrate the impact of p2p “management” are equally valuable, but this would (assuming I understand the technology correctly) go to the question of whether this is a practice associated with times of peak congestion or not.
As for the other points:
1) Your argument that network operators need freedom to manage their networks in the face of a dynamic internet environment is, I think, the best argument against network neutrality. I believe the concern is outweighed by the harms that result from allowing network operators unfettered freedom to do as they will. This is one of these complicated policy judgments where reasonable people are going to disagree. Brett sees his business dying if he can’t manage his network without government regulation. I see too many other things, including my ability to blog at will without worry that Comcast will block the posts where I say nasty things about them, dying if we don’t have regulation.
2) There have been stories about Cox possibly blocking p2p for a few months now. A key difference between Cox and Comcast, however, is that Cox always reserved the right to block traffic (as opposed to Comcast, which only changed their ToS to explicitly reflect this after they got caught). So I’m both inclined to believe that they are using reset packets in this case and less inclined to be pissed at them for lying. I think this option should not be permitted to them, but that is different from actually lying to their customers about it.
The nice thing about the Vuze approach to network measurement is that their plugin is active whenever their version of BitTorrent is running; it simply reports its measurements to a data collection server, so it’s a close-to-real-world measurement tool. I’ve been trying to test my Comcast link with the German tool for 24 hours, and I always get a “server busy” message and no testing.
I understand that the Sandvine equipment is very widely deployed by cable MSOs, so I’m very skeptical of any claims that only Comcast and Cox have it turned on. Sandvine sold a lot of that gear before the Topolski situation developed.
I agree that ISPs shouldn’t have an unfettered right to manipulate traffic any damn way they want, and I don’t buy the “private property” argument. Anybody selling a service to the public has an obligation to fully disclose terms of service and not to engage in unreasonable discrimination.
But as you say, defining what’s “reasonable” is a non-trivial exercise.
Harold writes:
“Your argument that network operators need freedom to manage their networks in the face of a dynamic internet environment is, I think, the best argument against network neutrality.”
Who says that this is “non-neutral?” As I have said many times before, throttling P2P actually KEEPS the network neutral by preventing hogging and by preventing exploitation of flaws in the protocols.
You also write:
“Brett sees his business dying if he can’t manage his network without government regulation. I see too many other things, including my ability to blog at will without worry that Comcast will block the posts where I say nasty things about them, dying if we don’t have regulation.”
Harold, the specific measures advocated by the “Save the Internet” alarmists would indeed put independent ISPs out of business, because they MANDATE that ISPs allow abuse of the network and PROHIBIT consumer-friendly terms and pricing. In the case of the Conyers bill, they also mandate that we give users very expensive services that would either make us unprofitable or multiply our costs.
As for your fears that your blog might be censored: we’ve never seen that sort of censorship by an Internet provider, or even anything close. Comcast isn’t censoring your blog or anyone else’s, despite your “kind” words to them above. Nor are they censoring the even nastier missives on sites such as Free Press and “Save the Internet” (which is really just Free Press under another name — possibly a way of hiding the fact that they are a 501(c)(3) and should not be lobbying).
It occurs to me that when they lack other work, there is a motivation among lobbyists to gin up bogeymen and then solicit money to try to kill them. I think that this is what is going on in the “net neutrality” arena. The lawyers and lobbyists who argue against media concentration, in particular, have lost ALL of their recent battles — and so, rather than admit that they’ve failed or finding other productive work, they are looking for new ones. The cost of living in DC is high, so they are fabricating issues related to the Internet so as to attempt to bring in new money and save their cushy, inside-the-Beltway jobs.
Net Neutrality: the Telecom Regulators’ Full Employment Law.
But seriously, I think Harold is sincere about this issue, even if some others aren’t.