If I had a dime for every article I have seen since AOL went to flat rate back in 1996 that foretold the coming end of flat rate internet access plans and the inevitability of metered pricing, I’d have so much money I could actually afford what wireline providers dream of charging as a monthly fee. Despite the “inevitability” of metered pricing for nearly 15 years, it hasn’t happened, and I don’t expect it any time soon. Why? Because not only is it wildly unpopular with customers (it is one of the few things powerful enough to overcome the switching cost for anyone with a choice), but the economics of it do not make a heck of a lot of sense. Heck, Comcast (the largest residential broadband provider) announced in its 4Q 09 earnings call that it is reducing its capital expenditure on network capacity for 2010 because it has nearly completed the necessary upgrades for DOCSIS 3.0, which gives it all the capacity it needs for the foreseeable future. “We don’t need to invest any more in our network because we have all the capacity we need” is a mite inconsistent with “we need to switch to metered pricing so we can afford to expand our network capacity and create incentives against ‘bandwidth hogs’ and other mythical beasts.”
I can forgive wireline providers for indulging in metered pricing fantasies, while admiring them for perpetuating the useful myth of limited capacity to ward off regulation. But then comes this article on the purported inevitability of metering wireless plans. This strikes me as “Keep The Government Out of My Medicare” lunacy.
As the article itself concedes without saying so directly, wireless broadband plans are already metered. Blow past your monthly usage cap and you will pay per-minute charges. For those not old enough to remember, this was the old AOL metered pricing model. You got ten hours for free, then got charged on a per-minute basis. AOL abandoned it because customers hated it and moved to flat rate price plans. So what wireless providers apparently mean by “metered” is “find a way to reduce the usage cap further by calling it something else.” I expect this will not catch on any better than the efforts to change pricing structure on the wireline side, and for the same reason: the economics don’t make sense.
Which brings us to the next lesson on network economics. The cost structure of building and maintaining the network is marked by high fixed cost and low marginal cost. That is to say, the vast majority of cost comes from building the network itself, regardless of how many customers use it. Once the network is built, the actual marginal cost of each customer is fairly low. Even an intense user does not “consume” very much of the network’s resources (the supposed “bandwidth hog” is a problem only because network capacity is ridiculously oversold). The argument that the majority of subscribers subsidize the few “bandwidth hogs” is simply rubbish. The real question is how obscenely high a rate of return the network operator can squeeze out of each customer.
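The arithmetic behind that point is easy to sketch. The numbers below are entirely hypothetical (the fixed cost, subscriber count, and per-gigabyte marginal cost are assumptions for illustration, not real carrier data), but they show the shape of the argument: the fixed-cost share dominates, so even a very heavy user adds only a small increment over a light one.

```python
# Illustrative sketch of high-fixed-cost / low-marginal-cost network economics.
# All figures are made-up assumptions, chosen only to show the structure.

def monthly_cost_per_subscriber(fixed_cost, subscribers, gb_used, cost_per_gb):
    """Average monthly cost attributable to one subscriber:
    an equal share of the fixed network cost plus that user's traffic cost."""
    return fixed_cost / subscribers + gb_used * cost_per_gb

FIXED = 50_000_000    # assumed monthly amortized build + O&M cost ($)
SUBS = 5_000_000      # assumed subscriber count
COST_PER_GB = 0.01    # assumed marginal cost per gigabyte carried ($)

light = monthly_cost_per_subscriber(FIXED, SUBS, gb_used=20, cost_per_gb=COST_PER_GB)
heavy = monthly_cost_per_subscriber(FIXED, SUBS, gb_used=500, cost_per_gb=COST_PER_GB)

print(f"light user: ${light:.2f}/mo")  # $10.00 fixed share + $0.20 traffic
print(f"heavy user: ${heavy:.2f}/mo")  # $10.00 fixed share + $5.00 traffic
```

Under these assumed numbers, a user consuming 25 times the traffic costs the operator less than 50% more, which is why the “subsidy” framing is hard to sustain.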
Back in the old days, we used to require providers to prove cost. Sure, we had metered pricing, but that was so that the very profitable areas could subsidize the high cost areas. Nowadays, we rely on “the market” to regulate cost, with the result that profit per customer for the major providers continues to rise. I’m cynical enough to wonder if that’s why we see this endless parade of speeches by network operators and articles by their sycophants about the “inevitability” of metered pricing — so that when we are outrageously ripped off, we will thank our lucky stars that it is at the “bargain” of overpriced flat rates.
Stay tuned . . .
“We rely upon the market to regulate costs” – precisely so. And the oligopoly does indeed regulate costs; they do so, however, in their own interest, not the public interest. Twas ever thus.
So what are the barriers to entry which permit Comcast to go on raising prices without inspiring actual competitors to go out and slice themselves a piece of this – and how can they be lowered?
Always worth repeating:
From 2007: “…nothing could be more wasteful and self-destructive than to compel would-be new entrants to build additional redundant facilities platforms — each of which (that survive) will ultimately be capable of carrying many orders of magnitude more bandwidth than any individual or household could ever require, even into the distant (e.g., fully immersive HD VR) future. The colossal, Carl Sagan scale waste that will go into such redundant platforms will be the price we all pay for our collective failure to make sensible use of the political tools that democracy has provided us, to establish and police a sane competition policy.”
http://www.pbs.org/cringely…
And more recently:
“The myth of ‘facilities based competition’ cannot stand up to the observed fact that accelerating, technology-driven productivity gains in network capacity production (i.e., DWDM, et al.) have long outpaced the growth of observable and foreseeable human demand for network services, and continue to outpace those demands by ever-increasing orders of magnitude over time. My great fear is that by elevating the primacy of investment (even targeted investment in network capacity) as the summum bonum, we’ll be committing ourselves to a self-negating principle that is ultimately likely to yield a rich variety of discriminatory mechanisms and practices, but increasingly less and less productive investment.”
https://www.blogger.com/com…
AOL was a narrowband service in which the only significant difference between one user and another was connect time.
In the broadband world there are huge differences between users and applications with respect to data volumes. This is important because ISPs pay their transit providers on the basis of volume, so the costs of running a network are volume sensitive. When you work it all the way down to the equipment, the network needs to be provisioned for peak load, and that’s driven by volume.
Internet protocols reduce apparent demand down to capacity, so they make it difficult to see the actual demand except by reference to how long periods of peak load last.
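The peak-provisioning argument above can be sketched with a toy example. The traffic profile and engineering margin below are invented for illustration; the point is simply that capacity has to be sized for the busiest hour plus headroom, not for the average, so the peak is what drives cost.

```python
# Minimal sketch of peak-load provisioning with a made-up 24-hour
# traffic profile (Gbps per hour). The link must be sized for the
# busiest hour, not the daily average.

hourly_demand_gbps = [2, 1, 1, 1, 2, 3, 5, 8, 9, 9, 8, 7,
                      7, 8, 9, 10, 12, 15, 18, 20, 19, 14, 8, 4]

peak = max(hourly_demand_gbps)
average = sum(hourly_demand_gbps) / len(hourly_demand_gbps)

HEADROOM = 1.25  # assumed engineering margin above the observed peak
provisioned = peak * HEADROOM

print(f"average demand: {average:.1f} Gbps")  # ~8.3 Gbps
print(f"peak demand:    {peak} Gbps")         # 20 Gbps
print(f"provisioned:    {provisioned:.1f} Gbps")
```

In this invented profile the network must carry roughly three times its average load at the evening peak, which is the sense in which volume (concentrated in peak hours) drives provisioning cost.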
Advances like DWDM are nice, but they don’t do much for people who connect to their ISP over a copper wire, which is just about everybody. Where does the money come from to extend fiber into every home, school, government office, non-profit, and business? The answer to that question is private investment.
The bottom line is that there is never a time when the investment in networks is finished and the owner can sit back and clip coupons: they need to be operated and maintained, and it costs significantly more to operate a packet-switched network than a circuit-switched network. We’re still doing too much broadband over wires that were engineered for narrowband communication.
Wireless is a big game-changer as well. It’s certainly the case that new generations of mobile tech are created on a regular basis and then diffuse across the networks. With mobile moving away from the GSM vs. CDMA tech divide toward a uniform standard in LTE, we can begin to visualize scenarios in which mobile devices will obtain various services via multiple networks, perhaps without the user even knowing it. This opens up some exciting vistas for competition and the more effective use of multiple, pervasive broadband networks.
Metered pricing is a crude way of trying to keep costs in line with prices; we can certainly do a lot better, but it does address a real dynamic that didn’t exist in the past.
Hi Richard,
Your assumptions about the *relative* range of variance between light and heavy users in the narrowband vs. broadband days are highly suspect, to put it nicely. If you can provide evidence that the gap is bigger and *more expensive* today, given the (conservative est.) single-digit $$/Mbps cost of traffic exchange that major Internet access providers are facing (vs. 3-4 digit $$/Mbps in the narrowband days), then perhaps it merits further consideration. Also, I’m sure you know that all large and growing Internet access providers (narrowband or broadband) face rolling expansion and O&M costs (?). However, if you still wish to claim that operation and maintenance costs today are really that much bigger, e.g., as a share of gross revenues collected for access services that are now priced appx. 3x-5x higher than they were in the narrowband days, then perhaps it would also be a good idea to reveal exactly what kind of accounting miracle has prevented broadband access providers from going bankrupt years ago?
Alternately, if we can set aside the foregoing O&M claims as basically immaterial, it sounds like you might agree with my earlier blogger.com remark to the effect that (as you describe it) “coming up with the money to extend the fiber into every home, school, government office, non-profit, and business” is basically a *one-off problem* — sort of like the one-time cost of building out the original copper PSTN facilities platform. I’m assuming here, given the fact that some incumbent facilities owners are publicly committed to destroying those legacy plant elements wherever circumstances permit (e.g., with each new FIOS activation), that the legacy plant is all paid for now? I wonder how long it’s been all paid off…
True, wireless is a big game-changer today, but only for those who weren’t playing any game at all in earlier years. There’s no denying that for those living in very rugged and/or very sparsely populated areas, and for late-developing Internet markets overseas, wireless is a godsend. For everyone else, it’s a very nice, mobility-enabling complement to the “primary” means of fixed access that they enjoy at home and/or at work — AND in most cases it’s no more independent of local terrestrial access facilities than is that “primary” terrestrial cable, coax, or fiber-based access.
Perhaps one day wireless may yet be a profound game changer for everyone. People have been claiming that that day has arrived for about two decades now — who knows, maybe one day it’ll actually be true — but not this day. In the meantime, metered Internet access pricing has even less credible justification today than it had in 1996, when it was categorically rejected first in the US market, and subsequently in every other advanced industrial economy in the Americas, Europe, and Asia.
think of metered pricing in terms of arguments about “death panels” and “moral hazard” in health care
on grounds of moral hazard, the ISPs effectively assert that bandwidth use is “underpriced” in a similar way health care is, through underpriced co-pays and deductibles, for which the remedy is to raise price as an appropriate price signal to reflect cost in order to suppress overconsumption (and reduce congestion)
consumers of flat-rate pricing plans see this like a “death-panel” solution, in the sense that capping and pricing Megabyte volume use with arbitrary limits amounts to rationing, because they’ve already paid for the bandwidth (Medicare), and now are forced to use it less in order to reduce its cost, which is particularly disturbing when part of the objective is to shift the freed-up use to others (more low users of Megabytes/those with no health insurance)
any way it’s parsed, one thing is certain economically: as long as unit price substantially exceeds the off-peak cost of an additional Megabyte, which is near zero, it’s not an efficient outcome because it suppresses consumption in the wrong direction
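the efficiency point can be made concrete with a toy calculation (all prices below are hypothetical, picked only to show the mechanism): any off-peak use a subscriber values above the near-zero marginal cost but below the metered price gets suppressed, which is a pure loss to everyone

```python
# Hedged illustration with made-up prices: metered pricing far above
# marginal cost suppresses consumption that would be worth carrying.

MARGINAL_COST_PER_GB = 0.01  # assumed off-peak cost of one more GB ($)
METERED_PRICE_PER_GB = 2.00  # assumed metered/overage price per GB ($)

# A subscriber who values an extra off-peak GB at $0.50 would use it
# under cost-based pricing (value > cost) but forgo it under metered
# pricing (value < price), even though carrying it costs almost nothing.
user_value = 0.50
consumes_at_cost = user_value > MARGINAL_COST_PER_GB
consumes_at_price = user_value > METERED_PRICE_PER_GB

print(consumes_at_cost, consumes_at_price)  # True False
```

every unit of value between $0.01 and $2.00 that goes unconsumed this way is the “wrong direction” the comment describes: usage falls where it is cheapest to serve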