Could the FCC Structure A Broadcaster Clearance Auction Without Congress? Yeah, actually . . .

The Progress and Freedom Foundation has recently published this piece by Adam Thierer and Barbara Esbin on how encouraging a deal between broadcasters and wireless providers to reduce the spectrum used by broadcasters and auction more spectrum for wireless use would serve the public interest. The piece raises some good points. For one thing, it is happily free of the “broadcasters are obsolete and we ought to take their spectrum back” rhetoric that often accompanies these proposals (not from PFF, I should add, but from a number of others). But the paper is woefully short on specifics. It touts the value of such a deal (freeing up spectrum for wireless) and lays out some general approaches, then urges the FCC and Congress to broker a deal between the broadcasters and the wireless industry through a number of possible auction mechanisms.

And now, the FCC has issued a public notice in the National Broadband Plan proceeding soliciting input on how it should think about using broadcast spectrum as part of the national broadband plan.

This got me thinking. Is there a mechanism the FCC could use, consistent with existing law, that would allow for the sort of broadcast band clearance the FCC would like to see? And, as a bonus, could this also clear some space for white space use? After some consideration, I hatched the scheme below. It is somewhat slower than the wireless industry would like. I expect it would take about 5 years to finish the transition. But that is not bad given that it took 4 years to manage the DTV transition and auction from the time Congress set the hard date in 2005 to the end of analog broadcasting in June 2009. Also, my plan would allow continuing gradual build out, and combines some sticks to go with the carrots.

I’ll add that I’m not convinced this is worth doing. I think the current obsession with broadcast spectrum as the solution for the upcoming spectrum crisis suffers the same myopia as focusing on offshore drilling to cure the energy crisis — it defers the crunch but doesn’t solve the underlying problem. Wireless demand is going to continue, and we need to fundamentally change how we manage spectrum access (rather than spectrum allocation) to remain on a sustainable path for growth. I also point out, as we discovered while doing the broadcast white spaces proceeding, that there are a lot of non-broadcast services operating in the existing television bands. These secondary services are going to get awfully squeezed if we crunch the broadcast bands further.

All that said, a well constructed auction could free up a nice chunk of spectrum in the short term that could promote wireless services and competition — especially if it came with a spectrum cap so VZ and AT&T didn’t hog all the good stuff again.

More below . . . . .

Continue reading

An Open Letter To Blair Levin On The Subject of National Broadband Public Notices

Dear Blair:

I surrender! I admit defeat. I cry “uncle.” You win. Despite my earlier doubts, I am now prepared to say the National Broadband Plan process is the most open, transparent, comprehensive, bestest and wonderfullest proceeding ever in the entire history of the FCC since passage of the Communications Act of 1934! Just please, please PLEASE no more public notices. [break off into uncontrolled sobbing]

Continue reading

Will Comcast/NBC Need FCC Approval? And How Would That Play Out?

The industry news is abuzz with the upcoming Comcast/NBC Universal deal. According to recent reports, Comcast would buy 51% of NBC Universal (assuming Vivendi, which owns 20% at the moment, agrees to the terms). But beyond this general framework, it’s unclear whether all the assets held by NBC Universal would be included in the deal. Whether or not the FCC has jurisdiction hinges on this question.

The FCC does not have general jurisdiction over deals pertaining to content. NBC Universal owns lots of radio and television stations. Transfer of the licenses to the new Comcast-controlled entity would require FCC approval. But if the deal does not include the licenses, the FCC would probably lack a jurisdictional hook. Review of the deal would lie strictly in antitrust — at either the DoJ or Federal Trade Commission (FTC). From an antitrust perspective, the deal raises some concerns given the concentration of content and Comcast’s position vis-a-vis other existing subscription television providers (e.g., FIOS, DIRECTV) and potential new competitors (e.g., Netflix and other “over the top” video providers). It may also concern broadcasters, both NBC affiliates worried about the change in management and other broadcasters worried about how this would impact Comcast’s retrans negotiations. Much of this will also depend on whether the deal includes the movie production studios, prior existing content, and a host of other details that impact the universe of content distribution these days.

Assuming the TV and/or radio stations are included, it’s not entirely clear what happens. The D.C. Circuit eliminated the FCC’s existing ban on cable/television cross ownership (which applied only to broadcast licenses in a cable system’s franchise area) in 2002 on the basis that the D.C. Circuit didn’t like it (Fox Television Stations, Inc. v. FCC, 280 F.3d 1027 (D.C. Cir. 2002)). That decision does not directly impact the FCC’s general obligation under Section 310(d) to ensure that any transfer of a license serves the public interest. Comcast and NBC will certainly push the Fox Television decision for all it’s worth, arguing that the D.C. Circuit’s decision to vacate the rule means that there are no circumstances under which the FCC could block a broadcast/cable cross-ownership combination. Opponents will argue that while the D.C. Circuit vacated a per se rule that any cable/broadcast combination was contrary to the public interest, that has zero impact on the Commission’s responsibility to resolve the question of whether transfer of these licenses to this cable company serves the public interest. I expect much confusion and argument on this point. Assuming, of course, that the FCC has jurisdiction in the first place.

Stay tuned . . . .

Why Don’t Broadcasters Become “Spectrum Innovators?” Because They Like Being Broadcasters.

Can’t help but take a brief break from the Net Neutrality craziness to be mildly amused at Adam Thierer over at Tech Liberation Front. We have an increasing number of reports that Blair Levin wants to bribe broadcasters to get off their spectrum as part of the national broadband plan. Adam is very excited by this and, of course, brings up the usual Libertarian argument that because property solves all problems, we should just make the broadcast licenses property of the broadcasters and let the endless innovation begin.

The problem with this argument is that broadcasters can already do this. Under 47 USC 336(b), broadcasters can use their digital spectrum to provide “ancillary and supplementary services.” In a series of orders, the FCC has said that as long as full-power broadcasters provide one free over-the-air digital channel, they can do whatever they want with the remaining spectrum — including lease it out in the secondary markets to someone else. Under the statute, broadcasters must pay a fee for any such ancillary services that is the functional equivalent of what they would have paid for the spectrum at auction (47 USC 336(e)), which the FCC has fixed at 5% of any annual revenue from the ancillary services.
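To put a number on that fee provision (the revenue figure here is entirely hypothetical, purely for illustration):

```python
# Illustration only: the revenue figure is made up. The 5% rate is the
# fee the FCC set for ancillary/supplementary service revenue under
# 47 USC 336(e).

ANCILLARY_FEE_RATE = 0.05

def ancillary_fee(annual_revenue: float) -> float:
    """Fee owed on annual revenue from ancillary/supplementary services."""
    return annual_revenue * ANCILLARY_FEE_RATE

# A broadcaster leasing excess digital capacity for a hypothetical
# $2,000,000 a year would owe $100,000.
print(ancillary_fee(2_000_000))  # 100000.0
```

In other words, the fee scales with whatever the leased capacity actually earns, not with any upfront estimate of auction value.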

Continue reading

Very Interesting Map Of Comments In BB Stimulus Proceeding

In my capacity as a consultant to the Benton Foundation, I have been doing work with Kate Williams, a professor of informatics at the University of Illinois. Williams has been doing some (IMO) critical work around broadband sustainability. In particular, Kate has been studying the old Technology Opportunities Program to determine which projects had lasting impact and which didn’t — a rather important consideration for the new and improved BTOP program.

But what caught my attention recently is this very interesting map that Williams compiled based on the comments submitted to BTOP. It places the comments filed on a geographic map, with links to the actual comments themselves. The map includes the 58% of comments filed by the April 13, 2009 deadline which contained reliable information on the location of the commenter. The remaining 42% either gave no location or included location in an attachment, which Williams considered insufficiently reliable for determining location.

Why do I find this interesting? Because it potentially provides a very interesting cross check on the state of broadband geographically, as well as on who follows these proceedings. I have long lamented that the FCC (and other federal agencies) make so little use of the data they actually collect. At best, an agency may note submissions by a class of commenters (e.g., broadcasters, MVPDs, ISPs) in the specific proceeding at issue. But no one takes the multiple data sets collected as comments in each proceeding, or in multiple proceedings, and tries to determine patterns and what they might suggest. Williams’ grouping by geography is intriguing, and I cannot help but wonder what would happen if we applied a similar analysis to multiple FCC proceedings — including for comments generated by the mass “comment engines” that have become common in some high profile proceedings. It would be very interesting to know, for example, whether the people who feel passionately enough about media consolidation or network neutrality to comment cluster geographically and, if so, whether we see patterns of geographic interest that might tell us about the actual situation on the ground.

Of course the sampling from comments is not a pure scientific data set in that, to comment, a commenter must (a) know about the proceeding, and (b) feel strongly enough to file comments. But the fact that the information has a particular set of biases does not render it meaningless, especially if one controls for this.

I hope researchers use Williams’ map, both to analyze the BTOP comments and as a model going forward for analysis of other proceedings.

Stay tuned . . . .

2:30 P.M., Still No Meeting . . . .

O.K., I hope tonight’s election results go better. Rumor is the hold up is on roaming conditions in the VZ/Alltel merger. Still, after the DOJ approved the merger with a few divestitures, there was no doubt that the FCC would roll over. The only question is whether Tate or McDowell will side with the Ds to exact some additional conditions for the benefit of the rural carriers or competitors. Hence the speculation that this involves roaming. But I still expect a vote today. You can almost hear the Verizon character in the Alltel ads whispering “Soon Chad . . . soon you will share your circle for the last time . . . you ding dong.”

While we wait, here are some preliminary thoughts about the items.

Here’s the original agenda. The FCC dropped item 1, Universal Service/Intercarrier Compensation (USF/ICC), and voted the items on distributed transmission systems (DTS) and closed captioning on circulation.

Of these, the voted items were fairly non-controversial. DTS is designed to address the fact that DTV signals don’t propagate the same way as analog signals, and will allow broadcasters to maintain their audience after the conversion. The only possible pitfall was whether it would allow broadcasters to expand their footprint, which would (a) eat into the available white spaces, and (b) give them yet more free spectrum goodies for no good reason. My info is that the order will emphasize that the intent is to maintain the pre-transition status quo. I have no idea on the closed captioning item.

That leaves USF/ICC. USF/ICC is a huge mess of biblical proportions that causes even a hardened policy wonk like me to quail and flee the room screaming. It is famously broken, everyone hates it, but no one can agree on how to fix it. There is absolutely no right answer, and any piece of it impacts all the other pieces.

What is interesting is that this created another 4-1 revolt by the other offices against Martin. While I give Martin credit for trying to get hideously controversial stuff done, you are clearly doing something wrong if you have managed to uniformly piss off all four Commissioners to the point where they are making pointed public statements that boil down to “Kevin, you ain’t the boss of me.” It is always hard for a Chairman to get stuff done in the last months of an administration, but unless Martin and the other offices figure out a way to get along, it is going to be a very vicious and unproductive couple of months until January 21.

The delay on this meeting, which caught Martin totally by surprise, is not exactly an auspicious omen.

Stay tuned . . . .

White Spaces Wrap Up: Exclusive Licensing, Or The Part 101 Poison Pill

As we enter the last 24 hours before the critical and transformative November 4 vote (no, not this one, the FCC vote!), a last battleground has emerged. While the broadcasters and wireless microphone guys have generally not generated any traction, a final possible hitch has shown up on the question of higher power for rural providers. While I applaud the sentiment, this has become the last ditch effort to sneak a “poison pill” into the Order by keeping alive the hope/fear of exclusive licensing in the band.

As I have long warned, the potential last-minute threat to unlicensed in the band would not come from broadcasters, whose interference claims have been discredited and who have stooped to rather ridiculous smear tactics, or even from wireless microphone manufacturers and their vast horde of politically powerful pirate users. No, I have always believed that at the last minute, the real flank attack against the public interest would come from the licensed wireless guys pushing for licensed backhaul.

Which is why I am unsurprised to find the last potential stumbling block toward the finish line, after five years of unprecedented testing and investment, comes from a push for some kind of exclusive licensing scheme, either as an immediate set aside in the existing order or as part of a further proceeding.

I call this the “Part 101 Poison Pill.” Part 101 of the FCC’s rules governs high-power point-to-point transmission links of the sort used by telecommunications companies for transmitting significant distances. Part 101 is different from cellular licensing, in that it can accommodate multiple users on a “first in time, first in right” basis. Whoever comes in later must protect everyone who comes in earlier, which essentially makes it a very high-cost game of “king of the mountain.”
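To make the “first in time, first in right” dynamic concrete, here is a toy model of my own devising. It drastically simplifies real Part 101 coordination, which turns on engineered interference studies rather than simple channel matching, but it captures the ratchet: every newcomer bears the whole burden of protecting everyone already on the books.

```python
# Toy model only: real Part 101 frequency coordination involves path
# engineering and interference analysis, not a crude channel check.
from dataclasses import dataclass

@dataclass
class Link:
    name: str
    channel: int

def may_register(new: Link, registered: list[Link]) -> bool:
    """First in time, first in right: the newcomer may operate only if it
    protects every earlier registrant (here, crudely, by avoiding their
    channels)."""
    return all(new.channel != earlier.channel for earlier in registered)

links: list[Link] = []
for candidate in [Link("A", 40), Link("B", 41), Link("C", 40)]:
    if may_register(candidate, links):
        links.append(candidate)

print([l.name for l in links])  # ['A', 'B'] -- C arrived too late for ch. 40
```

Each successful registration shrinks the space left for everyone after, which is exactly why being first is worth paying for, and why the game gets expensive fast.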

What makes exclusive licensing, even the relatively more open licensing such as Part 101, such a poison pill for unlicensed?

See below . . . .

Continue reading

Will The FCC Create An ICANN for White Spaces?

Mind you, I am generally pleased with the announcement by FCC Chair Kevin Martin that the exhaustive study of possible white spaces devices by the Office of Engineering and Technology (OET) proves that the FCC can go to the next step and authorize both fixed and mobile unlicensed devices. I shall, God and the Jewish holiday schedule permitting, eventually have more to say on the subject. But I can’t help but focus on one aspect of Martin’s generally outlined proposed rules that raises questions for me.

See, I spent a lot of time back in the day working on domain name policy with the Internet Corporation for Assigned Names and Numbers (ICANN). ICANN derives its authority through control of the authoritative list of top-level domain servers (“the root zone file”). Or, put another way, the entire structure of ICANN, which now has a budget in the tens of millions and an entire cottage industry that surrounds it, is based on the fact that ICANN controls access to a list that you must have in order to get internet access.

So I’m very curious about who will control the database that will work to supplement sensing as a way to protect over-the-air broadcasting and operation of (legal?) wireless microphones. If the FCC administers this database, and makes it freely available online, then things will work fine. The FCC is already supposed to maintain such a database, because it supposedly keeps track of every license and licensees have a responsibility to keep their license information current. In practical terms, it would cost some money and effort to upgrade the existing database to something easily accessed and updated on a dynamic basis, because the FCC has let this lapse rather badly. (Not their fault, really. No one likes to pay for “back office” or “infrastructure” and it has never really risen to anyone’s priority level.) OTOH, it means that actually upgrading the FCC’s existing database, and giving broadcasters and wireless microphone licensees incentive to keep their information current, will yield benefits beyond making geo-location possible.

OTOH, if the FCC outsources this function, it will be an invitation to disaster. A database manager — particularly an unregulated one — will have every incentive to charge for access to the database. While I don’t expect anything on the scale of ICANN, the possibility for real bad results goes up exponentially if no one pays attention to this kind of detail. Will the database manager get exclusive control? Will the database manager be able to set its own fees for access to the database? How will the database manager be held accountable to the broader community? These are questions that need to be answered — either in the Report and Order or in a Further Notice of Proposed Rulemaking.
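For concreteness, here is a sketch of the kind of geo-location lookup a white spaces device might perform against such a database. Nothing here reflects an actual FCC schema or interface; the coordinates, channel numbers, and grid-cell scheme are all invented for illustration.

```python
# Hypothetical sketch: not an actual FCC database schema or API.
# Maps a coarse grid cell (rounded lat, lon) to channels registered
# to protected users (broadcasters, wireless microphone licensees).
PROTECTED: dict[tuple[int, int], set[int]] = {
    (39, -77): {4, 7, 9},  # a made-up D.C.-area cell and channel set
}

ALL_TV_CHANNELS = set(range(2, 52))

def free_channels(lat: float, lon: float) -> set[int]:
    """Channels a device at (lat, lon) may use: everything not registered
    to a protected licensee in that grid cell."""
    cell = (round(lat), round(lon))
    return ALL_TV_CHANNELS - PROTECTED.get(cell, set())

# A device in the hypothetical D.C. cell must steer clear of 4, 7, and 9.
print(sorted(free_channels(38.9, -77.0))[:3])  # [2, 3, 5]
```

The policy point is that whoever maintains `PROTECTED` controls the choke point: if lookups cost money, or the data goes stale, every device in the band inherits the problem.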

My great fear is that the FCC will treat this as the equivalent of a frequency coordination committee. But it isn’t anything like a frequency coordination committee, since the whole point (from my perspective) is to open up access for everyone and not just for a handful of industry folks who can work the process and pay the fees. Worse, if the FCC delegates this to the broadcasters themselves, it will create an incredible opportunity to hamstring the process at the critical access point.

On the plus side, perhaps we can get Susan Crawford to go from an ICANN Director to an FCC Commissioner.

Stay tuned . . . . .

White Spaces Update — Field Testing Can Be Soooo Educational. You Always Find Something You Don’t Expect.

As folks may recall, the primary opponents of opening the broadcast white spaces for use, the broadcasters and the wireless microphone manufacturers — notably our good friend and radio pirate Shure, Inc. (official slogan: “We get to break the law ’cause we sound so good”) — insisted that the FCC conduct field tests on the white spaces prototypes. Of course, because these are concept prototypes and not functioning devices certified to some actual standard, everyone knew this would leave lots of leeway for the broadcasters and the wireless microphone folks to declare the “tests” a “failure” regardless of the actual results. Which, of course, they did. Needless to say, Philips (which makes one of the prototypes) said the opposite, and it all depends on whether by “success” you mean “the device functioned perfectly, as if there were actually some standards for building a functioning device” or “the device proved it could detect occupied channels at whatever sensitivity the FCC decides is necessary.” The FCC engineers, wisely, made no comment and went back to their labs to analyze the actual data.

But one of the nice things about field testing is that you learn the most amazing things that you can never learn in a lab, as demonstrated by this ex parte filed by Ed Thomas for the White Spaces Coalition, the industry group that backs opening the white spaces. Apparently, in front of eyewitnesses (including the FCC’s engineers), both broadcasters and unauthorized wireless microphone users in the Broadway field test operated wireless microphones on active television channels, at power levels well above what white spaces advocates propose for mobile devices. All apparently without interfering with anybody’s television reception or even — in the case of the unauthorized Broadway users — screwing up the hundreds of other illegal wireless microphones in the neighboring theaters.

A few rather important takeaways here: (1) the interference claims by broadcasters and Shure are utterly bogus, as the wireless microphones do not screw up either television reception or each other; (2) the broadcasters and Shure know their interference claims are bogus. If they actually cared one iota about possible interference, they would not casually operate high power wireless microphones on the same channel as active television broadcasts and as each other. Instead, they are so unconcerned about interference that they can’t even remember to pretend to care about basic interference concerns when they are conducting a field test in front of the FCC’s own engineers.

A bit more elaboration on these points below . . . .

Continue reading

A Fatal Exception Has Occurred In Your White Spaces Sensing Device

It would be funny were it not so easy for NAB to exploit.

The Microsoft prototype shut itself down last week and would not restart. Users familiar with MS products that are scheduled for release, never mind pre-beta versions, will find this so unremarkable as to wonder at the sensation. It ranks right up there with “Apple denies latest i-rumor.”

Unsurprisingly, however, the folks opposed to the use of white spaces (primarily the broadcasters and the wireless microphone folks, with a dash of the cable folks thrown in for good measure) will spin this as proof that the entire technology for sensing whether a channel is occupied has “failed.” This ignores the other prototypes, of course (Philips and Google), and ignores the fact that the failure had nothing to do with the sensing (the thing being tested). Finally, of course, it ignores the fact that this is a proof of concept prototype.

The fact is that the FCC testing shows that “sensing” as a technology works at levels that easily detect operating television channels and even wireless microphones. In fact, it is too bloody sensitive. In a foolish effort to appease the unappeasable, the companies submitting prototypes keep pushing the level of sensitivity to the point where the biggest problem in recent rounds appears to be “false positives,” i.e., treating adjacent channels as “occupied.”

As a proof of concept, that should be a success. The testing demonstrates that you can detect signals well below the threshold needed to protect existing licensees. Logically, the next step would be to determine the appropriate level of sensitivity to accurately protect services, set rules, and move on to actual device certification based on a description of a real device.
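The false-positive dynamic is easy to see in a toy energy detector (my own construction, not any prototype’s actual algorithm; the signal levels here are invented for illustration):

```python
# Toy energy-detection sketch. All dBm figures are hypothetical.
def channel_occupied(measured_dbm: float, threshold_dbm: float) -> bool:
    """Declare a channel occupied if measured energy meets the threshold."""
    return measured_dbm >= threshold_dbm

tv_signal = -84.0          # a real DTV signal in the channel (made up level)
adjacent_leakage = -113.0  # faint spillover from a neighboring channel

# A moderate threshold catches the real signal and ignores the leakage...
print(channel_occupied(tv_signal, -100.0))        # True
print(channel_occupied(adjacent_leakage, -100.0)) # False

# ...but crank the sensitivity far enough and the leakage reads as
# "occupied" too: a false positive that wastes a usable channel.
print(channel_occupied(adjacent_leakage, -116.0)) # True
```

Which is the engineering question the FCC actually has to answer: not “can the devices detect signals?” (they plainly can) but “what threshold protects licensees without throwing away empty channels?”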

But that is not how it works in NAB-spin land. Instead, NAB keeps moving the bar and inventing all sorts of new tests for the devices to “fail.” For example, the initial Public Notice called for prototypes for “laboratory testing.” MS and Philips submitted prototypes that performed 100% in the lab. But then, the MS people did something very foolish, but very typical — they decided their laboratory device was good enough for field testing. No surprise, it did not work as well in the field as in the lab. As this was a laboratory prototype, the failure to perform flawlessly in the field should have been a shrug — it would have been astounding beyond belief if a prototype designed for the lab had worked perfectly the first time in the field. But the fact that the prototype did not work in the field was widely declared a “failure” by NAB, which unsurprisingly gave itself lots of free advertising time to spin the results this way.

So the FCC went to round two, and again the NAB and white spaces opponents have managed to move the bar so they can again declare a “failure.” Back in 2004, when the FCC first proposed opening the white spaces to unlicensed use, it concluded that operation of white spaces devices would not interfere with licensed wireless microphone users. The FCC has never reversed that determination. Unsurprisingly, businesses developing prototypes according to the FCC’s proposed rules have not taken particular care to address wireless microphones. Because the FCC explicitly said “don’t worry about them.”

But suddenly, if the devices can’t accurately sense and detect wireless microphones, they will be “failures.” It doesn’t matter that the devices have proven they can protect wireless microphones. It doesn’t matter that Google has proposed additional ways of protecting wireless microphones besides sensing. As long as NAB can frame what defines “failure” (rest assured, there will never be any successes if NAB gets to call the tune), and can keep changing that definition at will, the political environment will ensure that the actual engineering is irrelevant.

Which is why the companies need to stop trying to placate the NAB by agreeing to an endless series of tests with ever-shifting criteria. And OET needs to write up a report that does what the initial notices promised: use the data collected from the prototypes to determine whether the concept works and, if so, set appropriate technical standards. The prototypes have proven they can detect signals with a sensitivity better than an actual digital television set or wireless microphone receiver, so the “proof of concept” aspect stands proven. Rather than buy the NAB spin, the next step should be to determine what level of sensitivity to set as the standard.

Hopefully, the Office of Engineering and Technology, which is conducting the tests, will not suffer the fate of the Microsoft prototype and shut down under pressure.

Stay tuned . . . .