Tales of the Sausage Factory:
DISH DE Debacle Part 2: So What Did The FCC Actually Do?

In Part 1, I gave a rather lengthy explanation of the factual background of why DISH now owes the FCC another $3.3 billion on top of the $10 billion it already owed for licenses won in the big FCC spectrum auction at the end of last year (the AWS-3 auction). Here, I give my analysis of the Order denying the SNR and Northstar applications for designated entity (DE) credits. Some thoughts on broader implications, what may or may not happen next, and my personal opinion on whether the FCC was right or wrong, I save for Part 3.

More below . . .

Continue reading

Tales of the Sausage Factory:
New D.C. Circuit Decision Knocks Fairly Large Hole In Anti-Net Neutrality Case.

Every now and then, the D.C. Circuit throws you an interesting little curve ball. This opinion, issued last week, would appear to knock a serious hole in the argument made by the cable companies and telcos against the FCC’s reclassification of broadband as a Title II telecom service.

The case, Home Care Association of America v. Weil (HCAA), addresses the legal question that takes up about a quarter of the main brief for petitioners: does the Brand X holding that the Telecom Act was “ambiguous” mean that the FCC gets deference under the Chevron Doctrine when it reexamines the question in 2015 and comes out the other way? Or can Petitioners argue that the statute is not ambiguous and explicitly precludes the interpretation the FCC now gives it? Under HCAA, the D.C. Circuit appears to find that once the Supreme Court decides a statute is ambiguous, that settles the question. If the statute was ambiguous for an interpretation in one direction, it is still ambiguous — and thus subject to Chevron deference — when the agency reverses course. Nor does the agency have a higher burden when it reverses course than it did when it first made the decision.

Good lawyers can always distinguish cases, of course — as can a conservative panel of the D.C. Cir. that wants to find a particular result. Furthermore, Petitioners have lots of other arguments to make that are not impacted by the HCAA decision. Nevertheless, it seems clear this case is good news for the FCC (and those of us who support the FCC), and Petitioners will no doubt need to spend a good portion of their reply brief explaining why HCAA doesn’t dictate the result here.

I explain in more detail below . . . .

Continue reading

Tales of the Sausage Factory:
So What’s This “Designated Entity” Thing, and Why Does DISH Owe The FCC $3 bn When They Didn’t Break The Rules?

Generally, I loathe the cliché “be careful what you wish for.” But I can think of no better way to describe the vast consternation in the spectrum world over the licenses won by SNR and Northstar in the AWS-3 Auction. If you don’t recognize the names off-hand, that’s because most of the time people just refer to them as the “DISH Designated Entities” or the “DISH DEs.” As detailed in many articles and petitions to deny SNR and Northstar their DE credits (totaling $3.3 billion), most people regard SNR and Northstar as “sham” or “fake” DEs, owned and controlled by DISH.

But here’s the funny thing. As far as anyone can tell from the filings, DISH, SNR and Northstar followed the precise letter of the law. And, what’s even more surprising, if you look at the results, this was the most successful auction ever for DEs. Both SNR and Northstar are minority-owned (as defined by the FCC’s rules). All the “loopholes” DISH used with regard to ownership interest and bidding coordination were designed to make it easier for DEs to get capital, win licenses, and benefit from partnering with a larger telecommunications company — which SNR and Northstar certainly did.

As a result, as noted by my usual frenemies at Phoenix Center, as measured by every traditional metric, the AWS-3 auction was the single most successful auction in awarding licenses not merely to small businesses, but to minority-owned firms specifically. By every criterion ever used, the AWS-3 auction results ought to be celebrated as a ginormous success for the DE program. Every aspect worked exactly as intended, and the result was exactly what people claimed to want. Indeed, as noted by Phoenix Center, even the $3.3 bn in bidding credits was in line with other spectrum auctions as a percentage of revenue.

Except, in classic “be careful what you wish for” fashion, when you scale these results up to their logical outcome, no one is really happy with the result (except for DISH). Which has now prompted FCC Chairman Tom Wheeler to circulate an order denying SNR and Northstar their designated entity credits. As a result, SNR and Northstar (meaning their financial backer DISH) must cough up $3.3 bn within 30 days of issuance of the Order or — unless granted a stay or extension — the licenses will revert to the FCC. (The arithmetic: the two DEs made roughly $13.3 bn in gross winning bids, and the 25% “very small business” credit they claimed brought that down to the roughly $10 bn they expected to pay, so losing the credit puts the $3.3 bn difference back on the bill.) Oh yes, and the FCC might need to deduct an additional $10 bn from the auction revenue. And there might be default charges (the FCC charges a penalty for defaulting on payments so people don’t bid and hope they find the money later). Or it might get more complicated, since there has never been a clawback of this magnitude before.

In Part 1, I will explain what exactly happened, why DISH did not violate the rules as written and why SNR and Northstar are technically “minority owned.” Along the way, we will consider some delightful ironies about the whole business.

In Part 2, I’ll tackle why the FCC decided that it could yank the DE discount anyway, and try to figure out what happens next.

More below . . . .

Continue reading

Neutrino:
Google Studies the Obvious: People Hate Interstitial Popups

One of the (many) things that piss me off is the growing plague of modal popups (also called interstitials) that seemingly every site deploys these days. These are the popups that dim the screen and take over the web page you just loaded, demanding that you “Like us on Facebook!” or “Join Our Email List!” To proceed, you have to find and click on the (often tiny, obscure) X or dismiss button (which surprisingly is never labelled “F**k Off”, which is exactly what I utter when that happens) just to see what’s on the site.

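For anyone lucky enough not to know the pattern by name: a modal interstitial is just an overlay element scripted on top of the page, dimming everything behind it until you dismiss it. Here is a minimal sketch of the technique (my own illustration of the generic pattern, not any particular site’s code):

```typescript
// Minimal sketch of the modal interstitial pattern: a full-screen
// overlay that dims the page and blocks interaction until dismissed.
// Purely illustrative -- not taken from any real site.
function showInterstitial(message: string): void {
  const overlay = document.createElement("div");
  // Cover the whole viewport and sit above everything else.
  overlay.style.cssText =
    "position:fixed;inset:0;background:rgba(0,0,0,0.6);z-index:9999;" +
    "display:flex;align-items:center;justify-content:center;";

  const box = document.createElement("div");
  box.style.cssText = "background:#fff;padding:2em;max-width:400px;";
  box.textContent = message;

  // The (often tiny, obscure) dismiss button.
  const dismiss = document.createElement("button");
  dismiss.textContent = "X";
  dismiss.onclick = () => overlay.remove();

  box.appendChild(dismiss);
  overlay.appendChild(box);
  document.body.appendChild(overlay);
}

// Typically fired on page load or after a short delay.
window.addEventListener("load", () => showInterstitial("Join Our Email List!"));
```
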
My reaction when I see this is immediate: I hit the back button. If you’re near-sighted enough to ask me to like your site or give you my email address before you give me a chance to look at it for half a second, I can safely assume you’re too stupid to actually present content I want to see. I guess, in a way, it does me a service. It’s a nice filter. I won’t waste time on that stupid site. But I really get annoyed being slapped in the face again and again by aggressive levels of stupid.

Continue reading

Tales of the Sausage Factory:
What the Heck Is The “Duplex Gap” And Why Has It Blown Up The July FCC Meeting?

Difficult as it is to believe, there are times in policy when issues do not break down simply by partisan interest or into neat categories like incumbents v. competitors or broadcasters v. wireless carriers. Sometimes — and I know people are not gonna believe me on this — issues break down on pure substance and require lots of really hard choices. Of course, because these issues are highly technical and complicated, most people like to ignore them. But these kinds of issues are also usually the hardest and most intractable for people who actually care about what the world looks like and how these policy decisions will actually work in reality.

So it is with the question of whether to put broadcasters in the duplex gap as part of the repacking plan in the incentive auction. Did your eyes glaze over yet? Heck, for most people, it’s gonna take a paragraph or two of explanation just to understand what that sentence means. But even if you don’t know what it means, you can understand enough for this basic summary:

  1. Just about every stakeholder in the auction — wireless carriers, broadcasters, wireless microphone users, tech company supporters of using unlicensed spectrum in the broadcast bands, public interest groups — all told the FCC not to put broadcasters in the duplex gap.

  2. Nevertheless, the Auction Team proposed putting broadcasters in the duplex gap, based on a set of simulations suggesting that the FCC would only get back 50-60 MHz of spectrum to auction if it protected the duplex gap. The Chairman circulated a draft order adopting the Auction Team’s proposal.

  3. Everybody freaked out. The Chairman found he did not have 3 votes, or possibly not even 2 votes, to adopt his proposal on the duplex gap. The freak-out was so intense and so bad that the FCC actually waived the Sunshine Period for this item so that interested parties could continue to talk to FCC staff and commissioners until the night before the meeting. The FCC also released additional data showing the impact would be limited to a relatively small number of cities.

  4. That helped some, but not enough. Despite progress in negotiations, the FCC clearly did not have time to get to the right solution in the 5 days between the release of the new data and the actual vote. Also, a bunch of people were pissed that the Auction Team hadn’t released the data sooner, and hadn’t provided more explanation of the underlying model and the assumptions behind it. On Tuesday, the Republican Chairs of the House Energy & Commerce Committee and the Telecom Subcommittee wrote Wheeler a letter chastising him for having a bad process and calling on Wheeler to pull the item from the agenda entirely. On Wednesday, the day before the vote, Wheeler wrote back defending the process but agreeing to pull the item (and the associated item on whether or not to change the spectrum reserve) until the August Meeting three weeks from now.

In Policyland, this passes for high drama. It is, to say the least, highly unusual. Enough so that even folks who find technical issues like this complicated and boring to the point of insanity are asking: “what the heck just happened there? Who lost and who won?” The equally complicated answer: “no one lost or won, we’ve got a serious debate about a technical problem which has consequences no matter how you resolve it” is not nearly as satisfying as “the carriers” or “the tech companies” or whatever.

I explain and unpack all of this below, as well as consider possible impacts and ways to resolve this. But again, I want to stress this is a super hard problem. This is about competing goals and the difficulty of predicting the future with any certainty. It’s also about trust and stuff, which is hard to come by in Washington even at the best of times. This is not subject to simplistic plotlines like “Oh, the Auction Team are out of control” or “The broadcasters and unlicensed supporters are just being stubborn.” (Wait, the NAB and the unlicensed guys and the wireless microphone guys are on the same side? And they agree with Verizon? WTF?) This stuff is hard.

More below . . .

Continue reading

Tales of the Sausage Factory:
The First Net Neutrality Complaint Under The 2015 Rules Is Likely To Lose, And That’s A Good Thing.

As reported by Brian Fung at Washpo and others, a company called Commercial Network Services (CNS) has filed the first network neutrality complaint under the FCC’s new rules — which went into effect June 12 after the D.C. Circuit denied a stay request. You can read the complaint here. While I probably should not prejudge things, I expect the FCC to deny the complaint for the excellent reason that — accepting all the facts alleged as true — Time Warner Cable did absolutely nothing wrong.

I elaborate on what CNS gets wrong, why this differs from other high-profile disputes like Cogent and Level 3, and why such an illustration is good for the FCC’s rules as a whole, below . . .

Continue reading

Inventing the Future:
Hackers and Painters

I went to http://www.paintingvirtualreality.com last weekend.

Billed as the World’s First Virtual Reality Painting Exhibition, it featured:

  • artwork one could view individually, using a Head Mounted Display (HMD) with single-camera tracking;
  • artists at work wearing HMDs with dual lighthouse tracking (with the results displayed live on a big monoscopic screen).

The work was done with http://www.tiltbrush.com, which appears to be a couple of guys who got bought by Google. The project – I’m not sure that it’s actually available for sale – appears to be evolving along several dimensions:

  1. 3D Model Definition: covering stroke capture (including symmetric stroke duplication), “dry” and “wet” (oil paint-mixing/brushwork effects), texture patterns, volumetric patterns, emission, particles. (See the stroke-capture sketch below this list.)
  2. Interactive Model Creation: tool palette in one hand, and brush in the other.
    1. Videos at tiltbrush.com suggest an additional moveable flat virtual canvas (a “tilt canvas”?) that one can hold, move, and paint against. The art on display was clearly made this way: it all felt like a sort of rotational 2.5D — the brush strokes were thin layers of (sometimes curved) surfaces.
    2. The artists last night appeared to be working directly in 3D, without the tilt canvas.
    3. The site mentions an Android app for creation. I don’t know whether it uses one of these techniques or a third.
  3. Viewing: HMD, static snapshots, animated .gifs that oscillate between several rotated viewpoints (like the range of an old-fashioned lenticular display).

  I haven’t seen any “drive around within a 3D scene on your desktop” displays (like standard-desktop/non-HMD versions of High Fidelity).

  The displays were all designed so that you observed from a pretty limited spot. Really more “sitting”/“video game” HMD than “standing”/“cave” exploration.

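To make “stroke capture” and “symmetric stroke duplication” concrete, here is a minimal sketch of what captured stroke data and a mirroring operation might look like. The names and fields are my own guesses for illustration, not Tilt Brush’s actual format:

```typescript
// A minimal sketch of stroke capture with symmetric duplication.
// All names are invented for illustration -- this is not Tilt Brush's
// actual data model.

interface Vec3 { x: number; y: number; z: number; }

// One sample of the brush as the artist moves it.
interface StrokeSample {
  position: Vec3;   // brush tip in world space
  pressure: number; // 0..1, could drive stroke width
  timeMs: number;   // capture timestamp
}

interface Stroke {
  brush: string;    // e.g. "oil", "emissive", "particles"
  color: [number, number, number];
  samples: StrokeSample[];
}

// Mirror a stroke across a plane through the origin with unit normal n.
// This is one plausible way "symmetric stroke duplication" could work:
// every captured sample is reflected, yielding a second stroke for free.
function mirrorStroke(stroke: Stroke, n: Vec3): Stroke {
  const reflect = (p: Vec3): Vec3 => {
    const d = 2 * (p.x * n.x + p.y * n.y + p.z * n.z); // assumes |n| = 1
    return { x: p.x - d * n.x, y: p.y - d * n.y, z: p.z - d * n.z };
  };
  return {
    ...stroke,
    samples: stroke.samples.map(s => ({ ...s, position: reflect(s.position) })),
  };
}

// Usage: capture a stroke, then add its mirror image to the scene.
const stroke: Stroke = {
  brush: "oil",
  color: [0.9, 0.3, 0.1],
  samples: [
    { position: { x: 0.2, y: 1.0, z: 0.0 }, pressure: 0.8, timeMs: 0 },
    { position: { x: 0.3, y: 1.1, z: 0.1 }, pressure: 0.6, timeMs: 16 },
  ],
};
const mirrored = mirrorStroke(stroke, { x: 1, y: 0, z: 0 }); // mirror across the YZ plane
console.log(mirrored.samples[0].position); // { x: -0.2, y: 1, z: 0 }
```

Reflecting every captured sample across a plane is cheap, which is presumably why symmetric duplication can happen live while the artist paints.
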
My reactions to the art:

  • Emission and particle effects are fun in the hands of an artist.
  • “Fire… Light… It’s so Promethean!”
  • With the limited movement during display, the art was mostly “around you” like a sky box, rather than something you wandered around in. In this context, the effect of layering (e.g., a star field) – as if a Russian doll-set of sky boxes (though presumably not implemented that way) – was very appealing.

Tip for using caves: Put down a sculpted rug and go barefoot!

Tales of the Sausage Factory:
Broadband Access As Public Utility — My Speech at Personal Democracy Forum.

On June 4, I gave a speech at Personal Democracy Forum (PDF) on Broadband Access As Public Utility (the official title was The Internet As Public Utility, but my original title and my conception are still about broadband access specifically, because “the Internet” has become a very vague term). For those unfamiliar with PDF, it is a truly awesome conference organized by Micah Sifry and Andrew Rasiej that brings together folks from all over the tech world to discuss how tech can make a better world and be an expression of our values. This year’s focus was on how tech can facilitate civic engagement. This year was my first time at PDF, but I am definitely going to do my damndest to come back next year.

I’m pleased to say my speech was well received. I’ve included the video below. (You can find videos of the other speakers in the PDF15 Archive.) My speech turned out to be about 15 minutes long, which means it ran 3 minutes over. Even so, there are some significant differences between what I wrote in advance and what I actually delivered (which happens to me often), which is why I reprint my original “as prepared” remarks below the fold.

A few basic points I want to make as takeaways. As I keep stressing, the terms “utility” and “public utility” do not imply any particular mode of regulation, or any requirement of natural monopoly or market power. The term goes back to the concept of the purpose of government first elaborated in Adam Smith’s Wealth of Nations, including: “the duty of erecting and maintaining certain public works and certain public institutions, which it can never be for the interest of any individual, or small number of individuals, to erect and maintain; because the profit would never repay the expense to any individual or small number of individuals, though it may frequently do much more than repay it to a great society.” The Federalist Papers further expand on this idea, justifying the Constitution as necessary to create a government sufficiently “vigorous” to meet the needs of the people.

The innovation of the post-Civil War era was to identify services which, although provided in many cases by the private sector, were too important and too central to society to be left wholly to the dictates of the market and private companies. It is in this sense that Franklin Delano Roosevelt meant “utility” in his letter to Congress calling for the creation of the Federal Communications Commission, which begins: “I have long felt that for the sake of clarity and effectiveness the relationship of the Federal Government to certain services known as utilities should be divided into three fields: Transportation, power, and communications.” To use the older statutory language, these services are “affected with the public interest,” and therefore government has a responsibility to ensure their fair, affordable, and ubiquitous availability.

I argue that broadband, in the tradition of all our previous communications services, now falls into this category of services so essential that they are public utilities. I do this knowing full well that those opposed to any form of government oversight of essential services, or opposed to the public provision of critical infrastructure, will deliberately misconstrue this to mean traditional rate-of-return regulation. To this I can only say *shrug*. The first step in ensuring proper broadband policies lies in reclaiming the term public utility for what it really means — a service so essential that the government has a responsibility to ensure that, one way or another, everyone has fair and affordable access. We must embrace that fundamental value as firmly as we should reject a return to rate-regulated private monopoly provision — or the worse alternative of entirely unregulated private monopoly provision.

Enjoy.

Continue reading

Tales of the Sausage Factory:
Net Neutrality Litigation: Round 1 Goes To the FCC.

Good news! The D.C. Circuit denied the request by the carriers suing the Federal Communications Commission (FCC) to stay the FCC’s net neutrality rules and the reclassification of broadband as a Title II telecom service. As of today, the Net Neutrality rules are in effect, and broadband access is once again a Title II telecommunications service — pending the final outcome of the lawsuit challenging the FCC’s actions.

Reactions from net neutrality opponents have ranged from defiance to “no biggie,” with a side of trying to claim a partial win for getting expedited briefing (I’ll explain below why this is a tad disingenuous). On Twitter, I did see a few of my opposite numbers wailing and gnashing their teeth at the prospect that their beloved Broadband Equestria, ruled by the wise Queen Comcast Celestia and Princess Verizon Twilight Sparkle, is now going to be converted into a Hellscape overrun with Tyrannosaurus Tariffs that will devour helpless ISPs like tourists dumb enough to go to Jurassic World. Needless to say, supporters of net neutrality and Title II, like my employer Public Knowledge, have been somewhat more upbeat.

So what does all this mean for the litigation and the ongoing machinations in Congress around net neutrality? Short version — the court was not impressed with the arguments of the carriers that the FCC was so whacky crazy power-usurping unlawful that this case is the slam-dunk reversal the carriers and their cheerleaders keep saying it is. Mind you, that doesn’t mean the FCC will win. But it does mean that opponents of net neutrality and Title II might want to ratchet back the TOTAL CONFIDENCE OF VICTORY they have exuded until now just a wee bit. It also provides a psychological lift to the pro-net neutrality side that the FCC can win this even in the D.C. Circuit.

On the political side, Republicans had hoped that a stay would push Democrats to the bargaining table to avoid the litigation risk. Because the FCC’s odds improve with the denial of the stay, this may have the opposite effect, with Democrats more likely to wait for a court decision rather than try to strike a deal. This could prompt Republicans either to sweeten their offer or to double down on efforts for total repeal.

I provide the longer version below . . .

Continue reading

Inventing the Future:
Two Great Summaries, and “Where do ideas live?”

Philip gave a terrific quick demo and future roadmap at MIT Technology Review’s conference last week. See the video at their EmTech Digital site.

Today we put up a progress report (with even more short videos!) about the accomplishments of the past half year. Check it out.

I wonder if we’ll find it useful to have such material in-world. Of course, the material referenced above is intended for people who are not yet participating in our open Alpha, so requiring a download to view is a non-starter.  But what about discussion and milestone-artifacts as we proceed? At High Fidelity we all run a feed that shows us some of what is discussed on thar interwebs, and there are various old-school IRC and other discussions. It’s great to jump on, but it kind of sucks to have an engaging media-rich discussion with someone in realtime via Twitter. Or Facebook. OR ANYTHING ELSE in popular use today.

William Gibson said that Cyberspace is the place where a phone call takes place. I have always viewed virtual worlds as a meta-medium in which people could come together, introduce any other media they want, arrange it, alter it, and discuss it. Like any good museum on a subject, each virtual forum would be a dynamic place not only for individuals to view what others had collected there, but to discuss and share in the viewing. The WWW allows for all of this, but it doesn’t combine it in a way that lets you do it all at once. Years ago I made this video about how we were then using Qwaq/Croquet forums in this way. It worked well for the big enterprises we were selling to, but they weren’t distractible consumers. High Fidelity could be developed to do this, but should we? When virtual worlds are ubiquitous, I’m certain that they’ll be used for this purpose as well as other uses. But I’m not sure whether this capability is powerful enough to be the thing that makes them ubiquitous. Thoughts?