Wednesday, November 19, 2008

FCC Releases White Spaces Order

The FCC voted in favor of the white spaces order on November 4, but it is very common for the Commission to finish hashing out the details of orders after approving them (odd, but true). The FCC has finally released the full text of the order, which includes a 100 mW power limit for personal/portable devices (40 mW on adjacent channels) and 4 W for fixed devices. I believe this all means good things for white space devices (those were the numbers many advocates were suggesting), but I'll update this post as more coverage emerges.

Friday, November 14, 2008

Congrats Susan and Kevin

Two communications policy experts I tremendously admire were just named heads of Obama's FCC transition review team. Susan Crawford and Kevin Werbach will "ensure that senior appointees have the information necessary to complete the confirmation process, lead their departments, and begin implementing signature policy initiatives immediately after they are sworn in."

Susan was a member of my thesis committee, and Kevin has been tremendously influential in my thinking. I've never told Kevin, but in addition to informing my writing as recently as this week, I learned how to make web sites in 1995 using his Bare Bones Guide to HTML.

The Commission could not be under better transitional guidance.

Tim Lee's Reasonable Retorts

Tim Lee has posted two responses to my critique of his recent article on network neutrality. In his article, he argued that network neutrality is good but that violations of the principle are unlikely to be severe for economic and technical reasons. I find this argument unpersuasive, and he responded with four basic retorts:


1. Net Neutrality proponents don't clearly state what they are seeking to prevent, and thus evade any attempt to disprove their harms.

I’ve found that any time I take one of these ISP strategies seriously and put forth an argument about why it’s unlikely to be feasible or profitable, the response from supporters of regulation is often to concede that the particular scenario I’ve chosen is not realistic...

Lee goes on to list several scenarios, all of which are possible to varying degrees. However, they all fit the simple rubric of network discrimination, and they are all harmful. In general, subtle discrimination is more likely than outright blocking. This is something that has been clearly articulated by the mainstream of neutrality proponents for some time, and it is why I included a reference to Barbara van Schewick's paper. If Tim were choosing to "take one of these ISP strategies seriously," he would have done well to focus on the one that most people are talking about.


2. This type of discrimination is unlikely, isn't that bad, and we can always fix it after the fact.

First, notice that the kind of discrimination he’s describing here is much more modest than the scenarios commonly described by network neutrality activists. Under the scenario he’s describing, all current Internet applications will continue to work for the foreseeable future, and any new Internet applications that can work with current levels of bandwidth will work just fine. If this is how things are going to play out, we’ll have plenty of time to debate what to do about it after the fact.

I am describing a mainstream version of discrimination, which can happen either right now or going forward as operators upgrade their networks but keep non-payers in the slow lane. We have ample evidence of the former in Comcast/BitTorrent. The latter is simply a less visible version of the former -- an even further degree removed from Lee's scenario in which consumers have "a taste of freedom," "become acutely aware of any new restrictions," and "stubbornly refuse efforts to impose them." The fact that carriers are building out faster networks doesn't tell us whether or not this is likely. Carriers will of course build out faster networks, because they typically profit more from them (whether they impose discrimination or not). Given the current uncertain regulatory climate, it is no surprise that they have refrained from additional large-scale discrimination. This climate, however, is temporary. The relevant question is whether or not those network upgrades provide an additional shield from the customer backlash that Lee posits. It is clear that they do.

How bad you think this discrimination is depends on how seriously you take arguments about platform economies, dynamic innovation, network effects, and freedom of speech. It also depends on whether or not you think that degrading service achieves most of the ends of outright blocking. I argue that it does. Google's obsession with page load times is not simply because they are hyper-focused engineers. Skype's need for equal network treatment is not just because they want calls to sound nice. The BitTorrent protocol's expectation that connections are not randomly reset is not a matter of convenience.

Lee would have us believe that we will always have the space to regulate these issues, if needed, after the fact. The Comcast order might give us some hope in this regard, except for the tremendous murkiness that surrounds the decision, its implications, and its legal durability. Regulation from the FCC can be roughly thought to fall into two categories: rulemaking and adjudication. Rulemaking explicitly sets out the detailed requirements, whereas adjudication defines basic guidelines and then builds policy through case-by-case enforcement. Lee clearly opposes rulemaking on its face. We are left with adjudication, but in this case he opposes further definition of enforceable principles. This is not ex post regulation, it is no regulation at all.


3. The risks are overblown, and disproved by history.

It’s worth remembering that alarmism about the future of the Web is almost as old as the Web itself.

Lee is not a fan of Lessig's "apocalyptic" predictions in 1999. While Lessig's forecasts undoubtedly have not fully come true ("yet" -- as he notes in the preface to the new edition), we have unquestionably seen some of those trends play out. Increasing control by intermediaries, domestically and abroad, threatens speech and innovation. The "open access" battle that was heating up at that time was not lost until 2005, and since then we have case studies for how the stopgap quasi-neutrality principles are strained. But I'm not here to defend Lessig (I certainly disagree strongly with him at times).

Rather than debating generally whether past predictions of others have come true, it is more productive to examine the specific issues at hand with the most relevant data points from history and the present. We know that historically corporations tended toward building closed systems like AOL and CompuServe. We know that well-crafted regulatory interventions like common carrier non-discrimination, Computer II, and Carterfone unleashed waves of innovation. We know that carriers today have pursued discriminatory practices and been partially disciplined by somewhat ambiguous regulation. We know that abroad, discriminatory practices have flourished in environments in which intermediaries exercise the most control. We know that domestically in the parallel (and increasingly overlapping) wireless market, market actors impose restrictions that radically limit innovation.

This is not a strong historical or factual case against the need for, or success of, non-discrimination regulation.


4. Steve misunderstands settlement-free peering.

“Settlement-free” means that no money exchanges hands. If D and E are peers [this example assumes that D is a "last mile" backbone provider like Verizon and E and F are competitive "tier 1" providers such as Level 3 or Global Crossing], that by definition means that E pays D nothing to carry its traffic, and vice versa.

This technical/wonky definition is at the heart of what I consider Lee's most original, but nevertheless misguided, argument. The basic idea he posits is that because a certain set of backbone providers traditionally negotiate no-fee interconnection agreements, there is no ability for last-mile providers to leverage their power in the consumer market into the backbone market.

Let's go back and define a couple of key terms. First, "settlement-free peering" means, as Lee accurately describes, an arrangement between two providers in which they do not exchange money but simply agree to carry each other's traffic. They do so under detailed and confidential interconnection agreements that define the terms of the relationship, including things like jitter, latency, and throughput. These agreements often require equal treatment by both parties (although they may not speak to those providers' relationships with other providers). Let's assume for the sake of argument that they always do require equal treatment between the two. The types of companies that have these agreements are "Tier 1" backbone providers at the core of the internet -- Level 3, Sprint, AT&T, etc.

Second, "transit" agreements are contractual relationships between unequals. In this case, one party typically pays the other for carrying its traffic under various terms. This is the type of relationship that Comcast has with the Tier 1 providers. For example, here is an excerpt of the traceroute from my Comcast cable modem to google.com:

7 pos-0-3-0-0-cr01.chicago.il.ibone.comcast.net (68.86.90.57)
8 xe-10-1-0.edge1.newyork2.level3.net (4.78.169.45)
9 ae-2-79.edge1.newyork1.level3.net (4.68.16.78)
10 google-inc.edge1.newyork1.level3.net (4.71.172.86)

See that? My packets go from Comcast -> L3 -> Google. Comcast pays Level 3 to transmit its packets, according to confidential terms that it has agreed to. Comcast has a rather large national network (although it is not a "Tier 1" provider) and thus can route its packets to the locations where it has the best bargaining power with the party at the exchange point (in this case, it sent my packets from Boston to Chicago before handing them to L3). Lee's theory is that the settlement-free peering agreements probably don't allow discrimination based on content or source, and he seems to assume that downstream transit agreements are implicated in this obligation because at some point they must interconnect with those backbone providers. Furthermore, he claims that both parties need each other enough that nobody would ever violate these principles.
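The hostname-based handoff in that trace can even be inspected programmatically. Here is a minimal Python sketch (the parsing regex and the assumption that the second-level domain identifies the operating network are my own simplifications, not a robust tool) that maps each hop to the network carrying it:

```python
import re

# Hop lines from the traceroute excerpt above
HOPS = """\
7 pos-0-3-0-0-cr01.chicago.il.ibone.comcast.net (68.86.90.57)
8 xe-10-1-0.edge1.newyork2.level3.net (4.78.169.45)
9 ae-2-79.edge1.newyork1.level3.net (4.68.16.78)
10 google-inc.edge1.newyork1.level3.net (4.71.172.86)
"""

# hop number, router hostname, IP address in parentheses
HOP_RE = re.compile(r"^\s*(\d+)\s+(\S+)\s+\(([\d.]+)\)")

def operator_of(hostname: str) -> str:
    """Guess the operating network from the second-level domain."""
    parts = hostname.rstrip(".").split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else hostname

def hop_path(raw: str) -> list[tuple[int, str]]:
    """Parse traceroute-style lines into (hop number, operator) pairs."""
    path = []
    for line in raw.splitlines():
        m = HOP_RE.match(line)
        if m:
            path.append((int(m.group(1)), operator_of(m.group(2))))
    return path

for hop, network in hop_path(HOPS):
    print(hop, network)
```

Note that the final hop resolves to a level3.net hostname even though it is Google's interconnect -- the "google-inc" label in the router name, not the domain, reveals the customer. That is one reason a simple domain heuristic only approximates the business relationships underneath.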

In my initial critique, I gave several reasons to doubt this claim. First, there is no practical evidence that Tier 1 providers have pressured their downstream transit customers to remain non-discriminatory. This has not been a factor in the discrimination disputes we have seen to date, like Comcast/BitTorrent or Madison River (instead, regulatory threats have brought players in line). Second, there is ample reason to believe that Tier 1 providers would indeed be willing to de-peer despite Lee's assertion that they simply need each other too much (thus I cited the Cogent/L3 dispute as well as the Cogent/Sprint de-peering from a couple of weeks ago). Third, the universe of settlement-free peering is increasingly giving way to varieties of transit agreements in which concessions are made in exchange for payment. Fourth, there are now emerging unified backbone/last-mile networks for which much of the traffic need not pass through a Tier 1 exchange point at all (e.g., Verizon/MCI/UUNET). Settlement-free peering has been a powerful norm in keeping content- or source-based discrimination out of the core of the network, but even there its strength is waning.
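Because those interconnection agreements boil down to measurable service metrics, one can sketch what checking "equal treatment" terms might look like in code. Everything below -- the field names, the thresholds, the sample measurements -- is a hypothetical illustration, not drawn from any real contract:

```python
# Toy sketch of checking measured metrics against the kinds of terms
# (jitter, latency, throughput) that interconnection agreements define.
# All thresholds and names here are hypothetical illustrations.

AGREEMENT_TERMS = {
    "latency_ms_max": 80.0,        # hypothetical one-way latency ceiling
    "jitter_ms_max": 10.0,         # hypothetical variation-in-latency ceiling
    "throughput_mbps_min": 950.0,  # hypothetical minimum sustained throughput
}

def check_compliance(measured: dict) -> list[str]:
    """Return a list of violated terms (an empty list means compliant)."""
    violations = []
    if measured["latency_ms"] > AGREEMENT_TERMS["latency_ms_max"]:
        violations.append("latency")
    if measured["jitter_ms"] > AGREEMENT_TERMS["jitter_ms_max"]:
        violations.append("jitter")
    if measured["throughput_mbps"] < AGREEMENT_TERMS["throughput_mbps_min"]:
        violations.append("throughput")
    return violations

sample = {"latency_ms": 72.4, "jitter_ms": 14.1, "throughput_mbps": 990.0}
print(check_compliance(sample))
```

The point of the sketch is only that such terms are verifiable between the two signatories; nothing in them necessarily constrains how either party treats traffic from networks outside the agreement, which is where the leverage discussed above comes in.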

Wednesday, November 12, 2008

Tim Lee's Twin Fallacies

[Edit: Tim replies. I reply.]

Cato has finally gotten around to publishing Tim Lee's article, "The Durable Internet: Preserving Network Neutrality without Regulation." I first saw a draft of his paper in March, and Tim engaged in a good spirited back-and-forth with me over email. The primary failings that I perceived then remain un-addressed in this final version. They are twofold:


1. The fallacy that any non-discrimination regulation is the same as the combined force of all misguided regulation since the advent of administrative agencies

The first problem with Lee's article is that it repeats one of the most common mistakes of certain libertarian sects: assuming that any government regulation is as bad as all government regulation. In Lee's case, the devilish regulation equated with network neutrality is the Interstate Commerce Act, the Civil Aeronautics Board, and the sum of all Federal Communications Commission regulation. This approach mirrors earlier claims by Bruce Owen, Larry Downes, and Adam Thierer, which I rebut here.

Lee begins by observing that "The language of the Interstate Commerce Act was strikingly similar to the network neutrality language being considered today." We should not be surprised that at least some of the non-discriminatory principles found in modern-day neutrality proposals resemble those in the ICA. Indeed, net neutrality is inspired in part by elements of common carriage, which cross-pollinated into communications law in the 1910 Mann-Elkins Act (see pp. 21-23 of my thesis for more on this history). The gating question is whether the elements of the Interstate Commerce Commission that led to the inefficiencies Lee describes are related at all to the non-discriminatory language that he claims connects the two. If and only if the answer is "yes," then a responsible analysis would consider whether or not the markets are relatively analogous, whether or not the administrative agencies tend toward the same failures, and whether the costs of regulation truly outweigh the benefits. In short, it is not enough to simply assert that net neutrality smells like the ICA and is therefore doomed to fail.

I won't discuss the relationship to the Civil Aeronautics Board because I think the analogies are tenuous at best.

Finally, we arrive at the FCC discussion, which holds the most promise for actually being relevant. Unlike Bruce Owen, who inexplicably compares neutrality proposals to the AT&T antitrust proceedings, Lee seeks to equate neutrality with FCC rate-subsidization and market entry prohibitions. He concludes that, "like the ICC and the CAB, the FCC protected a client industry from the vagaries of markets and competition." Perhaps, but why is this similar to non-discrimination regulation?

A more accurate analogy with FCC rulemaking would be to compare neutrality to the non-discriminatory part of common carriage, the Computer Inquiries, Carterfone, or all three. Most scholars recognize that these rules allowed the discrimination-free operation of dial-up ISPs and facilitated the explosion of the internet. The case of FCC non-discrimination mandates presents a stark counter-example to Lee's assertion of uniform regulatory failure.


2. The fallacy that there is an underlying "durability" of the technology/market structures of the internet that will successfully resist strong carrier incentives

Lee provides a somewhat novel argument when he claims that the internet has built in safeguards against welfare-harming practices like network discrimination. He begins by praising the effects of the "end-to-end" architecture of the internet, in which carriers simply deliver data and allow the "edges" of the network to determine what is sent and how. He thinks that this characteristic does not need to be backed up by regulators because the technology and the market will preserve it.

With respect to markets, his argument is twofold. First he claims that outright "blocking" of services would cause such backlash (from end-users or from content providers) that it would be untenable. Second, he claims that attempts to simply degrade service would not be terribly destructive in the short term, and would provide ample time to craft a regulatory response if necessary.

Lee justifies his customer backlash theory by pointing to cases such as the Verizon/NARAL dispute, in which the company initially refused to give the non-profit an SMS "short code" but relented in the face of public outcry. In reality, the outcry came from inside-the-beltway advocates who threatened regulation, but in any event we have a more relevant example in the case of Comcast/BitTorrent, which he also discusses. The regulatory solution in this case is even more obvious, with the FCC ultimately issuing an order against the company (which is now on appeal). There is no evidence whatsoever that these resolutions were driven by users who have "had a taste of freedom," "become acutely aware of any new restrictions," and "stubbornly refuse efforts to impose them" -- resisting via technical or financial means. Nor is there evidence that, left alone, the markets would have settled on a non-discriminatory solution.

Lee tries to make the case that the technical structure of the internet would have allowed BitTorrent users to simply adopt better ways of hiding their traffic, and that they would have prevailed in that cat-and-mouse game. This is of course speculation, but it's also irrelevant. Whether or not highly technically savvy users can temporarily evade discrimination has little to do with how such practices would affect the activities of the majority of the population. In fact, we have strong examples to the contrary worldwide, as various regimes develop more and more sophisticated means for filtering their citizens' speech (such as the news today from Argentina). In those situations, there are often many people who can subvert the filters, but the practice nevertheless fundamentally alters the nature of what is said and what innovations flourish (see, for example, the rollout and adoption of Google vs. Baidu in China).

Lee also lays out an argument for why the structure of the network itself makes it unlikely that last-mile carriers can successfully threaten blocking. He argues that because the core of the internet is highly interconnected, it would be practically impossible to discriminate against any particular site, and that those sites which are important enough to pay attention to could in turn threaten to stop serving customers from that carrier. In short, they need each other. In many cases this is true, although it doesn't necessarily mean that in all cases this relationship will be more attractive to the last-mile provider when compared to various exclusive relationships (or that even if it is, the provider will behave rationally). Things get even more dicey when we examine them from the perspective of second-tier sites or services, which have not yet achieved the "must have" status but nevertheless present revenue opportunities or competitive risk to the carriers.

Lee claims that even if this occurred, it would not be a real problem because it wouldn't be severe. "To be sure, such discrimination would be a headache for these firms, but a relatively small chance of being cut off from a minority of residential customers is unlikely to rank very high on an entrepreneur’s list of worries." His assumption that the chance of being cut off is "small" is belied by recent experience in the Comcast/BitTorrent case. The idea that one would be cut off only from a "minority of residential customers" is technically true because no one firm currently controls over 50% of residential connections, but there are some truly significant market shares that entrepreneurs would undoubtedly care about. Last-mile providers have a duopoly over their subscribers, and a "terminating access" monopoly over current subscribers.

These problems are all made much more severe in an environment in which carriers practice partial discrimination rather than outright blocking. In our email back-and-forth, I told Lee that:

The notion that "D can't degrade them all, because that would make D's Internet service completely useless" does not hold when you assume that D maintains a baseline level of connectivity (perhaps even at current levels of service) but only offers enhanced delivery to services/sites that pay up. Consumers don't see any change, but the process of network-wide innovation gives way to source/application-based tiering. Imagine this starting in the era of dialup (you'd have to imagine away the last-mile common carrier safeguards in that scenario). Today I'd only get web-based video from ABC, Disney, etc.

The last-mile carrier "D" need not block site "A" or start charging everyone extra to access it; it need only degrade (or maintain current) quality of service to nascent A (read: Skype, YouTube, BitTorrent) to the point that it is less usable. This is neither a new limitation (from the consumer's perspective) nor an explicit fee. If a user suddenly lost all access to 90% of the internet, the last-mile carrier could not keep that user's business (or at least could not hold its price). But discrimination won't look like that. It will come in the form of improving video services for providers who pay. It will come in the form of slightly lower-quality Skyping, which feels ever worse as compared to CarrierCrystalClearIP. It will come in the form of [Insert New Application] that I never find out about because it couldn't function on the non-toll internet and the innovators couldn't pay up or were seen as competitors. As Barbara van Schewick observes, carriers have both the incentive and the ability to discriminate in this fashion.

Finally, Lee makes the argument that the current norm of "settlement-free" peering in the backbone of the internet will restrict last-mile providers' ability to discriminate and to create a two-tiered internet because they will be bound by the equal treatment terms of the agreements. This is not supported by practical evidence, given the fact that none of the push-back against existing discriminatory practices has come from network peers. It is also not supported by sound economic reasoning. It is certainly not in backbone-provider E's business interest to raise prices for all of its customers (an inevitable result). But, assuming E does negotiate for equal terms, the best-case scenario is that E becomes a more expensive "premium" backbone provider by paying monopoly rents to last-mile provider D, while F becomes a "budget" backbone provider by opting out (and hence attracts the "budget" customers).

We are already seeing cracks in the dam of settlement-free peering. The Cogent/L3 meltdown happened between two backbone-only providers in the context of volume-based disagreements. Two weeks ago, Sprint disconnected from Cogent because of a dispute over their peering arrangement. When you add the relatively recent pressures of last-mile leverage and discrimination-based disputes, these dynamics become troubling. Lee is making the case that history is on his side, but he doesn't have much supporting history to draw from. Common carriage prevented last-mile discrimination until 2005. Kevin Werbach, on the other hand, sees major risks from emerging market power, specialized peering, and what he calls possible "Tier 0" arrangements between vertically integrated providers. The Verizon/MCI/UUNET network was only recently unified, creating something close to this type of arrangement.


Conclusion

Tim Lee's article repeats but then goes beyond the standard refrain of no-government-regulation libertarianism. However, his novel arguments for why the internet will take care of itself are not persuasive. Ultimately, we are left with his well-put argument for the benefits of network neutrality, but without any assurances that it will be preserved. Into this vacuum might flow reasonable discussion of how targeted government regulation might be the only means of achieving the ends we both seek.

Wednesday, November 5, 2008

A Good Day for Openness

Nov 4th brought some exciting developments for openness in several different domains.

First, we elected a president dedicated to government transparency and accessibility. I hope that Obama's "Google for Government" bill is a harbinger of things to come in his administration. Making more information freely available and searchable will allow the better functioning of our government.

Second, a slightly more wonky development. The FCC approved unlicensed use of the "white spaces." This is the culmination of a 4+ year-long process, with heavy lobbying in the past year or so. It opens up huge swaths of spectrum, which any citizen or innovator can put to use for things like wireless broadband.

Third, a geeky development. Somebody rooted the G1 -- the first handset based on the open-source Android operating system. Although the operating system itself is open-source, T-Mobile had locked down all of the interesting stuff. Now that it's unlocked, we will likely see a plethora of interesting development on the platform.