Thomas Hazlett on Telecommunications
Nov 24 2008

Thomas Hazlett of George Mason University talks with EconTalk host Russ Roberts about a number of key issues in telecommunications and telecommunication policy including net neutrality, FCC policy, and the state of antitrust. Hazlett argues for an emergent, Hayekian approach to policy toward the internet rather than trying to design it from the top down and for an increased use of exchangeable property rights in allocating spectrum.


READER COMMENTS

Martin Brock
Nov 24 2008 at 12:26pm

The Microsoft story omits some relevant details. IBM presumably designed the IBM-PC largely from off-the-shelf components and outsourced development of the operating system (OS) largely because of the ever-looming antitrust action. Had the PC hardware or OS been designed and developed otherwise, there might never have been any “PC clone” market, as there was never a viable Apple clone market, so the antitrust suit presumably did bear fruit, and Microsoft is one of its fruits.

The IBM-PC architecture essentially became an industry standard for personal computing hardware requiring little licensing from IBM. This standard hardware architecture, and freedom to adopt the standard without a license, made a standard operating system for this hardware far more valuable, and since Microsoft’s OS was delivered with the hardware, it had a practically insurmountable advantage early in the game.

On the other hand, my first personal computer was a Zenith Z-100 that ran both Microsoft DOS (which was then distinct from IBM-PC DOS, also produced by Microsoft) and another operating system called CP/M (Control Program for Microcomputers). CP/M preceded Microsoft DOS and was more popular among early microcomputing geeks. The Z-100 was not a “PC clone”, because the hardware was very different. This computer still rests peacefully in my attic, so I remember the early competition for both PC hardware and software standards well.

Furthermore, Microsoft strove for a very open application development standard early in the game, largely because competition for the operating system did exist. Microsoft could easily have made application development very difficult, requiring a costly relationship with Microsoft to develop applications for the OS, but Microsoft didn’t pursue this strategy. If it had, we might be using some OS from Digital Research today.

Lincoln McLain
Nov 24 2008 at 3:02pm

huh… Gotta love new frontiers of growth. This reminds me of the 49ers. Not the football team, the actual gold-seeking, rough ‘n tough bearded men who charged out west to stake a claim. Essentially a website is a staked claim.
The 49ers eventually found out that land out west was something closer to a zero-sum game than they first thought. Is this true for the internet too? Is there a continuum that will be reached and act as a natural constraint to growth, therefore supplying the needed parameter to drive up prices for space online? It seems like the web providers are trying to synthesize this type of constraint without the natural parameters to support such a motion. I’m inclined to think that competition will push the margin and allow for entrepreneurs to step in and overtake a company that restricts access to certain websites. I guess this leads to another question… Is the web public property? As a taxpayer, do I have the right to go where I want to go on the web? This paradigm would treat the web providers as a means of transportation to and from the “web” location, which is public domain.
So much cool new stuff to talk about. I love it. Thanks Russ and Tom.

NormD
Nov 24 2008 at 3:07pm

Economists like Hazlett always miss the incredible value of unlicensed radio bands that no one owns. Think of all the RF devices that you use day-to-day that would be impossible if someone “owned” the spectrum: car door openers, baby monitors, WiFi, home automation, radio-controlled toys, garage door openers, wireless home security systems, wireless phones (not cell phones), remote AV equipment. It goes on and on. I would guess that far more entrepreneurs and inventors are working in the unlicensed bands than in any licensed band, which can cost billions to acquire.

I wonder how much spectrum that is owned is going unused. I suspect quite a bit.

Russ Bankson
Nov 24 2008 at 5:38pm

A suggestion: when providing references to articles, it would be useful to cite a source that does not require special affiliation or fees (as, e.g., JSTOR does).

Here is an alternative source for the Coase article: http://www.sfu.ca/~allen/CoaseJLE1960.pdf (Google search)

Russ Roberts
Nov 24 2008 at 9:12pm

Russ Bankson,

For better or worse, we only link to sources that respect copyright. As fate would have it, many fine articles are under copyright protection. If you are affiliated with a university, you may have access to JSTOR. If you do not have access to JSTOR, feel free to surf the web.

Ajay
Nov 25 2008 at 1:30am

A problem with this podcast is that Hazlett is so steeped in this stuff that he never bothers to give layman’s definitions. As a result, I think most of it will go flying over many people’s heads. For example, he never properly explains what it means that Google owns its own physical transportation networks. Comparing the network to its closest analog in real life, the highway system, Google owns some major highways of its own and can send trucks with its search results faster, because they aren’t limited to the commonly used highways. Most software companies just buy access to highways from AT&T and Verizon and send their trucks using these occasionally congested roads.

I think Russ is wrong about Apple being successful with their closed model. It seems to be Steve Jobs’s fate to continually build closed systems that may work initially when seeding new technology, because they’re more fully integrated, but that are eventually overrun by open systems. Hazlett is clearly wrong about Apple computers being more immune to malware. Most serious observers know that they’re equally vulnerable to malware, but the malware authors don’t bother trying to infect the niche Apple platform. Just like application developers, they aim for the much bigger Windows market.

As for antitrust law, current oversight seems to be an artifact of a century ago, when behemoth trusts could corrupt government and overrun the economy. Today, when no company’s revenues are more than 3-4% of GDP and capital flows freely, antitrust law seems unnecessary except for the extreme possibility of giant vertically-integrated mergers, say a GE/Shell/Microsoft/JPMorgan Chase. As for cable monopolies today, there are basically no economies of scale after the recent digital transition. It is a running joke that the telephone companies had to open up their monopoly networks to competitors in 1996 (although that move was botched and has now been slowly clawed back) but the cable companies don’t have to open up, which was ratified by the Supreme Court in its dumb Brand X decision, which Scalia rightfully lambasted.

LowcountryJoe
Nov 25 2008 at 9:21am

Ajay, while the word “truck” certainly fits the analogy of a network’s highway better [and maybe this word has emerged as the better word in its recent usage], I think that it’s called a “trunk”. At least it used to be called a trunk when I worked in the telecomm/datacomm field 10 years ago.

Richard Sprague
Nov 25 2008 at 10:06am

I disagree with Martin Brock that the anti-trust suit against IBM had anything to do with Microsoft’s success. IBM was not forbidden to develop its own PC operating system, but contracted it to Microsoft purely for time-to-market reasons. Other mainframe makers at the time (e.g. HP) introduced microcomputers based on proprietary architectures that never took off because of classic innovator’s dilemma reasons.

In hindsight it’s easy to look back and see that OS decisions were important, but at the time it wasn’t obvious, since there was innovation on many dimensions and IBM was far behind.

Adam
Nov 25 2008 at 10:15am

One of the big arguments about Net Neutrality that I didn’t hear Hazlett address was the notion that technology today has evolved to the point where an internet service provider could conceivably allow some websites to be accessed more quickly (allot them more bandwidth). The NN folks are afraid this would mean the big guys would pay for a lot of bandwidth, stifling the possibility for any little guys to come in and challenge the players already entrenched in the market.

I address this argument specifically here.

A must read also is “Net Neutrality: What’s a Libertarian to do?” By Eric S. Raymond.

Jonathan Nation
Nov 25 2008 at 10:26am

I have loved every episode of EconTalk that I have listened to; even the happiness one that made my head spin and hurt was enjoyable.

I question the guest’s knowledge of the current “net neutrality” debate.

In the interview, he cited as an example of non-net neutrality Google securing the contract to be the search engine on the AOL homepage. A non-net-neutrality event would be AOL preventing their subscribers (the people who use AOL dial-up or AOL Broadband) from using any other search engine, or charging Yahoo/MSN/etc. to be able to access their subscribers.

The story he told was basically about a different part of AOL (the program/webpage side) having a contract with Google for the search box.

Now, it could be he just did not explain it very well, but this makes me believe he either does not understand why people want net neutrality, or he is just a partisan who is throwing mud to help his cause.

I can also say that through this episode my thinking on Net Neutrality changed, from the automatic “must have” to “a second way to deal with the same issues”.

Instead of forcing the idea of net neutrality as law, what about a separate law:

– each company must set its own rules for how it acts and make them public to its customers.

Basically, transparency. This lets the buyers or customers decide what is really important to them. If we trust the people, and trust the market, then this is a better option. Just as T-Mobile’s “pick 5” (pick 5 people whom you can call all you want) is a marketing item, you could have ISPs that say “we are net neutral, go wherever you want, not just to those who pay us”.

The other issue I had was his claim that there are 4 competitors with cable.
In my area there are at most 2: the traditional cable company and satellite. I hope for a phone-based option soon, but that has not been built out yet. At that point we will have 3 on paper, but satellite is so inferior in our area that it is not a valid option, so we will really have 2, and we have only 1 right now.

Barry Kelly
Nov 25 2008 at 1:45pm

Russ,

I listen to your podcast every time it comes out, and most of the time, I come away knowing a little more about the topics at hand than I did going in. I’m aware that you’re very susceptible to confirmation bias, though. It’s actually a little creepy, like listening to a member of a cult. In any case, most of the time, I can get past that and learn something.

This show, however, relates to something I know about. I’m a software engineer by profession, and as a result, I’m acutely disappointed and not a little horrified at the untruths, omissions and laughable misconceptions supposedly held, but certainly promulgated, by the show’s guest.

I’ll start somewhere near the beginning.

Hazlett asserts that the net-neutrality promoters say that the Internet was “created according to particular blueprints”; he goes on to say that in fact, the Internet evolved.

The problem with this straw-man position and its untrue knockdown lies in using two different meanings for the word “Internet”, commonly distinguished in informed circles by talking about either an internet (small “i”) or the Internet, and in particular, the web.

An internet is a larger network formed by connecting two or more heterogeneous networks. To do this, you need something called a protocol: a language that both networks can speak, defined in a standard. This is *not* something that evolves. It has to be designed. Independent networks, left to their own devices, are never going to magically start intercommunicating. In any case, the most common internetworking protocol is called Internet Protocol, usually shortened to IP.

IP is an unreliable, connectionless, packet-switched protocol. It only makes a “best-effort” attempt to deliver packets to their final destination. It’s a very thin wrapper over the underlying wire network protocol. Many applications, however, need more reliability, so they normally use a different protocol: Transmission Control Protocol, aka TCP. TCP is built on top of IP, and is a reliable virtual-circuit protocol. It makes sure packets from IP are put in the right order, and that missing packets are accounted for and resent. Hence we have TCP/IP (“TCP over IP”).

There is another common protocol, called UDP (User Datagram Protocol), which is basically the raw IP but with a few extra bits of information to permit more than one application to multiplex over the same underlying connection (UDP ports, in other words).
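To make the layering concrete, here is a minimal sketch (my addition, not part of the original comment) of how an application touches these two transport protocols through the ordinary sockets API; the addresses are placeholders:

    import socket

    # TCP: connection-oriented and reliable. The OS reorders packets and
    # retransmits lost ones before the application ever sees the data.
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.connect(("example.com", 80))
    tcp.sendall(b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n")
    print(tcp.recv(1024))  # bytes arrive intact and in order, or the call fails
    tcp.close()

    # UDP: connectionless and unreliable -- raw IP delivery plus a port
    # number so several applications can multiplex over one host address.
    udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    udp.sendto(b"hello", ("192.0.2.1", 9))  # best effort; may never arrive
    udp.close()

Note that nothing in either call names an application: the same sockets carry the web, email, or VoIP.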

This (IP with TCP&UDP) is the thing that is designed. The interesting thing about its design is that it doesn’t have any affordances for any particular applications: there’s nothing in it that makes HTTP (i.e. the web) work, nothing about VoIP, nothing about multiplayer games, or instant messaging, or any applications at all.

In other words, it is *neutral* with respect to the application being run over the network.

This *neutrality* is what permits *evolution*: because the network hasn’t had applications designed into it, *new* applications can make use of the existing network infrastructure and put it to new purposes.

This is how the web came about. The Internet has been around for far longer than the web; there was email, and newsgroups, and Gopher servers, etc., before there was ever HTTP and HTML, the things that make the web work. And it’s this that makes Hazlett’s counterargument actually false: he’s clearly using the word “Internet” to stand in for things like the web, because that’s what the Internet *means* to the *layman*. And that’s also why it’s so despicably misleading.

Just because A is designed and B has demonstrably evolved, does not prove that A has also evolved, even if the same word is used for A and B.

Now, to put network neutrality advocates’ arguments in a nutshell, what they want is to maintain this existing protocol neutrality, so that Internet applications can continue to evolve.

Here’s an example fear that some NN advocates have: telecommunications companies specialize in voice communication and have for over a century; thus, they have good reason *and* business competence *not* to want VoIP to succeed, in particular, VoIP provided by third parties. Now how could they do this? Well, most voice communication providers are also Internet service providers these days. All they need do in order to break the network’s neutrality is to de-prioritize third-party VoIP packets and prioritize their own VoIP packets (if any).

Because IP is an *unreliable* protocol, discarding e.g. 10% of VoIP-classified packets would be “OK”, as in still being a valid implementation of IP, but it would result in stutters and distortion in the third-party’s voice connection, depending on how well their protocol is implemented. (And if it’s still too good, well, we can turn up that packet-loss dial a little higher.) Without either net neutrality, or another mechanism such as enforced transparency (as Jonathan writes), customers are only going to think that the third-party’s product just doesn’t “work right”.
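To see why a 10% drop rate is so damaging, here is a toy simulation (my illustration, not from the comment; the numbers are arbitrary):

    import random

    def lossy_link(frames, drop_rate):
        # Silently discarding packets is still a "valid" implementation
        # of IP's best-effort contract.
        return [f for f in frames if random.random() >= drop_rate]

    frames = list(range(1000))            # 1,000 numbered audio frames
    survivors = lossy_link(frames, 0.10)  # the "packet-loss dial" at 10%
    lost = len(frames) - len(survivors)
    print(f"{lost} of {len(frames)} frames dropped -> stutter and distortion")

A customer sees only the stutter, never the dial.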

Back to the show. Hazlett then goes off on a non-sequitur about Google buying default placement from AOL. This is about defaults, though, and it has nothing to do with network neutrality. Network bias (as opposed to neutrality) would be having random timeouts, or very slow loading, or plain inaccessibility for http://www.google.com but not e.g. http://www.myisp.com/search.

Later on, Hazlett talks about the MSFT anti-trust case. However, he has either deliberately omitted, or doesn’t understand, the underlying dynamics at play in the web browser market.

In so far as the web works at all, it works because web sites (hosted on servers) and web browsers (clients) have an agreement on how to transfer and display and navigate between web pages on the client screen. The transfer protocol is HTTP; the display standard is HTML(+CSS+…); the navigation standard is URLs.

The fact that a common standard is required means two things: in order for the standard to evolve, both sides must move forward in tandem; and secondly, if the standards become prohibitively expensive to implement from scratch (usually because of complexity, but could be patents, etc.), the entry barrier will protect the dominant players from competition. If there’s only one dominant player, they’ll have a monopoly.

Microsoft used their existing OS monopoly to try and control all the client software, and they then extended their browser with ActiveX, DHTML, VBScript, XMLHTTPObject, and various other non-standard (at the time) extensions in the hopes that web developers would take advantage of them, and thus force the de facto standard to be more complex and tied to Microsoft’s browser.

And it worked, to a certain degree. It forced Netscape effectively out of the browser business, as a business. However, Netscape made a final last-gasp step of open-sourcing their browser.

Here comes the other side of this dynamic. Developers aren’t stupid. They’ve known Microsoft’s dominance of the OS for a long time, and they were wary of ceding the new field of the Internet to MS (again). For as long as it was possible to browse the web with alternative browsers, there was still an important and useful chunk of the web that didn’t use much in the way of IE-only capabilities.

Furthermore, because the only way to move the protocol forward is to have buy-in on all sides, progress was slow: web developers didn’t take advantage of many IE capabilities because they couldn’t rely on all clients having the latest version of IE (big downloads were a bigger factor back then).

So no matter how MS innovated and tried to encourage people to develop ActiveX browser plugins (and they were slowed by the antitrust case), they couldn’t turn it into a platform as complex as their OS before the return of Netscape’s browser reborn in open source, Mozilla, and in its most successful packaging, Firefox.

This dynamic, and the relentless growth of the web, has had other effects. The web has indeed become more like a platform, the very thing that MS feared, and it’s not a platform they control, because it has standards and viable competition. It’s precisely *because* people do most of their computer work via the web browser and an office suite that people these days have the luxury of being able to switch to a Mac. Without the open standards-based Internet, MS would still be relatively unthreatened.

Now, we’ve nearly gotten up to the point where I stopped listening to the show – I had to stop and start writing out all the things that were wrong or missing.

I stopped listening after the smug ingroup bias-confirming to-and-fro you pair had about both being Mac users and attributing freedom from malware to *closed* design.

Did either of you ever wonder why Linux doesn’t suffer from malware either? Do you think it’s because Linux has a closed design?

On the basis of this conversation, it seems likely that you aren’t actually aware that Linux doesn’t have a closed design. It is a lot more open than MS Windows.

Windows’ susceptibility to malware comes from two sources: its prevalence (still >95% of all computer OSes worldwide) and its backward compatibility. 32-bit Vista is backward compatible with DOS applications written in the 80s, and 16-bit Windows applications written in the early 90s. DOS and 16-bit Windows, as well as the mixture of 16 & 32 that was Windows 95/98/ME, all had a peculiar nonfeature: they lacked access control support in the operating system.

Thus, applications had free rein on the OS, and could modify files and running applications with freedom. Thus, viruses and malware had an even easier time of it.

By the time of Windows XP, which inherits its access control logic from NT, which in turn is inspired by VAX VMS, a minicomputer OS, there were too many (harmless) applications that assumed they had (relatively) unfettered access to the filesystem, at least.

Vista has changed much of this by requiring user prompts, and it was painful, and Vista’s reputation has suffered in large part because of it.
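To make the access-control point concrete, here is a minimal sketch (my addition; the file path is hypothetical) of what that permission layer looks like to an application on an NT-derived Windows or any Unix:

    from pathlib import Path

    # On an access-controlled OS, an unprivileged process is refused write
    # access to system locations. DOS and Windows 95/98/ME had no such
    # layer, so any program -- or virus -- could overwrite them at will.
    try:
        Path("C:/Windows/System32/example.dll").write_bytes(b"\x00")
    except OSError as e:  # PermissionError on Windows without admin rights
        print("Refused by the OS access-control layer:", e)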

Well, that’s the end of my rant. Hopefully your next conversation will be on a topic that at least one of the participants knows something about.

It’s either that, or it will be about a topic I don’t know much about, and thus I’ll end up hopelessly misinformed.

I do hope *that* doesn’t happen.

Thomas Hazlett
Nov 25 2008 at 3:12pm

Russ invites me to respond to some of the points made in the Comments, and I’m happy to do so.

Martin Brock– I agree with you except for the assertion that the IBM antitrust case was the driving force in moving the once-dominant computer firm to commission the PC operating system from the upstart Microsoft. As Richard Sprague notes, and as many accounts state plainly, the IBM issue was time-to-market. The firm could not internally generate an operating system for the PC market in under a year, and outsourced the task to a smaller, nimbler firm. That is the standard explanation; evidence to the contrary would be of interest.

NormD states: “Economists like Hazlett always miss the incredible value of unlicensed radio bands, that no one owns.” Wrong on both counts. A quick visit to my website would reveal that this economist (who is “very like Hazlett”) has spent considerable time investigating the relative value of alternative property rights regimes in radio spectrum. Try this link: http://mason.gmu.edu/~thazlett/papers.html

What you’ll learn from several of the studies there (other sources cited, of course) is that the unlicensed bands are not assets “that no one owns.” Rather, they are administratively allocated — effectively ‘state property.’ That means that regulators set usage rules. Most important are the technology constraints and power limits that mitigate conflicts. These constraints are hardly free. It explains, for instance, why you buy your mobile phone service from a wireless carrier using licensed spectrum, when there is no legal rule against those same carriers, or their competitors, selling you wireless phone service using unlicensed bands. The alternative to this coordination scheme is to define spectrum ownership rights and to delegate spectrum sharing rules to competitive asset owners in the marketplace. That is basically the approach now pursued in mobile telephony and WiMax bands.

In the empirical work done on this question, the social value delivered by de facto private property rights in spectrum far exceeds that of alternative ownership regimes, including the traditional licensing regime in broadcasting (where, again, regulators define technologies and services) and the unlicensed bands. In a “Big Ideas About Information” Lecture given Nov. 14 at George Mason University, Ofcom (UK) regulator William Webb noted that recent studies done in Great Britain show that liberally licensed spectrum, used by mobile phone networks, generates an estimated fifty times the economic value of unlicensed (“license exempt”) bands. See slides here: http://iep.gmu.edu/

Categorical assertions that most innovation occurs in the unlicensed bands and is pre-empted where exclusive spectrum rights prevail are easily made, but are also easily proven incorrect. The applications hosted in unlicensed bands, such as wifi routers, cordless phones, and remote controls, are short-range devices requiring little in the way of complex network coordination. They also ‘ride’ on wide area networks that are constructed using private property rights to spectrum — wifi connections enable “cordless PCs” just as cordless phones extend the reach of fixed phone nets, but the underlying networks are far more sophisticated, complex, expensive, and organized with exclusive ownership rights.

And one can ask: if the innovation is all in the unlicensed bands, why are all the billion-dollar networks and multi-million-seller “killer apps” in wireless — the “Crackberry,” the iPhone, and now, Google hopes, the G-Phone — all relying on wireless wide area networks organized via licensed spectrum? What are, exactly, the mass market applications that have been developed in unlicensed bands since wifi (1997-98)?

Ajay — says I’m too technical in my jargon, to which I respond: that is a POTS comment in a PANS world. And, then, I apologize.

As for Ajay’s assertions that “there are basically no economies of scale [for cable TV networks] after the recent digital transition,” and that “open access” rules should have been applied to cable systems – despite the Supreme Court’s Brand X decision in 2005 — I could not disagree more. First, there is strong evidence that there are large economies of scale in broadband networks generally. That is seen in U.S. markets and abroad. Second, and more importantly (because the scale issue is not a regulatory constraint or policy matter), “open access” regulation has not improved broadband market performance, nor has it aided consumers. This was the opinion not just of the Supreme Court, but of both the Clinton-era and Bush-era FCCs. In this instance, they are strongly supported by marketplace evidence.

When DSL networks were heavily regulated prior to 1Q2003, meaning that the phone carriers were forced to share their networks with 3rd party providers (like Covad) at favorable wholesale prices, unregulated cable broadband subscriptions outpaced DSL subscriptions nearly two-to-one. When DSL was substantially deregulated in the first quarter of 2003, the growth of DSL took off, relative to cable (and adjusting for other factors). By year-end 2006, there were 10 million more DSL households than the pre-2003 trend predicted (25 million to 15 million), while cable modem growth stayed basically on course. My paper, with Anil Caliskan, on this is published in the June 2008 Review of Network Economics and available to read here: http://mason.gmu.edu/~thazlett/documents/RNE_Hazlett_Caliskan_Final.doc

Adam and Jonathan– On the question of Net Neutrality, the above Hazlett-Caliskan paper on Broadband Regulation offers a detailed discussion of the incentive effects of the policy discussed. In many other articles, and in the Podcast, I referenced some of the ‘non-neutral’ aspects of the Internet which are entirely progressive in encouraging the coordination that brings investment, innovation, and new efficiencies to the market.

In response to the point that certain non-neutralities are not outlawed by NN regulation: the contours of legality are unknown and unknowable until actual rules, which span a wide range in the policy debate, are adopted and then interpreted in actual enforcement actions. One important non-neutrality I’ve cited is the very high degree of vertical coordination in the DoCoMo wireless phone network in Japan, where the ISP (DoCoMo) extracts revenue from content providers. That appears to be what people in the US mean by non-neutral; it is also, by consensus, the most successful mobile data platform in the world.

Another non-neutral Internet structure I often cite is that cable TV operators ‘wall off’ cable modem bandwidth for the exclusive use of their ‘digital phone’ customers, supplying superior (‘carrier class’) voice service to that obtained by customers using cable modem links with Vonage, Skype, or other 3rd-party VoIP vendors. I take this to be a clear violation of the NN regulation espoused by champions of the policy. It is also an evidently pro-competitive efficiency, as it assists an emerging rival to incumbent telephone carriers. Indeed, the build-out by cable operators of just such a fixed-line alternative to ‘plain old telephone service’ provided by the legacy carriers — now available to over 90% of U.S. households — is a gleaming success story in pro-consumer policy. And it rests on what could be illegal under a serious NN regime.

Barry Kelly– The long and fairly hostile rant is premised on the notion that a software engineer understands the Internet, whereas an economist, or at least this economist, does not. Readers get to make their own judgments here. But having dealt with such debates between technical experts and policy experts for better than two decades, my belief is that (a) both sides should try to understand each other, and (b) either one operates in isolation at its own peril.

Kelly dubs the “blueprint” argument as a ‘straw man’ when, in fact, that assertion lies at the heart of the end-to-end defense of NN and is embedded in the application neutrality point he embraces. To the degree that applications are treated neutrally by interconnected networks, that model has evolved spontaneously, not by design. The NN regulation question is then one of economic process and the legal conditions that host the development: should we now impose a particular characterization of that mode of transit by law, or should we allow the ‘layers’ of the Internet — physical layer, logical layer, and applications/content layer — to develop as they have? One can list many ways in which neutrality in application transit has proven unworkable, and where prices have been imposed to create better economic incentives. One can also cite a natural progression away from “neutrality” — as MIT’s David Clark does in his important 2007 paper (with co-authors, found here: http://people.csail.mit.edu/wlehr/Lehr-Papers_files/Clark%20Lehr%20Faratin%20Complexity%20Interconnection%20TPRC%202007.pdf). He notes that the efficient evolution now underway is for “large eyeball networks” like Verizon, AT&T and Comcast to charge the “large application networks” in order to undertake the relatively costly expansion of network capacity out to residential users. This is a fundamentally different development task than faced earlier, when dial-up access was cheaply laid onto legacy phone networks, and requires a different pricing scheme.

That Google pays for default positions on web browsers at AOL, or Opera, or on the new Clearwire mobile phone (a wireless ISP in which Google is a part-owner and so receives its own button on the handset) is a good example of discrimination, and it is buttressed by many others, including the bundling of physical transport (via a proprietary network) with search applications, which Google also provides so as to produce a competitively superior search product. It is important to note that the modular nature observed widely in “the network of networks” is of a piece with other markets. There is no special “end to end” principle in the Internet that solves an economic riddle: how best to regulate business relations between complementary suppliers. Markets quite generally iterate on efficient vertical structures (the NN question as to how transport networks should be allowed to supply, or price access to, complementary services).

As for the Microsoft antitrust case, the DOJ’s theory was not that Netscape was being crushed and that Microsoft was thereby eliminating a competitive browser. It was that the Java language embedded in Netscape would provide a platform for software applications that would “commoditize” Windows. To protect Windows, MSFT fought Netscape Navigator to extinguish Java. In any event, the Government, which won the case, never did see its theory play out. Java never proved the substitute for Windows that was asserted. Although, quite independently of the case, Mac OS, Linux and other Windows rivals are still alive and well.

TWH


Russ Roberts
Nov 26 2008 at 10:35am

I want to thank all the people who have commented so far, especially Tom Hazlett for his detailed response.

I also want to remind commenters to keep it civil. Insulting comments that are devoid of content will be removed.

John Rosendahl
Nov 26 2008 at 1:35pm

Russ,
One of the things I have always enjoyed about your podcast is your willingness to interview people with opposing viewpoints in an exploratory as opposed to hostile manner.

I agree with Mr. Hazlett that “both sides should try to understand each other” and that “either one operates in isolation at its own peril.” But it is not that the technical people who disagree are unwilling to admit that an economist can understand the internet; it is that Mr. Hazlett’s arguments themselves demonstrate a lack of understanding.

For instance, there is a difference between AOL recommending a service and AOL forbidding a service.

Net-Neutrality initiatives are trying to find ways to prevent large incumbent companies from destroying innovation and competition by restricting access to markets. How to accomplish this, and whether it is a good thing to do (or even possible), is up for debate.

I would really like to hear more on these issues and encourage you to conduct an additional interview about the subject. Tim Lee, who wrote this paper, http://cato.org/pub_display.php?pub_id=9775, could be a great interview. As would some of the authors at Ars Technica, which provides relatively non-technical (but technically accurate) coverage of NN issues. This link, http://arstechnica.com/search.ars, provides a list of recent NN-related articles.

Grant
Nov 26 2008 at 2:49pm

As another computer engineer, I partially agree with Barry Kelly: Thomas Hazlett doesn’t display a deep understanding of what NN is, or what the Internet is. I’m not saying he doesn’t understand it – it’s just not displayed in this podcast.

I’m about as pro-market as one can get, but even I admit it is possible for government to improve the Internet. However, the question is whether or not government will improve the Internet, if given the chance.

Is p(government screwup) less than p(market screwup) in this case? As Thomas Hazlett points out, there are plenty of good reasons we don’t want the Internet to be completely neutral (BitTorrent comes to mind here). The government would have to distinguish between areas where we do want IP to be neutral, and areas where we don’t. Much of this is very esoteric to your average person, and thus is (IMO) likely to suffer from regulatory capture.

IMO, people arguing for NN not only have a solution in search of a problem, but are overestimating the risk of market failure when compared to political failure.

By the way, most of Mozilla’s profits come from Google paying them to be the default search engine on Firefox. Without that, we may not have a Firefox browser (and IE would most certainly be far worse as a result).

BoscoH
Nov 27 2008 at 3:20am

I agree with Professor Hazlett’s comment above that software engineers ignore economists at their own peril (and vice versa). One interesting point Hazlett brings up is the peering arrangements between large ISPs. Last month, I found myself unable to connect to my main website using my Sprint broadband card, but still able to connect via Cox cable. I read on The Register about Sprint’s disagreement with Cogent:

http://www.theregister.co.uk/2008/10/31/congent_sprint_spat/

Sprint disconnected Cogent because Sprint felt that Cogent violated its peering agreement, which allows the two ISPs to exchange roughly the same amount of traffic without having to pay each other.
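For the curious, here is a hypothetical sketch (my illustration; the 2:1 threshold is arbitrary) of the balance test such settlement-free peering agreements turn on:

    def peering_ok(bytes_in, bytes_out, max_ratio=2.0):
        # Peers exchange traffic for free only while the flow stays
        # roughly balanced; beyond the ratio, one side owes the other.
        lo, hi = sorted((bytes_in, bytes_out))
        return hi / max(1, lo) <= max_ratio

    print(peering_ok(10_000, 9_000))  # True: balanced, settlement-free
    print(peering_ok(10_000, 3_000))  # False: grounds for a dispute

Sprint evidently judged the Cogent flows to be on the wrong side of some such test.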

I called Sprint tech support and reported the problem. Sprint tech support claimed ignorance. I didn’t let on to what I knew about it. Sprint called me back two days later, after the connection had been restored, to make sure I knew they had fixed the problem and to make sure I was still on board as a customer. I imagine this interaction was repeated thousands of times that weekend with varying degrees of customer irritation.

Clearly what Sprint did would have been a violation of NN principles and, very likely, of future NN legislation. But despite their strong property right under the current environment of very light regulation, they caved pretty quickly. Had it gone on another couple of weekdays, I would simply have canceled my month-to-month contract, spent $100 on a Verizon USB or ExpressCard for my Mac, and jumped to their network. Other Sprint customers would have done the same.

Despite the inconvenience, I think it’s far better that companies be able to play these issues out in the marketplace than in front of regulators. Both companies had to pipe down significantly and quickly to avoid losing customers. In a highly regulated environment, they can battle and battle and let the regulators decide whose customers lose. Say the regulators decided that Cogent should pay Sprint for the peering, and Cogent doesn’t want to pay? Suddenly, that portion of the Internet that I need is permanently useless, and I have to switch wireless ISPs. Or say the regulators side with Cogent but Sprint is losing money on the deal. When Sprint refuses to renew the contract, I’m screwed again. NN advocates who can’t see the problem with that are worse than naive. Companies will just behave like the old RBOCs and please the regulator rather than pleasing the customer.

Zooko Wilcox-O'Hearn
Nov 28 2008 at 11:32am

Barry Kelly is right that Russ Roberts and Thomas Hazlett confirm one another’s biases too easily, and Barry Kelly is right that Hazlett’s version of history is flawed, both in facts and in interpretation. (E.g.: Mac OS X is not more closed in *any* meaningful sense than Windows XP or Vista, and the relative prevalence of malware on Windows is not due to any such openness. “OS 1” is not recognized by me — perhaps he meant OS/2?)

On the other hand, Hazlett is clearly very right (in the comments) to say that economists/policy wonks and engineers ignore each other’s viewpoints at their peril. Engineers do of course have their own biases, which their own interpretations of history tend to confirm, and indeed they often can’t see the forest for the trees. (But in this case Hazlett seems to be wrong about several of the trees.)

On the gripping hand, as Roberts and Taleb have reminded us, ex post facto story-telling is too easy, and its tendency to confirm our own biases and affirm our own values is too seductive. Like Barry Kelly, I came away from this episode of EconTalk worried that previous episodes from which I learned something new were actually teaching me falsehoods and biased interpretations. Fortunately, reviewing some of my favorites, it appears that most of them were not primarily ex post facto storytelling.

Ed
Nov 28 2008 at 5:54pm

I enjoyed the podcast. I’d like to see more discussed about network neutrality. I hear a lot from the pro-nn side. The fact that the major companies, the Democrats, and the media are pushing for this gives me pause.

I enjoyed the spectrum discussion. I found Hazlett’s argument for a property-rights approach to spectrum allocation interesting. I’ve heard about the white space that may be available, but this is the first time I’ve heard someone explain why it’s overrated as something other companies can use to challenge the telephone and cable companies. Very thought-provoking, I would say.

Also, on the malware issue, Apple’s OS probably has vulnerabilities but since it’s not widely used, it’s not widely targeted.

Eva
Nov 30 2008 at 3:58am

I’m surprised that NN is not supported by Hazlett or Roberts.

One of the major pushes against NN stems from the attempt to punish P2P filesharing. The ISPs are but a tool of government control (which in turn is but a tool of the evil media industry).

Personally, I find it worrying that it is considered the better option to leave big telco conglomerates in charge of the proverbial pipes.

How’s this for an analogy: We expect the media to provide information, and certainly we all agree that information should be reasonably unbiased. Why not have the same expectation for internet providers?

As a user, I want to be able to choose which content I suck down the pipes. Not supporting NN is, IMO, the same as supporting censorship.

David Robson
Dec 1 2008 at 4:23pm

I fear that Prof. Hazlett missed a very important point that Barry Kelly made: namely, that the non-neutral placement that Google pays for does not restrict choice, but only assigns default user preferences. Should I decide not to use Google as the default search, I can change that, or access the Yahoo site instead. It may be less convenient, but at least the options are there, available and not biased against, as Kelly puts it:

“Network bias (as opposed to neutrality) would be having random timeouts, or very slow loading, or plain inaccessibility for http://www.google.com but not e.g. http://www.myisp.com/search.”

Such blocking is much more of a concern to me, and I’d be interested how Prof. Hazlett might address that issue.

Van Theodorou
Dec 3 2008 at 9:18am

I just do not think government getting involved like this is a good idea. We always have to look at the unintended consequences of too much government intervention.

Chris
Dec 4 2008 at 5:23pm

By analogy, consider regulations on automobiles: you cannot drive a car on public roads unless it meets all sorts of federal guidelines. Without that rule, you would see all sorts of crazy things on the roads. Some of those things would be really cool, some would be bizarre, and others would be just dumb. But with the rules, only well-funded companies can pay the costs of the regulation and put cars on the roads.

The Internet is the same way: because each community only has 1 or 2 broadband providers, those providers have the ability to charge for the traffic as well. As a result, only well-funded companies can get their traffic onto the Internet.

Up until now, the Internet has been full of those crazy cars, some of which turn out to be really cool. The concern is that, without net neutrality, all that innovation will be lost.

(Despite there not being an official net neutrality law now, ISPs tread very lightly due to the FCC slapping down a few ISPs.)

Ak Mike
Dec 5 2008 at 10:01pm

I am fascinated by the inability of many of the commenters above to understand the underlying message of Prof. Hazlett’s presentation – which is that, although it seems counterintuitive to many, regulation hurts consumers by stifling competition, crushing innovation, and repelling investment.

Barry Kelly and a number of others focus on the insignificant distinction that the AOL example concerns a default preference rather than a net non-neutrality, while completely missing the important point that it benefits consumers if the ISP plays favorites by making some providers more convenient and some providers less convenient. (Which, of course, is the complaint about Net non-neutrality.)

There is something in the temper of the times that makes it impossible for a seemingly intelligent poster like John Rosendahl to understand that regulation is far more likely than leaving the pipes in the hands of corporations to result in the stifling of innovation. John – the only way the incumbents can restrict access to markets is by getting regulators to make it illegal to compete with them.

Even here in remote Alaska, we have three or four broadband Internet providers. If one of them is choking off a large part of the Web, I’ll switch and go with a different one. If three of the four do that, everybody will switch to the fourth. If all four should cut off the open Web, a new provider will get the investment capital to come in and make a killing providing it. Market forces will provide the consumers what they want. Isn’t that obvious?

But if the regulators get in, they will be captured by the existing big companies and make sure that no new providers come in, that the existing providers will not have to invest in innovative technology, and that the same old thing will go on and on.

Daniel Klein
Dec 6 2008 at 8:32pm

Great program.

I’d love analysis of why Google, Microsoft, etc. are pushing in the directions they are.

johng
Dec 8 2008 at 12:09am

Isn’t net neutrality the same thing as the idea behind existing “common carrier” laws? Trucking companies, airlines, railroads, and the post office cannot charge different rates just because they are envious of the profits of some of the entities using their services.

NormD
Dec 9 2008 at 3:10am

Mr Hazlett can think of no innovation in the unlicensed spectrum since 1997. How about:

802.11b 1999
802.11g 2003
802.11n 2008
UWB
802.15.x 2003, 2006, 2007
Wireless home security systems
RFIDs
RF remote controls
Wireless automobile entry systems
Bluetooth
Wireless USB

In Mr Hazlett’s perfect world where all spectrum is owned, we would have to pay providers monthly fees for WiFi access points, garage door openers, baby monitors and wireless remote controls!

As to the innovation in the owned spectrum, this is almost all in the user interface and applications, not on the network itself. The wireless carriers are trying hard to roll out 3G networks with bandwidth well below 1st-generation WiFi.

I am not opposed to some spectrum being owned, but I and many people are violently opposed to the characterization of unlicensed spectrum, where everyone is free to experiment, as “wasted”.

PT
Dec 23 2008 at 2:38pm

@Ak Mike: “There is something in the temper of the times that makes it impossible for a seemingly intelligent poster like John Rosendahl to understand that regulation is far more likely than leaving the pipes in the hands of corporations to result in the stifling of innovation.”

There is also something in the temper of the times that makes people so blinded by a single issue that they lose track of the greater perspective.

Have you considered that maybe some people are willing to sacrifice a little of the current pace of innovation, which, quite frankly, is very high right now, in exchange for greater guarantees of user-side basic freedoms? I, for one, am willing. I simply do not want AT&T or Comcast (the “duopoly” in my area) being the judge of what access I have.

So, what would you rather have: more innovation with growing corporate-sponsored censorship, or less innovation, but freedom of speech?

Innovation is not the “end all be all”. It’s not the pursuit of “Life, Innovation and Happiness”, is it?





AUDIO TRANSCRIPT

 

Podcast Episode Highlights
0:36 -- Intro. Telecommunications. What is net neutrality? Many flavors of proposed legislation. Argument is made that the Internet has been created according to particular blueprints of architectural design, basically separating the edge of the network, where applications are provided to end users, from the core of the network, an interconnected layer of networks. Argument is that this has worked well, allowing inventors to supply new applications without reinventing the wheel, supplying their own delivery network. Extended to say that this structure is endangered by the emergence of large network providers--e.g., AT&T, Comcast, Verizon--which have constructed considerable infrastructure to bring the net to small businesses and households. If left to their own devices they may close off access; to protect against that, regulation is needed. That's the argument in favor of the idea that certain providers should not be able to restrict certain kinds of access. Many complications.
4:40 -- Counterargument: Internet was not invented according to blueprints but evolved spontaneously. Networks build infrastructure because they own them and can charge market prices. Data: Those networks interconnect not because of any mandate but because a spontaneous system evolved. Marvel of spontaneous market structure. Large degree of specialization. Transport networks tend not to get heavily involved with content. Regulatory structure has been minimal, markets open in the sense of less regulated. Many "violations" of net neutrality--that is, network providers often do stick their fingers into the services their customers receive. Valuable services--malware and spam filtering. Economies of scale and efficiencies. Other examples: vertical integration of a company like Google, that maintains its own physical transport network--lightning fast. Globally constructed on dedicated bandwidth that is owned by Google, even though they are a content provider. Edge of the network. Own the innards of the transportation system over which their information travels, as well as multiple computing farms, proprietary. Important deals cut by Google with Internet service providers; according to the Vise biography of the company, in May 2002 Google paid cash and warrants to AOL, which at the time was one of the dominant service providers in the United States. Google Search Engine to pop up versus Yahoo. Expensive, risky, but they paid. Direct violation of classic net neutrality. Network could charge content provider to have preferential treatment on the startup date. Wireless ISP, Clearwire, attempting to bring wireless broadband to the public. Google gets preferential treatment on the Motorola handset used. Both transportation and content bundled, vertically integrated. At a high level Google argues for net neutrality--leading visible corporate champion.
13:09 -- Challenge is to think broadly about competition. Tendency to forget that competition is a rich term. Each iteration of success alarms people. When IBM was biggest competitive threat with dominant market share of computer business, it turned out that the mainframe business was overtaken by desktop computers, and then Microsoft. Now people worry about Google being the behemoth. Monopolists versus losing position to a competitor. Antitrust issue. Should people be worried? Calibrate the trade-offs. Generational monopoly fears. Major antitrust suit against IBM under Lyndon Johnson filed 1969, never reaches final opinion, and dropped 1982 under Reagan. Lawsuit goes 13 years in the computer business. Not a matter of fire-power but of the nature of the issues. When dropped, Microsoft had already laid the framework for what would become the new monopoly by getting into the operating system business, which nobody understood would be the new cash cow. Not even Microsoft understood that. Did get software platform ubiquity right, good enough. A decade ago Microsoft accused of monopoly power; seems long ago. Lost the case, which was over the browser, Netscape Navigator vs. Internet Explorer being used to protect its operating system. Java applications, virtual operating system that would make Microsoft's a commodity. Lots of competition today in browser market. Unforeseen competition. Microsoft lost out to Apple, which came back from the dead with the iPod and now the iPhone.
22:02 -- Ironic: Everybody said the lesson Apple never learned was open systems and then they come back with two totally closed devices that are successful. Openness is in the eye of the beholder. Apple's mistakes; wouldn't license their operating system; Gates urged them to do it. Now, same business model has proven very valuable. Apple machines, because they are not as open as the Microsoft machines, are much more resistant to malware and spam that users have to run interference on. Beautiful machines. Model may have been inefficient in 1985 and efficient in 2008. Antitrust element: 1995-96 Apple, having tough times, reached a merger agreement with IBM, OS1 supposed to be killer app for software; Apple left IBM hanging. Apple agreed to sell; insisted that there be no collaboration on Mac-OS till merger approved. Long lag by antitrust authorities; IBM said it was unable to move forward. Competition policy unproductive. Identify MS as now the anticompetitive force, to say that these two robust rivals can't get together and put together a product that can take market share from the dominant Windows operating system, that's what was barred by costs of antitrust.
28:24 -- Challenge of regulation: can't have it be flexible because then it becomes discretionary. But if you have arbitrary rules it leads to bizarre results occasionally. Might want to recalibrate the policy. Want to look at whole scope. Back to net neutrality: One of the things people are worried about is illustrated in cable TV markets--get a dominant provider who has a monopoly from the government, economies of scale; worried to be in hands of a small group of folks who cut us off from things we want to have. Not usually a way to make money, but the claim is that they can exploit consumers because there won't be ease of entry that we usually see in competitive markets. Market power can be exerted by the transport providers, access networks, thwart creativity. Cable TV example: the claim that cable providers have vertically integrated and exploit their market power to thwart consumer choice is a bad argument. Vertical integration has been highly efficient in cable and developed after being suppressed by regulators; cable has had some significant market power problems, but at the local competition level. Want multiple providers of multi-channel programming. People think consumer choice comes on the Internet where you shop for things one at a time; not the way cable works. May be highly efficient to give consumers choice of blocks of hundreds of channels. Consumers might prefer that because it's so cheap. Since 1994-96, two rivals--at least 3 operators and getting a fourth with phone companies. Traditional cable operator with initial monopoly franchise. Then DirecTV; then in 1996 EchoStar, the Dish Network. More recently AT&T and phone companies allowed to provide video services, or broadband service providers. Not trying to restrict output like a monopolist any more--trying to get sufficient scale to survive. Bundles of channels sold. Can think of other ways--video streaming--but costs and customer preferences seem to be toward bundles, as illustrated by the behavior of entrants. Premises equipment improvements.
37:33 -- Spectrum issues. For wireless services, radio broadcasting of the last 100 years to cellular telephone service to in-home cordless phones and wi-fi all use radio waves. Need to impose rules on the radio spectrum to avoid the tragedy of the commons. Not to eliminate interference, but as Coase suggested to figure out what interference is worth it. Want rules that allow people to make the right allocations. Input into wireless services--can think of bandwidth like a lot of other inputs--more bandwidth results in more services, more calls, more speeds, but at a cost. Need for coordinating that usage. Traditionally it has been one of administrative allocation, regulators try to figure it out. Micromanagement. Coase, 1959, thought you should distribute ownership rights so people could negotiate the optimal interference rules. You could own a frequency in some megahertz range; neighbor wants to use it or has technology to use it, you could trade. A year later, the Coase theorem as a spillover. To date, government has decided who gets what. Could distribute rights, or auction them off. Have to define property rights in a way that allows rights over spectrum. Have to then decide how to assign those rights. Congress likes the discretion and having more rents determined by regulatory action. In 1993, Congress and Presidency in same party's hands, new technology ready to be issued for mobile phone service; Congress voted to authorize auctions, raised $50 billion. Used to be a big number. Actually a small number related to the markets rather than the budget. Wireless phone market generates $150 billion a year in consumer surplus, consumer-side value. More spectrum, more competition, more capacity.
44:20 -- Are there parts of the spectrum that are being withheld and not available to consumers? Regulators say no, no more spectrum. License given in 1990 for making phone calls in airplanes, called the last spectrum available to be given out. Since then, hundreds of bandwidths go out. Spectrum is way underutilized. Television band allocated late 1930s-late 1950s. February 2009 still has 49 TV channels allocated for off-air TV. Fewer than 1500 TV stations in America, about 8 per market. Who has the right to use them? The regulators, who withhold the rights. In 2002, devices that don't interfere allowed to use these white spaces in the TV band. FCC gets to approve them on an unlicensed basis, government designing the rules for using the spectrum. Should be able to migrate to another technology, like over-the-air satellite, but need a new agent that owns the spectrum to efficiently reallocate it. Tragedy of the anti-commons--no residual claimant except the regulators who have mismanaged it. That spectrum is worth a lot for, say, wireless broadband or wireless voice--mobile telephony, but currently no means to allocate it. Political football. Social engineering without opportunity to ask the right question. They are asking: What devices can use the TV white space without interfering with the over-the-air digital TV broadcasts? Right question is: How can we get the over-the-air TV broadcasts to be more efficient in their use of the spectrum? Regulators are protecting over-the-air broadcast television. Extremely low power limits. Central planning problem. Champions of unlicensed whitespace, like Google and Microsoft, are pushing central planning approach that would be dominated if we had a property rights approach. Could put all the broadcasting on satellite and get everybody a satellite dish for $3 billion.
54:01 -- Irony: TV and radio are slowly being pushed out by alternative technologies. Internet is a provider, iPod or computer or radio. Real issue is innovation that doesn't take place elsewhere. If you allowed Google itself to own some slice of the TV band it could make deals; they would unleash fabulous innovations. Frustrating to see that that path is foreclosed by unimaginative regulators. Well over 35 million have wireless broadband. Perhaps drinking different flavor of kool-aid. Politics versus economics.
57:02 -- Hazlett was chief economist at the FCC. What was the most interesting or surprising thing you saw? All kinds of margins you haven't thought about. Learn how constrained agencies are--look at good ideas. Equilibria determine policies, people pushing on all sides of the issue and it comes out a certain way. Also all kinds of idiosyncratic parts of the process. Who the committee chairman is and elections, lots of institutional road bumps. Reliable way of figuring out the economics of different programs: who is proposing what set of rules and why are they doing it? 1991-1992 at the agency working on PCS--personal communication services--cellular duopoly, in the 1980s, two per market. Now paying 6 cents per minute. Assumed operationally that AT&T would be an obvious licensee, national player, in PCS. In fact, AT&T was outspoken, initially petitioning that, as a potential entrant, it needed a national license rather than the many that were distributed to the communities. Ultimately withdrew and switched around and became an incumbent. Shows you where the efficiencies are.