Bruce Schneier on Power, the Internet, and Security
Jun 10 2013

Bruce Schneier, author and security guru, talks with EconTalk host Russ Roberts about power and the internet. Schneier argues that the internet enhances the power of the powerless, but it also enhances the power of the powerful. We should be worried, he argues, about both corporate and government uses of the internet to enhance their power. In this conversation, recorded before news broke of the PRISM system and the NSA's (National Security Agency) use of Verizon's customer information, Schneier presciently worries about government surveillance that we are not aware of and explains how governments--democratic and totalitarian--can use the internet to oppress their citizens. The conversation closes with a discussion of terrorism and the costs of the current system for reducing the probability of a terrorist attack.


READER COMMENTS

rhhardin
Jun 10 2013 at 10:41am

I think power isn’t a thing but a confusion of several things. In English, it serves as a marker in an account rather than a reference to something.

It has a place in clichés, in other words.

But it’s a reification error, like phlogiston, “what causes fire”: as if power then exists as a thing that can be sought, acquired, lost, and traded.

Some Critical Inquiry editor long ago, pointing this out in an issue preface, suggested distinguishing the Latin auctoritas, potestas, officium, and imperium if you want to refer to something beyond a cliché.

Wolf
Jun 10 2013 at 11:18am

This was frustrating to listen to. Bruce’s criterion for something being bad and in need of intervention is that he doesn’t like it. He complains about iPhones and Microsoft phones as somehow bad for society because, surprise, the company tries to get customers to use its product exclusively.

Russ needs to invoke Public Choice more when guests conflate corporatism with free markets.

Casey
Jun 10 2013 at 11:28am

So terrorism being overrated as a threat relative to disease, car crashes, etc., translates nicely to numbers (see the sketch below).

The notion that surveillance and censorship are underrated as threats also seems convincing.

And I happen to agree with both.
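
For concreteness, a back-of-the-envelope sketch of that first claim (the figures are ballpark circa-2013 estimates, assumptions for illustration rather than authoritative statistics):

```python
# Rough annual odds of death by cause, using ballpark US figures (assumed):
US_POPULATION = 315e6

annual_deaths = {
    "heart disease": 600_000,                       # CDC ballpark
    "car crashes": 33_000,                          # NHTSA ballpark
    "terrorism, 9/11 amortized over 2001-2012": 3_000 / 12,
}

for cause, deaths in annual_deaths.items():
    print(f"{cause}: roughly 1 in {US_POPULATION / deaths:,.0f} per year")
```

Even with 9/11 spread over a decade, the annual risk from terrorism comes out two orders of magnitude below car crashes.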

It still always feels a little presumptuous to dismiss the crowd’s wisdom as misguided because “humans are not adapted to a modern world”, “most people don’t think about these issues in a meaningful way like I do”, or something similar. “I am just more enlightened” is a seductive paradigm.

It sort of starts feeling like efficient markets vs investment brokers.

Perhaps a) we don’t have an accurate view of what ‘everyone’ thinks. Perhaps b) there are reasons these threats are not overrated that aren’t being considered. Is it possible that reacting to the last threat quells the entrepreneurial tendencies of terrorism? Meaning, the reaction is a response to where the threat could grow in the future more than to its current probabilities?

Maybe car crashes, disease, climate change, and privacy are larger concerns, but the solutions are more complex – and perhaps people simply have more visceral reactions to terrorism because they intuitively recognize that it is a problem that can more readily be addressed?

I don’t know.

John Berg
Jun 10 2013 at 11:50am

Adding to the first two comments: please define precisely how you both are using the phrase “metadata.” Separately if you two have different meanings.

John Berg

Lauren [Econlib Editor]
Jun 10 2013 at 12:02pm

Hi, John Berg.

The term “metadata” is not used in this podcast. Neither is it used by either of the first two or any other commenters. Did you mean something else? Did you intend your comment for a different thread?

Kestrel
Jun 10 2013 at 12:55pm

Perfect timing on this podcast, Russ. Accidental, but perfect.

Woodah
Jun 10 2013 at 2:02pm

Quibble: Magna Carta did NOT involve serfs! It was lords versus the monarchy. A power-sharing squabble within the 1% of its day.

SaveyourSelf
Jun 10 2013 at 2:46pm

Great Podcast!

Schneier is well-read and argues his positions well. Russ was on his game also. One of his finest performances in my opinion.

Schneier’s revelations about freedom, reward, and risk are exceptional: namely, that they all move together. In his words, “The proper response is to accept that life entails risk…That’s not a failing.” Alan Greenspan made a related comment in “Age of Turbulence” when he wrote, “Rational risk taking is indispensable to material progress” (503).

The problem with including “risk” as an element required to make rational decisions is that humans are notoriously poor at calculating it. We methodically overweight uncommon events and underweight common ones (Daniel Kahneman, “Thinking, Fast and Slow”). This distortion must have given humanity some sort of survival advantage in the past, but it is a plague upon our current standard of living. It leads to bubbles, bureaucracies, and bad decisions, to name but a few. Add to that the fact that quantifying future risk is difficult and mathematical—where most humans aren’t inclined to even do simple math—and you are left with a recipe for misfortune repeated over and over through human history.
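
Kahneman and Tversky even fit a functional form to this distortion. A minimal sketch (the formula is the probability-weighting curve from their 1992 cumulative prospect theory paper; gamma = 0.61, their estimate for gains, is taken as an assumption here):

```python
# Prospect-theory probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g).
# Small probabilities get overweighted, moderate-to-large ones underweighted.
def weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.001, 0.01, 0.1, 0.5, 0.9):
    print(f"true probability {p:<6} -> felt weight {weight(p):.3f}")
```

A 1-in-1,000 risk “feels” more than ten times larger than it is, which is exactly the overweighting of rare, vivid events described above.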

What I would have liked to hear from Schneier and Roberts [this is not a criticism] was their thoughts on what could be done to improve our dismal track record of managing risk. Schneier hinted that the media’s overplaying of uncommon events may be part of the problem. Would more balanced media presentations aid our decision making? Some of the better NPR reporters approach questions from more than one perspective; I’m not sure how well 24-hour media does in this regard. And what about education? I know accurate risk assessment can translate into a significantly higher standard of living. Warren Buffett comments all the time on how important evaluating risk was in making his career. Is it possible to teach mathematical, statistical risk assessment techniques to kids in ascending fashion from elementary school up through high school? Would that teaching translate into better personal—and therefore societal—outcomes? I have young kids and I honestly don’t expect them to learn these things in public school, but what if they could…

drobviousso
Jun 10 2013 at 3:30pm

Great listen, but I wish that you both could have stipulated that ‘big data’ only has a few players that are each big enough to directly affect the market. Thus, Russ would not need to defend market principles that don’t actually apply in this case.

Bruce is right that you can’t “exit” big data right now, just like you (apparently) can’t vote in a president that would, you know, curb government surveillance and extra-judicial killings and what not.

The observation that private actors use the government is old hat here, but the reminder that government uses the private sector to do what it can’t do was very well put. And quite prescient, given the news cycle of the last week or so.

Greg G
Jun 10 2013 at 8:55pm

I thought this was one of the very best EconTalk episodes ever and I have listened to most of them. Guest and host both did a great job.

Especially interesting was the difference of opinion about whether we should be more worried about government power or private corporate power and the ways that both government and corporations are taking advantage of laws intended for the other to increase their own power.

In a podcast advertised as being about the internet and security it would indeed have been “another topic” to go further down that road today.

Russ, I hope you will have Bruce Schneier back again as soon as possible in another podcast that does go there and explores these differences further.

James
Jun 10 2013 at 9:26pm

Good episode. However, most of what Schneier said comes across as paranoid speculation. While his concerns may seem prescient to some, I don’t think speculation about Google swinging an election or the NSA abusing its surveillance powers is going to worry all that many people. Personalized search results are not “censorship”. To my mind, the NSA gathering Facebook and Verizon data does not constitute “surveillance”. Phone metadata and Facebook posts seem like fairly public information to me. You don’t put anything truly private on Facebook; you post something because you want to SHARE it. Surveillance is a hidden camera in your bedroom, a pair of agents shadowing you in a tinted car. Besides, most of us simply don’t have any secrets that are meaningful or valuable to anyone outside of our circle of friends and coworkers.

I’m not scared of any of this electronic “power”, and I don’t buy the argument that it is comparable to real-world counterparts. If the government is watching me, then why is it so easy to get away with illegal activities?

Matt
Jun 11 2013 at 3:27am

Great blog and podcast.

To me, the most interesting topic in this episode was the comparison of government power and corporate power. They are not even close: governments, in their domain / geography, are more powerful than any corporation.

The example provided by Schneier (Bolivia vs. Exxon) is off the mark, as shown by Bolivia’s theft from Exxon. In their domain, countries are unbeatable. Banks (or, he would say, people in banks) are corrupt? Maybe, but shareholders can sort it out, and there are laws, like those against insider trading, that they have to follow. The government (or the people inside the government) exempts itself from insider trading laws.

We get to vote for government representatives on a very infrequent basis. We get to vote for or against companies through purchases and investments every day.

Justin P
Jun 11 2013 at 11:04am

It is, I think, just as hard to leave Facebook as it is to leave the city you live in because you don’t like the government.

He really lost me there. I wonder if he would have made the same case for Friendster or Myspace a few years earlier. I think he is substituting the subjective for the objective: it is objectively hard to leave a city and not objectively hard to leave Facebook. One requires securing new work, housing, moving trucks, etc. The other requires…three clicks.
Maybe subjectively both are hard for certain people. It would not be hard for me to leave my current town, given a new job prospect. And it would not be hard for me to leave Facespace, even though that’s where the best anti-GMO people are and I do get great joy arguing with them. But I can see how it could be hard for some people to leave for personal (i.e., subjective) reasons.

As for his assertion that Exxon has more power than a South American government…I wonder if he has ever heard of Rafael Caldera or Hugo Chávez? It seems to me like he is cherry-picking to support his obvious bias. A nice example of Kling’s oppressor/oppressed axis from last week’s podcast.

Jonathan Vaage
Jun 11 2013 at 11:55am

What Schneier doesn’t consider is that the reason large tech companies such as Google, Microsoft and Apple pose the threat he describes is the market-distorting enforcement of alleged “intellectual property rights” via the coercion of the state, which maintains these corporations’ oligopolistic market positions.

There are some very persuasive arguments against the legitimacy of intellectual property beginning to emerge among libertarians, and Stephan Kinsella does as good a job of presenting them as I’ve seen. See the link to his presentation:

I would love it if Russ invited Stephan onto the show. As a libertarian legal theorist and expert on IP and international law, I think it would make for an extremely stimulating episode.

http://www.stephankinsella.com/paf-podcast/kol062-intellectual-freedom-and-learning-versus-patent-and-copyright-2010/

[broken link code fixed–Econlib Ed.]

Mads Llindstrøm
Jun 11 2013 at 3:27pm

Does the Internet level the playing field?

Maybe it seems different in America, but let me tell it from a Danish perspective. A perspective that will more or less apply to other European countries.

In early-eighties Denmark, we had one state-run television station and several state-run radio stations. We also had private radio and television stations, but they were (by law) so small that they simply could not afford serious journalism. Typically, the private radio and television stations would play music, do quiz shows, soaps, porn, and second-rate movies.

If you lived close to Sweden, you could see two state-run Swedish television channels. And other people could see German television (I don’t know if they were also state-run).

The Danish state-run television station (now stations) was not (and is not) micro-managed by the sitting government. On the contrary, they have a good deal of self-direction, but the state-run electronic media did and do have a thick bias.

Luckily, Denmark had and has privately owned newspapers. This of course provided a good deal of balance. But as most people get their news from electronic media, the state-run institutions are extremely powerful.

It did improve in the eighties. We got another state-run broadcasting company, which did dilute the power of the first state-run broadcasting company. It became legal to own a satellite dish – yes, I kid you not, there was a period when you could not legally own a satellite dish in Denmark. And we got cable TV.

Now fast-forward to the present day. There is still no privately owned Danish television station that has any serious journalism, as the market is saturated by state-run news television. We can get CNN and so on, but naturally they do not cover Danish politics much. And we have a ton of other television channels.

And we got the Internet. The Internet, where I can read all Danish newspapers more cheaply and conveniently than in dead-tree-only days. Plus there are many other Danish Internet news sources. I can listen to Peter Robinson’s excellent Uncommon Knowledge and to the equally excellent EconTalk – neither of which would ever be aired on state-run Danish electronic media. Plus, of course, a ton of other Internet media. I can quickly check the accuracy of the, in my view, garbage the local papers will often write about America.

So yeah, the Internet did level the playing field and made the world a much better place. At least from my perspective.

Christian
Jun 11 2013 at 5:46pm

In reply to Justin P:

Your comment got me thinking. Leaving Facebook is technologically quite easy, I agree. However, once you’ve made those three clicks, *remaining* off the equilibrium path, so to speak, is a bit more difficult. It’s sort of like not having a car in a suburb designed around automotive travel, or not having a mobile phone these days, or not having a TV 15 years ago, or not having a radio or a newspaper 50 years ago. Everyone else is using a particular tool to the point where not having the tool puts one out of step with one’s surroundings.

I’ve been off of Facebook for months and I am very much out of touch with my social circle. My wife keeps me updated on what I need to know, but if she were off of it as well, we’d be almost ostracizing ourselves because so many of our friends and family use it as a communication tool.

So, in that way, tools like Facebook are actually quite powerful in their own way. They create a compelling equilibrium and one bears costs (and, to be sure, reaps benefits, like saving a dozen hours a day from being wasted…) by remaining outside the equilibrium.

Facebook is not as powerful as government or even something like Google, of course. Google’s power is something I’m a little more concerned about. Its potential for biasing what people know and don’t know, what questions they ask and don’t ask, is more subtle and pernicious. And it’s often the entryway to the internet for people looking for information on a topic. In that sense, it has as much potential power as the entire internet, because it is the gatekeeper of what portion of the internet most people actually get to.

Emerich
Jun 12 2013 at 8:16am

Matt and Justin P make good points in the comments above. Schneier is a good example of someone on Kling’s “oppressor vs. oppressed” axis of world-views. Google is a bit over a decade old, Facebook less than a decade, yet these two companies are avatars of entrenched corporate power and oppression? More powerful than the U.S. government? Like IBM 30 years ago and Microsoft 15 years ago, perhaps? (i.e., not so invulnerable.) I share Schneier’s and Russ’s fears, up to a point, but Schneier’s perspective is simplistic, cock-sure as he is. Islamic extremists really do want to kill Americans and others (including other Muslims), lots of them if they can. Get too relaxed about the threat and we could get another 9/11 or much worse. And by the way, one 9/11 per month would be much worse than annual auto fatalities: the latter don’t generate such huge economic costs as 9/11—as high as $2 trillion according to this source: http://www.iags.org/costof911.html. Anyway, most of us get more upset about spectacular threats like plane crashes and terrorist attacks, as Schneier concedes. Does he have a plan to fix that problematic quirk in human nature?

Ali McMillan
Jun 12 2013 at 9:56am

Great podcast. I liked where both Bruce and Russ agreed that the biggest worry is when excess power from government and business come together. I think Bruce put it really well, and both liberals and libertarians should really keep this in mind: neither power is inherently good or bad, provided it’s limited in scope (and of course the proper scope of each is debatable), but the absolute worst thing about the current system is that each uses the other to bypass its limitations.

Governments can coercively obtain any of the mass quantities of information collected through private surveillance, while corporations can exploit government’s coercive power for rent-seeking purposes.

Libertarians understand this well… so why, then, is Russ less scared of private surveillance than of government surveillance? Even if in principle he thinks the former is OK, he’s got to recognize that, practically speaking, it is also just government surveillance, because the government can compel the owners to give up the tapes at any time. That seems like more a matter of perceptual bias than any real evaluation – and this example is much more salient than the specific Facebook one. Whatever the challenges are, you can quit Facebook, but you truly have to become a deranged hermit if you want to avoid all private surveillance.

And the only way to truly uncouple this from government surveillance would be a radical change in law making it purely voluntary for those tapes to be provided to the government – so Joe who runs the convenience store can say, ‘why do you want that tape?’, and hand it over if it’s to catch a bomber, but not, say, a drug dealer. A great idea, I think, but it would seem like an anarchist fantasy to the majority of Americans.

Also, in response to many of the previous comments, I think that a sophisticated understanding of statistical reasoning is unnecessary if we want to evaluate terror threats in a sober, rational way. Reading Kahneman and behavioural economics generally has me seeing the world much differently. And add on the fact that fear-based political and military responses to terrorism simply exacerbate it? I literally could not care less about terrorism. I worry more about the nutritional content of what I eat for lunch every day than I have ever worried about a terror attack. And rationally so. My lunch is orders of magnitude more likely to kill me.

To some extent, biased risk assessment is part of human nature. But that’s easily tempered with a bit of simple behavioral economics, and perhaps more importantly, cultivating a healthy stoicism.

And finally, this Onion article reminded me of Russ’ position in this podcast. Heh.

Just teasing! Keep up the great work.

James
Jun 12 2013 at 11:38am

The NSA isn’t just trying to prevent another 9/11 or Boston Marathon bombing; they are trying to prevent someone from setting off a nuclear bomb in New York or DC. I live in Iowa, so no, I’m not afraid of terrorism, but some people are living in a big juicy target.

Ed Dentzel
Jun 12 2013 at 1:04pm

It must be an upside-down world Bruce Schneier lives in, where corporations like Google and Facebook, which haven’t killed anyone, are feared, and terrorists who’ve killed thousands roam free unless caught by un-intrusive methods. In his world, a corporation with a customer’s address would be brought up on charges while terrorists continue to fly planes into buildings, because trying to stop their #1 method would be considered “mission oriented.” Has Mr. Schneier ever heard of the principle, “People will keep doing the same thing as long as it works”?

Mr. Schneier made fun of John Ashcroft, and Russ went along with it. Have both of them forgotten the first strike on the WTC in 1993? Granted, it lies outside the timeframe Ashcroft cited but I’m not sure that changes anything. To hear Schneier, he’d have you believe terrorism never occurred in the USA before Sept. 11. The timespan between the two strikes was 8 years. So, what if in 2009, Eric Holder would’ve said, “Our tactics are working because there hasn’t been a major terrorist attack in the USA in the last eight years.”? How would that’ve struck Mr. Schneier?

Mr. Schneier is trying to convince us that suffering from terrorism is a fact of life because it’s “rare.” We have to live with it. But never does he say where the “get majorly ticked off” point is. Is it 1000 citizen deaths a year? 2000? 3000? I wish he would say. I guess Winston Churchill should’ve said, “Those V-1’s, what’s the big deal? It’s not like they happen every day.”

Whereas, I would say a country that can’t keep enemies from coming in at will and killing innocent civilians isn’t a country at all–it’s just a bunch of people living close to each other. This goes for embassies as well. Schneier’s world: Benghazi–can’t learn anything from that. Just bad luck. Happenstance. A coincidence. Wrong place at the wrong time. A statistical anomaly.

The funny thing is this. Wherever Mr. Schneier lives I am sure it is very nice. Great neighbors. Clean streets. And the data points show the odds of his house being broken into are very low.

Yet, he still locks his doors. Go figure . . .

Julien Couvreur
Jun 12 2013 at 1:31pm

Thanks for the good interview. I really like Schneier, as his writing on security and software got me interested again in economic thinking (i.e., trade-offs).

I liked that you pushed him a little on corporate vs government power. It was good to hear his arguments.
That said, I still don’t understand how he thinks that government regulation could somehow improve on these problems.
All the problems that he identifies (lack of consumer interest in small stuff, lack of consumer leverage, potentially misaligned incentives for employees making decisions, etc.) apply to government too, except that government has a unique monopoly on taxation and a unique externality (its decisions apply to all citizens).
He thinks competitive forces are not perfect or strong enough, which is a fair point. But the dynamics of politics are only worse, unless one of Milton Friedman’s “angels” gets in power and regulates us (or maybe Bruce as benevolent technocrat ;-).

A few scattered thoughts:
1) If a corporate Magna Carta would be very valuable, then some companies could already commit to such declarations, and doing so should be a valuable competitive strategy.
2) It is ironic to have Bruce attempt to lecture Russ on methodological individualism (corporations are not actors, only individuals are actors), since that is something Russ is very keen on already.
3) There was a little miscommunication on the problem of corporate power and privacy. Schneier was thinking of Google recording people in the street (un-owned space), while Russ was thinking of a store or mall recording its customers on premises (store property).

For next time, I would be curious to hear more from him on security and the internet.

Ali McMillan
Jun 12 2013 at 1:54pm

Wow, there are some funny responses to this podcast. These debates bring out the full spectrum of silly political positions one might adopt.

We’ve got naive progressive liberalism: corporate power is uniformly bad and scary, government power is uniformly good EXCEPT when it comes to the military and national security. That’s what some (wrong) people appear to have heard Bruce saying, and that’s the tacit worldview of some (dumb) opinion columnists discussing this NSA story.

Hence they respond, at best, with naive libertarianism – corporate power is fine, government power uniformly bad, and where corporate power looks problematic it’s probably the government’s fault.

That argument I don’t wholly buy, but it’s more persuasive than naive liberalism and WAY more persuasive than naive conservatism, which inverts the premises: corporate power is uniformly good, government power uniformly bad, EXCEPT when it comes to the military and national security. The sharp end of government’s coercive power is given an absurd free pass by the same folks who consider themselves devotees of liberty.

Both Bruce and Russ avoid each of these distinctively misguided ways of thinking, agreeing that the really monstrous stuff happens at the intersection of government and corporate power.

But I suppose people just hear what they want/expect to hear… as any cursory look at Schneier’s writings will show, he’s not some anti-surveillance ideologue who thinks we should ignore terrorism and accept it as a fact of life. He thinks we should look rationally at the risk it poses in comparison with other risks that we fear less viscerally; that we shouldn’t regard this threat as different, special, and thus meriting new expansions of government power; and we should focus on evidence-based techniques that will actually prevent attacks better than the existing TSA-style ‘security theater.’

Indubitably, just because I’m defending Bruce, someone will interpret this comment through the framing of the ‘naive liberal’ ideology I outlined above. I could argue how wrong that is, but I doubt they’ll read to the end in any case.

Geoff. Nicoletti
Jun 12 2013 at 2:05pm

[Comment removed for supplying false email address. Email the webmaster@econlib.org to request restoring your comment. A valid email address is required to post comments on EconLog and EconTalk.–Econlib Ed.]

Andre Pereira
Jun 12 2013 at 6:37pm

@ Justin P.

I think he is substituting the subjective for the objective: it is objectively hard to leave a city and not objectively hard to leave Facebook. One requires securing new work, housing, moving trucks, etc. The other requires…three clicks.

Are you familiar with the inner workings of the web? I’m not trying to sound presumptuous; it’s perfectly normal not to be, and in fact it’s because most people aren’t that I reply.

This very page where you’re posting this comment, here on Econtalk.org, is loading code into your browser from Facebook, Google+, and Google Analytics. Using that code, Facebook and Google can know and record that your browser visited this page. Using various techniques (cookies are the most common), they can link all those pages to a single profile, tracking you across the web.
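
A minimal sketch of that mechanism (not Facebook’s or Google’s actual code; the cookie value and URLs are invented for illustration): every page embedding a third-party widget triggers a request to the widget’s server, and that request carries both the visitor’s tracker cookie and the address of the embedding page, so one server can stitch visits to unrelated sites into a single profile.

```python
# Hypothetical third-party tracker, server side. Each widget load sends the
# browser's tracker cookie plus the embedding page's URL (the Referer header).
from collections import defaultdict

profiles: dict[str, list[str]] = defaultdict(list)  # cookie id -> pages seen

def handle_widget_request(cookie_id: str, referer: str) -> None:
    # cookie_id was set the first time this browser loaded any page carrying
    # the widget; every later load links back to the same profile.
    profiles[cookie_id].append(referer)

# One browser, one cookie, three unrelated sites (URLs are illustrative):
handle_widget_request("abc123", "https://www.econtalk.org/schneier-episode")
handle_widget_request("abc123", "https://news.example.com/article")
handle_widget_request("abc123", "https://shop.example.com/cart")
print(profiles["abc123"])  # a cross-site browsing history, assembled in one place
```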

Now one could claim that while they can do it technically, it’s paranoia to think they’d actually track people this way.
Well, here’s what they said to USA Today:

The company compiles tracking data in different ways for members who have signed in and are using their accounts, for members who are logged-off and for non-members.

Is it really that easy to leave Facebook? I think it’s hard not to get sucked in!

MaxH
Jun 12 2013 at 8:00pm

Very enjoyable podcast, although I think that there may be more than a little unacknowledged tension inherent in the tendency to scoff at “hindsight bias” while simultaneously holding to an unshakeable faith that history has proven (x) to be hopelessly corrupt, untrustworthy actors, where (x = states, corporations, Yakistonians, Pastafarians, insert-your-preferred-boogeyman-here). Also, if the problem is understanding “what should be done,” and the idea of “hindsight bias” has any validity at all, then it’s probably just one aspect of a more general, and more insidious, tendency to self-delusion, and is likely accompanied/reinforced by a matching tendency toward “foresight aversion,” i.e., the unwillingness to acknowledge that someone else was (or is) likely right about the future, when I was not (or when that perception was/is contrary to my own past/present interests). It’s easy to dismiss dumb hindsight bias of the “magical thinking” variety, as Bruce put it. But it is better to simply dismiss it for its dumbness rather than its retro-speculative roots; otherwise one risks falling into the Cartesian/Hayekian rathole of absolute skepticism.

Kount von Numbacrunch
Jun 12 2013 at 8:03pm

This topical podcast was very interesting. Here are my early thoughts.
1) It’s good to hear a grown-up discussion that doesn’t ignore the internet’s usefulness for spying. It allows people far apart to find each other and connect…under potential surveillance. But at least it helps us find alternative sources of information and entertainment. People should think more about how digital communication and location-data gathering facilitate blackmail. This could revolutionize “opposition research”. Never mind “Game of Thrones”, watch “House of Cards”.

2) It has been widely reported that Google billionaire Eric Schmidt started an organization to use info tech to help Democratic Party electoral campaigns, etc., by shaping opinion. You might not think “big data” can tip elections, but it seems that those who know most about IT and about elections think so. http://www.businessweek.com/articles/2013-05-30/googles-eric-schmidt-invests-in-obamas-big-data-brains

3) As big corporations and big governments get cozier and cozier, I’m not sure an argument about which is more powerful has much significance.

emerich
Jun 13 2013 at 10:10am

Maybe he’s right but I still have to laugh: Google, Microsoft, Apple, and anything corporate are more threatening than our irrational human natures allow us to appreciate. Terrorists, even if they inflict the equivalent of one 9/11 per month, are no more dangerous than getting into our cars for a drive.

Ben
Jun 13 2013 at 11:02am

I do not understand Bruce Schneier’s assertion that corporations are just as powerful as governments.

All of a corporation’s power is given by the government through regulation, which requires armed individuals to enforce.

If the government weren’t able to regulate the market, corporations would have no reason to influence politicians.

Stephen
Jun 13 2013 at 9:33pm

He says governments and companies are equal threats, but I think he forgets how fragile a company is, that companies–even big companies–collapse all the time. They REALLY DO need to respond to their users’ wishes, or another service will. In 5 years, will Facebook still be relevant, or will some other service figure out a way to lure users over with guarantees of better privacy and content management? 15 years ago, he would’ve had us fear the AOL juggernaut.

Nope, I feel way more threatened by a government that can use the threat of force to squeeze my dispersed private information out of various companies and compile it into a single PRISM in Utah, AND that can do it without any intention of telling me that it’s doing it.

Great conversation!

[“He” refers to whom? Sorry to ask. I think it helps to refer to people by name. If there are other references, names could possibly help.–Econlib Ed.]

Robert
Jun 14 2013 at 9:01am

I have to agree with Ben. Corporations are only powerful because they are able to buy that power from the State. The source of governmental power originates with the consent of the governed. It is the people who have the actual power. A portion of that power is surrendered to government through the act of elections, hence “the consent of the governed.” Bruce Schneier forgets that corporations are only powerful because the State has put that power up for sale. As H.L. Mencken once said, “An election is nothing more than an advance auction on stolen goods.”

Chris Carlin
Jun 14 2013 at 11:42pm

Firstly, a nod to Wolf’s comment, above.

This episode was really hard to listen to. Schneier is one of those folks who never seem to have anything insightful or well-founded to say, only things that his audience accepts through confirmation bias.

I disagreed with statement after statement that he presented as unquestionable fact (for example, one can’t live without Facebook; one can’t live without a credit card? Of course one can; both are done plenty!). Sometimes he even proposed a conditional claim before using it as a hard claim in his subsequent conclusions.

But most fundamentally, he seemed to lack the self-awareness that what he was demanding was power for himself! He talked so much about putting power back in the hands of the people, but it was clear he meant it only so long as they did what he wanted them to.

Russ did offer a slight challenge on that point… only just.

His interview, like all of his stuff I’ve ever read, came across as rantings based in delusion and paranoia brought about through life in an echo chamber.

Rich
Jun 15 2013 at 1:27am

Russ,

Your remark about people referring to gov’t rather than the individuals that compose it momentarily caused Schneier to break stride. Do you think he realized his point about replacing incentive problems in corporations with incentive problems in gov’t made little sense?

I doubt it. His superior tone and the way he threw around “power” indicated he had little self-reflection.

I shudder at the thought of people like him looking after my best interests.

Harry
Jun 15 2013 at 7:52pm

Aside from agreeing very strongly with everything Ed Dentzel says above, I have a few observations:

1. Coercion – “Coercion is the practice of forcing another party to act in an involuntary manner (whether through action or inaction) by use of threats or intimidation or some other form of pressure or force” – Wikipedia. Game over. Using the term “coercion” to describe what Google or Facebook do is simply incorrect. Period.

2. “People never read the license agreements” – Some people sign leases, mortgages, car loans etc without reading them. People buy products and don’t check the ingredients or read the directions. This is nothing unique. To assert so is simply incorrect.

3. ‘You can’t exit Facebook’ – People do it all the time. Also, people attenuate their usage of it, and in fact, among the adult population, adoption of FB as a main communication mechanism isn’t at all universal. And among teens, many are going elsewhere due to the presence of too many adults. Of course, I make these claims based on data: http://gawker.com/teens-flee-facebook-citing-too-much-drama-mostly-cau-509159535 . Schneier’s source for his musing goes unmentioned.

4. Demonizing Corporations and Power – He’s echoing standard Crit Theory kind of stuff here, nothing new. And no matter what he says, according to the definition of the word “coercion” above, if corporations or their agents use or threaten force against anyone, legal recourse exists, both criminal and civil. And nothing he said changed the plain fact that commercial relationships are voluntary – not always easy to exit, but voluntary all the same.

5. Asymmetric payoffs – The easiest example is seatbelts. People wear them all the time but maybe need them 2-3 times in a lifetime? But the “payoff” is so horrible that we guard ourselves from the risk all the time. Same thing with terrorism. He also neglects to account for attacks that were disrupted just since 9/11 – in the U.S., about 86.
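
The seatbelt point is just an expected-value calculation with an asymmetric payoff. A toy sketch (every number here is an invented illustration, not a statistic):

```python
# Guarding against a rare event is rational when the potential loss is
# catastrophic relative to the cost of the guard.
p_belt_matters_per_trip = 1e-6   # assumed chance the belt saves you this trip
loss_if_unbelted = 5_000_000     # assumed cost of the catastrophic outcome
cost_of_buckling = 0.01          # a few seconds of effort, in the same units

expected_benefit = p_belt_matters_per_trip * loss_if_unbelted  # = 5.0
print(expected_benefit > cost_of_buckling)  # True: buckle on every trip
```

The same structure is the pro-vigilance argument on terrorism: a tiny probability times a large enough loss can still dominate the cost of precautions.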

More Manzi, less Marxism.

Mr Hanokokutadatu
Jun 16 2013 at 7:51pm

[Comment removed pending confirmation of email address. Email the webmaster@econlib.org to request restoring this comment. A valid email address is required to post comments on EconLog and EconTalk.–Econlib Ed.]

TravisG
Jun 17 2013 at 2:01pm

It was only briefly discussed in the conversation, but what about the efficiency gains of passive public policing measures like publicly funded camera programs?

Police protection happens in two primary forms: enforcement of laws and deterrence of violations. While I am not an expert on the effectiveness of cameras in deterring crime or leading to the punishment of offenders, for the sake of argument let’s assume they’re comparable.

Given the high public cost of employing police, one could argue that publicly funded camera programs are a more efficient use of taxes for both deterring and punishing crime. This would hold for both short-term and long-term costs when you consider the legacy costs associated with public pensions and healthcare.
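
To make the efficiency claim concrete, a toy cost comparison (every figure here is an assumption for illustration; real numbers vary widely by city and program):

```python
# Amortized annual cost of one surveillance camera vs. one police officer.
officer_annual_cost = 150_000      # assumed salary + benefits + pension accrual
camera_unit_cost = 5_000           # assumed purchase + installation
camera_lifetime_years = 5
camera_annual_maintenance = 500    # assumed

camera_annual_cost = camera_unit_cost / camera_lifetime_years + camera_annual_maintenance
print(f"one officer's budget covers ~{officer_annual_cost / camera_annual_cost:.0f} cameras a year")
```

Whether roughly a hundred cameras actually deliver one officer’s worth of deterrence and enforcement is the open empirical question.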

Would the loss of liberty be balanced out by efficiency gains? I have seen this issue discussed, and I think the potential gains in the use of tax dollars get lost in the discussion of infringement on liberty.

James
Jun 24 2013 at 10:52am

Ali McMillan,

It is the binary belief that something is either “uniformly” good or evil that is the real problem here. Many seem to think that just because the U.S. government/Microsoft/Halliburton/Monsanto/etc. isn’t a perfect beacon of justice, they must be giant engines of evil. That is a laughably childish worldview, yet it seems to pervade the internet.

Chas
Jun 24 2013 at 1:46pm

As a fan of Schneier’s views and work in general, I was surprised to find myself cringing many times during this podcast, and ultimately didn’t enjoy it nearly as much as I expected I would. As interesting as his views on security are, I found Bruce’s discussion of power/corporatism to be unsophisticated to the point of being almost juvenile at times (Exxon doesn’t need sales to have power?), and at other times just lacking nuance. He seemed to miss the agency problem when talking about “corporations aren’t people,” despite Russ’s subtle prodding, and was inappropriately, categorically dismissive of many things (corporations don’t care about their reputations, “no one would care” if Lord & Taylor started publishing embarrassing pictures of people, etc.). His understanding of convexity/fragility/payoffs in the context of power was surprisingly lacking (Google can never fail?), given that in the context of security he seems to implicitly grasp these effects (security is fragile to over-specificity in the threats we guard against, emergency response and intelligence-driven heuristics are more robust, etc.). I think the conversation would have benefitted from jettisoning the power discussion altogether and focusing on security, where Schneier is a real rock star.

On the other hand, the above notwithstanding, the dramatic irony of the NSA discussion alone made this worth listening to. Pretty awesome.

vikingvista
Jun 29 2013 at 1:33pm

“There’s a social cost to not being on FaceBook.”

That social cost doesn’t include being apprehended by armed thugs and thrown into a cage.

His equating of government power with market power is exceedingly unwise, and all too common.

vikingvista
Jun 29 2013 at 2:12pm

I left FaceBook because they violated my privacy. I will never trust them again. It was easy. I haven’t missed it, and I have no intention of ever going back.

I use Google data services extensively. If they abuse my trust, I will leave them in about 24 seconds, as they have free competition for nearly everything they provide. Yes, I would retain my gmail addresses because they are already distributed. But I would forward them all to my new account and set up an automatic reply informing of my new address.

He is trying to make a point about market dominance, but the worst possible examples for him to use are Internet companies, which are exceedingly competitive, with tiny upstarts capable of easily luring customers away from the big players.

Petersec
Jul 2 2013 at 6:22am

The Internet empowered the powerless and also the powerful. If we look at the Internet strictly from a technological point of view, it is clear that the powerless have everything that is needed to fight the powerful; they just need to use a little more encryption, anonymity, and common sense. Yes, Facebook and Google can learn anything about you, but nobody is forcing you to use them. The choice is ours.

Chris
Jul 9 2013 at 7:08pm

Did Schneier even want to do this interview? It felt like he mostly stuck to some standard talking points and wasn’t interested in an in-depth discussion on the hobby horses he glossed over (which is too bad, because his hobby horses are highly relevant to mine).

This would have been a much better episode if it focused on one or two topics in-depth, and if Schneier came into it without the attitude that his existing analysis was complete and perfect.

Comments are closed.



AUDIO TRANSCRIPT

 

Time
Podcast Episode Highlights
0:33 Intro. [Recording date: May 13, 2013.] Russ: A lot of people argue that the Internet has empowered the powerless; and there's a lot of evidence for that. But you accept the point and add, well, it's also empowered the powerful. How has that happened? Explain. Guest: Well, it's interesting. When we first got the internet it very quickly became clear that the powerless, the unorganized, the disenfranchised were able to use it to organize, to gain power. And we saw this from the banal to the important. We saw it in Wikipedia, in internet chat rooms, in web pages for every activity or interest or proclivity--you could quickly find people like you and organize. And this also worked for political movements. We saw President Obama use the internet in an unprecedented way. We saw the Arab Spring, where in countries like Egypt and Libya, dissidents used the internet to organize and to push for political change. And that worked. You saw a lot of writings in the 1990s that talked about this. In 1996, John Barlow wrote this great document, A Declaration of the Independence of Cyberspace. And he said something like, to governments: You can't rule the internet because you have no ability to control it. It was a utopian dream, and we kind of all believed it. What we didn't realize is that, more broadly, internet technology magnifies power. And if you have power already, it will also magnify your power. But it happens slower. Right? Governments are institutions; they don't move as fast. They have bureaucracies; they have ways of doing things. And it really took them 10-15 years to figure out the Internet. But now we are living in a world where the powerful use the Internet to increase their power. So, the National Security Agency (NSA) uses its power to spy on people. The Chinese government uses the Internet to spy on people. The Libyan government uses the Internet. Large businesses are using the Internet to increase their power. And it's not just Google and Facebook. It's things like big media: the movie industry is using it to crack down on file sharing. So, we're seeing a broad increase in the people and the institutions using the Internet to increase their power. And it's a little bit of an open question where this balance will end up, between the powerful and the powerless--which one will benefit more and who will end up with the upper hand. I mean, we know from history that different technologies will perturb things. We're not sure what will happen in this case. Russ: So in a recent podcast we did with William Bernstein, he gave an example of the radio, which is a fascinating example, how the radio was originally used by tyrants to spread propaganda by the oppressor to keep the oppressed oppressed. But eventually the oppressed were able to use that technology to liberate themselves by being able, in the case of the Soviet Union, to listen to Radio Free Europe; to get information and other broadcasts about the world that they didn't otherwise have. And certainly one of the most important parts of the Internet is to allow me, as a citizen, to get information about what is actually going on. Just some measure of the truth. And totalitarian governments' control of that information is extremely important. So on the surface you'd think the Internet has made it much harder for tyrants to control that flow of information.
The only thing that worries me sometimes is that the Internet would get cut off, or highly censored by the powerful, and keep me from that flow of information. Is that something I should worry about, either in the United States or elsewhere? Guest: I mean, it certainly is. And it's not just governments I would worry about. Companies are doing it as well. You know, it's interesting. If you think about Internet control, there are several levels of control you could have. The best level is fine-grained control, where you can selectively censor, selectively monitor, add and delete messages. If you can't do that, the next-best thing you can do is to eavesdrop. You can just passively eavesdrop on everything. And if you can't do that, then the next-best thing is to destroy it. To shut it down. And we'll see governments doing all those things, depending on their technical capability. I worry about not just government censorship--which is obvious to see. You see it in China--the Chinese do a lot of censorship. And there are censorship projects which monitor countries and censorship. And you see more of it every year. I happened to be in Saudi Arabia last week and the Internet was heavily censored. I couldn't get to either porn sites or the Jerusalem Post. And more countries are doing that. Also you have to worry about, not really censoring, but message control from companies like Google. We know that your Google search results depend in part on your personal Google history. The kind of things you see depend on what you've asked for and looked for in the past. So your Internet is in some way censored, not with a political agenda, but with some agenda that you don't understand. So, a paper published a couple of weeks ago looked at the possibility of Google tipping a presidential election--and it was sort of interesting, it was a psychological study, a thing that--we know that people's impressions depend on their past experience--the question was: Can you modify someone's impression of a political candidate based on the sorts of answers you show him? And the answer was: Of course. Russ: Oh, yes you can. Guest: Of course you can. So, you can have a situation where Google, or a company in that position, could decide, for whatever reason, to show more positive articles about Candidate B, more negative articles about Candidate A. Given that we live in the United States, a country with a very slim margin between the two parties, it doesn't take a lot to shift a small number of people, and therefore the election. And what the article said is not that Google would do this, but that they could do it. And, which I think is interesting, that it wouldn't even be illegal. It would be a perfectly legal thing for them to do. This is a corporation providing a free service. They can do it however they want. There is nothing to say that they have to only show balanced articles, or a balance of articles for any particular candidate. So Google could tip an election. [?] asked. Which is a very different question, which is: Does that mean that search should be regulated? And made more transparent in our country? That is, that it is too important to be opaque and a trade secret.
8:33 Russ: Of course, some people would like it that way. They'd like to get biased results, which is why they visit certain websites that Google would have noticed, etc.-- Guest: But they want biased results in the bias they want. Russ: Of course. Guest: This would be an example of biased results with some hidden bias you don't know about. Russ: Sounds like the good plot of a movie. Guest: It is. But you could see the same thing in advertising. What happens--I'm just making this [?]--Coke pays Google some amount of money to show better articles about Coke and worse articles about Pepsi? Perfectly reasonable business model. One we don't believe Google is engaging in right now. But they certainly could. And here it wouldn't tip an election. It would tip a market. Russ: Of course, we know we don't have to worry because Google's motto is 'Don't be evil.' [sarcasm--RR]. So there's nothing to worry about. That is very interesting. And I think, obviously, as you said, this is legal right now. Search engines have, I'll call them, 'proclivities'--their proprietary algorithms. Now if that came out, I think it would be devastating to Google. It might be hard for it to come out; they could maybe cover it up. But I think what right now regulates that behavior is the impression we have--which may not be true--that if they did that it would be very costly. Guest: But we know that as regulation goes, that would be pretty crappy. I mean, we know that again and again, most recently in the 2008 financial crisis--that 'if this ever gets out, we'll be in trouble' is a really, really bad regulatory regime, and not one that's going to work-- Russ: I'm not so sure. Guest: and nothing resembling reasonable. Russ: I'm not so sure. That would be another topic. It actually works quite well most of the time. Guest: The problem is, 'it works quite well most of the time' is actually very bad. Since you have such major things. But I think-- Russ: I think the reason it fails has to do with other forces, not due to the basic idea. I think the basic idea is that your brand name and your reputation are quite powerful and important self-regulatory mechanisms. They are limited. I'll agree with that. Guest: But remember the whole corporations-are-people thing. You forget they are actually not people. It's not a corporation saying, say, My brand, Coke, is important. It's a guy saying, I'll get a bonus in 8 months if I do this thing; and when everything just completely explodes, it'll be someone else's problem. There's like--the incentives get very skewed. I mean, you have this weird corporations-are-people kind of feeling. Individuals making decisions that affect larger entities. Russ: Agreed. Corporations usually have an incentive to monitor that misbehavior. I would argue in the financial sector we took out the natural forces that would give them that incentive, and they didn't monitor it; and there was a lot of rogue activity, and rogue decisions were made that were very destructive. But my argument is that we've destroyed the natural feedback loops that would have encouraged that to stop. Just to take an obvious example, if they were spending their own money, which they usually are not, they would have acted very differently. And they were able to spend other people's money partly because, I think, of a failure of a different part of the regulatory regime. Guest: It depends who the 'they' is. The 'they' is someone in a corporation getting a salary, so he's never spending his own money.
He's spending the money of his employer. Russ: That's true. Guest: There's no such thing as the corporation spending its own money. Because it can't do that. The verb 'spending' only applies to human beings. Russ: Well, that's true. Guest: A building can't spend. People in a building can spend. And I think that's the confusion that people just miss a lot. Because it's a convenient shorthand to say, 'General Motors did this.' As if General Motors could do something. Russ: That's correct. It's a shorthand. It's often misleading, or uninformative. And the example that I'd give--I'm sure one of us will use it at least once during this podcast--is the word 'we' to describe something done by the government or the United States. Which is the same problem.
12:45 Russ: I want to go to a different issue that you just touched on a minute ago, which is eavesdropping. You talked about how the powerful would like to manipulate individual fine-grained messages--fine-grained manipulation like throwing them out, etc. The next best is eavesdropping. How much eavesdropping, that you know of, is going on in the United States right now? Guest: A lot of this is secret, so we don't know. We do know that Google eavesdrops on everything you do. Facebook eavesdrops on everything you do. If you are browsing the web, there are tools you can use; and there's one tool called Collusion. I was reading some report, I think from the New Yorker, where someone using the tool found that 130 companies were eavesdropping on him as he wandered the Internet. So there's an enormous amount of eavesdropping built into the business model of the Internet. So that's one half of it. The second half of it is government eavesdropping. And really, we don't know. There's a belief that the National Security Agency (NSA) is eavesdropping on as much as it can. By law it is prohibited from eavesdropping on Americans, although that has gotten very fuzzy. We're not very sure what they are doing. They are building a large data center in Utah, which seems to be to collect, basically, what people are doing on the Internet so they can search it in real time or search backwards in time. Exactly what they are collecting, we don't know; how they are collecting it, we don't know. Now, the NSA eavesdropping on phone calls--that was a big scandal a few years ago. Other countries openly say they do it. Like China: it's no secret they eavesdrop on the Internet. And the disturbing thing is that a lot of the tools go back and forth. So, tools that are built in the United States to facilitate corporate eavesdropping--for example, you might work for a large corporation and they don't want trade secrets leaving their network, so they put in something called data loss prevention, which is basically eavesdropping on what people are sending out of their particular accounts to make sure no corporate secrets are being leaked. That same technology can be sold to countries like Egypt to do the same thing, to eavesdrop more generally. And there is this back and forth between government power and corporate power. And the same tools are doing both. Because that's exactly what happened in the United States: we know that corporations eavesdrop all the time--and government does get access to that data, either through warrants or through agreements. Sometimes they ask nicely. And again we don't have a lot of solid information about this. And there's also direct government eavesdropping. Russ: So, in the Boston Marathon bombing, a set of surveillance technology was used to apprehend the bombers, or to identify the bombers. The alleged bombers. And some of that I assume was private. It was private in the sense that it was crowd-sourced--photos and videos that people at the time took. There were also surveillance cameras that were from local companies; Lord & Taylor supposedly had a-- Guest: And they willingly gave it to the government, perfectly reasonable--here, use this, see if you can find anything. Russ: And the government has its own surveillance cameras in certain places; I don't know if they were in that spot, but I think I'm right that in certain spots in the United States there are government surveillance cameras. And these give me the creeps, personally. Guest: Which ones? Both, I hope.
Russ: Well, mostly, again, the government ones, because it's much harder for me to opt out of government mistakes than private mistakes. Guest: Actually, it's not true. You cannot opt--let's go back to Boston. Russ: Sorry. Let me choose a different vocabulary. If Lord & Taylor took, say, ugly photographs--meaning photos of people making awkward faces, embarrassing clothing, whatever it was--and put them on the Internet for entertainment, people would get upset and some people would stop shopping at Lord & Taylor. It's a lot harder for me to leave the United States. That's all I meant by 'opt out.' I apologize--it was sloppy vocabulary. Guest: Yeah, I don't know. I guess I really dislike the: if government does it it's bad but if a corporation does it, it magically becomes good. Russ: I didn't say that. Guest: Yeah, but if a company decided to post embarrassing photos of people--I'm going to make this up: Lord & Taylor hires some guy to walk around Boston taking embarrassing photos. And some people won't shop there. Some people will shop there twice. Russ: Well, maybe. Guest: But the people who are having embarrassing photos taken have no say in this. You can't opt out. It's just as hard. Russ: I don't have any problem with legislation or laws that prevent people from using my picture, my work, or aspects of me to humiliate me or invade my privacy. I don't disagree with you there. Let's take a different word. Let's take censorship. So, if somebody's listening to this right now and they want to write an angry, obscene, and rude comment on this EconTalk podcast, we censor those. I don't like that word, and I won't use that word--we moderate and edit the comments to this podcast. Guest: I do the same thing on my blog. You must maintain a civil tone. You can disagree, but just don't be a jerk about it. Russ: Correct. And people will often accuse us of censorship. And I want to reserve that word for the use of the power of the state, which has a monopoly on coercive power--it can arrest me, put me in jail. I want to reserve that word to mean the control of information by the central authority. Now, I'm not going to disagree with you that sometimes eavesdropping, private or public--or let's call it corporate or government--can be bad in both cases. But I don't think they are the same. Guest: Okay. I don't know. This gets into--there's probably a different argument that I'm not really equipped to have right now. They are closer than I think a lot of people think. This is the great libertarian fallacy: it was a really good idea back in the mid-1700s that power is dangerous and we should look towards a society that equalizes power. But I think what that philosophy missed over the past few hundred years is that power has shifted. That it is not just government power. That corporate power, in some ways, is a lot more powerful than government. I was reading this great book called Power, Inc., which talked about the rise of corporate power, and the point the author makes is: imagine climate change legislation--who do you think has more power, Bolivia or Exxon? It's not even close. Who has more power--the United States or Exxon? Well, it actually is close, and we're not sure. The nexus of power on our planet has shifted. And if we care about liberty we can't ignore that. It is, I think, just as hard to leave Facebook as it is to leave the city you live in because you don't like the government. Russ: Yep. I want to come back to that.
Guest: It is so--these technologies are so pervasive and so embedded. Try living without a credit card. That's really, really hard. And unless credit card companies compete on whatever feature we are complaining about, basically we have no choice. And this to me is the interesting thing about power and security: how those in power, whether government or corporate, are using security, using the Internet, to solidify and increase their power. And a lot of my main worries are about when the two get together. Russ: We agree on that. Guest: If you think about it, there are two types of law in our society. There's Constitutional law, which constrains government. And there's--I don't know what to call it--regulatory law, tort law--which constrains corporations. And what we are seeing is that both groups have learned to bypass their limitations by using the other's law. And I think it's very dangerous. So the government is using corporations to get information it can't get otherwise--to get around limitations on what government can do. Really important limitations; they are going to corporations and getting help. Corporations, who are prohibited from doing certain things, are going to government and getting help. So, the movie industry tries to get government to pass a law to enforce its business model. That's something that seems like an absolute disaster. So, it's this back and forth that really worries me.
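A rough sketch may help make concrete the data-loss-prevention scanning Schneier describes above. The following Python is a minimal, hypothetical illustration--the patterns, names, and messages are invented, and real DLP products use far richer rules--but it shows the basic mechanism: every outbound message is read and matched against a rule list.

    import re

    # Hypothetical rules a data-loss-prevention filter might apply.
    # Real products also fingerprint known documents, classify attachments,
    # etc.; these regexes are purely illustrative.
    SENSITIVE_PATTERNS = [
        re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                 # U.S. Social Security number
        re.compile(r"\bconfidential\b", re.IGNORECASE),       # document marking
        re.compile(r"\bproject\s+nightjar\b", re.IGNORECASE), # made-up internal codename
    ]

    def is_sensitive(message: str) -> bool:
        """Return True if any sensitive pattern appears in the message."""
        return any(p.search(message) for p in SENSITIVE_PATTERNS)

    def filter_outbound(messages):
        """Read every outbound message and split it into allowed vs. flagged.

        This reading of all traffic is the 'eavesdropping' in the episode:
        the same mechanism that blocks leaked trade secrets can, pointed at
        citizens instead of employees, monitor a population.
        """
        allowed, flagged = [], []
        for m in messages:
            (flagged if is_sensitive(m) else allowed).append(m)
        return allowed, flagged

    if __name__ == "__main__":
        outbound = [
            "Lunch at noon?",
            "Attaching the CONFIDENTIAL Q3 roadmap.",
            "My SSN is 123-45-6789; please update my file.",
        ]
        ok, caught = filter_outbound(outbound)
        print(f"allowed: {len(ok)}, flagged: {len(caught)}")  # allowed: 1, flagged: 2

The point of the sketch is that nothing in the mechanism cares whose secrets are being protected, which is why the same technology exports so easily to more general surveillance.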
22:35Russ: I absolutely agree with you. And I think the main reason that Exxon is more powerful than Bolivia is that they've figured out a way to use the political power of the U.S. government. Guest: Right, they hacked it. Russ: Their economic power is very limited, because I don't have to shop there. And that's the part of competition that's-- Guest: There's a lot more to economic power than you having to shop there. The externalities are enormous. And they actually couldn't care less whether or not you shop there. It's not going to matter to their power. And it's much bigger than a country. Russ: Well, if nobody buys Exxon gasoline they are not going to have a lot of power. Guest: Yeah, but most people don't care. And they know that. Russ: Well, let's get to that. The question is: Why don't they care? I brought up the eavesdropping and the surveillance, and I brought up the Boston bombers--we got off on a digression there, but the reason I brought it up is this. I talked about the different kinds of surveillance that took place to identify the alleged bombers. We had people taking cell phone photos and regular photos, and we had surveillance photos from Lord & Taylor and other stores; and we could have had--I don't know if we had it in this actual case--government surveillance cameras. And I mentioned that they bothered me. You said I should be bothered by all of them. Perhaps. But the point I want to make is that most people aren't bothered by any of them. And in particular, this kind of case, where the case was cracked--it seemed to be; we don't know, they are the alleged bombers--but the suspects were identified very quickly--is going to encourage a lot of people--not me--to say: This is great. We have to have more of this. We have to have more surveillance, because we have to make sure that people don't get away with things like this. And so, I wanted you to talk about the role of the average person's lack of concern, which is what you were alluding to in this last comment, in facilitating this kind of increase in surveillance and eavesdropping. Guest: Let's take the more general question first. Yes, there's a lot of lack of concern. And the basic reason is diffuse versus concentrated interests. When you, me, anybody go about our day, we have a million and a half things to worry about. And Facebook eavesdropping and government surveillance cameras--all of those things are piled up with: I have to drop something off at the Post Office, and go grocery shopping. And it's very hard to get riled up about one particular issue because we've got so much to do. Meanwhile, on the other hand, the U.S. military, Google, Microsoft--they all have lobbyists. They all have people whose dedicated job is making sure their interests are being pushed. And it is extremely hard to deal with that imbalance. So, you have police that say: We love cameras. We get to sit in our police stations; we don't have to go outside where it's cold; the cameras do all the watching; and look at all this cool high tech. You have a camera industry that pushes all of this, that pushes cameras in schools--here are some free cameras, let's get people used to it; here are some great camera success stories. So, there's money in cameras, there's power in them. And that's hard to array against a general populace that's just busy with a hundred other things.
Now, if you totaled up the anti-camera sentiment, I think it would be greater than the pro-camera forces, but it's just so diffuse you never get a chance to total it up. And that's fundamentally a problem. Russ: Isn't the other factor--it's a great point about how I've got a million things to do while they're focused. I think that's a general problem in legislation. Guest: In everything, right. Russ: And in how it gets steered toward certain interests. But isn't it also that most people don't even see it as a negative? It's not that they are not aware of it. They see it as: I don't have anything to be ashamed of, I have nothing to hide, what's the big deal; the government is going to find the bad guys, and I'm one of the good guys. So I shouldn't worry. I think that's most people's attitude. Guest: I think there's a lot of that. I mean, there are a few things going on. The first is fear. When someone is actually afraid, they'll do pretty much anything not to be afraid any more. And if they are told: terrorists, terrorists, terrorists, fear, fear, fear--I know, we'll save you: cameras. They'll say: great, save me. Read my mail, put cameras in, make sure there are no spectators at any marathon between now and whenever--whatever dumb thing you are going to do, just do it and make me feel safe. And that message is being pushed--and it's a propaganda message--by government, by police, by the vendors of whatever technology is being sold-- Russ: Equipment. Guest: Right. And people, not knowing better, will believe it because it sounds plausible. Cameras caught the bad guy, therefore cameras are good. We can argue whether cameras did catch the bad guys--it's not obvious to me that they did, or at least that the bad guys wouldn't have been caught otherwise. Did cameras happen to catch the bad guys, or were cameras necessary to catch them? Necessary is the important question. But this is a subtlety that is going to be lost in an average conversation. So the first thing is fear. The second thing is privacy: like any right, you tend to notice it only when it's gone. Russ: That's right. Guest: It's easy to say: I have nothing to hide. I'm asked about that pretty regularly on the radio. And when someone says, I have nothing to hide, why do I care? I'll say: What's your salary? And they'll say, um, um, um, um; and I'll say: See? Russ: And that's an easy one, too. Guest: Because something to hide isn't about illegal activity. It isn't about something I'm ashamed of. It's about how you present yourself to the world. It's not about secrecy versus non-secrecy. I will go to a doctor and take off my clothes, but it doesn't mean I'll do that-- Russ: on Facebook. Guest: On Facebook. And it's not because I have something to hide. It's because it's a different context. And our notions of privacy are very complex. And there's also, I think, a belief--and this, again, you don't notice till it's gone--that the powers are largely benevolent. Of course you don't care if the police read your email, because what do they care? And it's only in those scary regimes of the middle of the previous century that the police state did those nasty things. Except that's not true. It's true today in certain countries. And you and I know that when you give power--and this is actually true for government or corporate power--when you give power to an entity, you will have abuses. And the more power, the more abuses, and the more potential for abuses. And this is why you always temper power. That is also a very subtle argument.
So, I think the basic reasons are multiple: that when people are scared, they're willing to do whatever it takes not to be scared; that the privacy arguments are subtle and hard to understand; and that the negatives from lack of privacy you only notice when you are missing them. So that's the combination that makes this a difficult conversation. Russ: Yeah, and then I think you get to a--you are on a slippery slope, potentially. Right now, we are in the middle of a--I don't know how serious a scandal it's going to turn out to be, but there is now some evidence that the Internal Revenue Service (IRS) may have used political considerations in investigating various tax-exempt organizations. Guest: And those accusations are--I mean, how old are they in our country? A century? More? Russ: Sure. Guest: Every time. Power is tempting. You are sitting there, you are in power, you have this lever. It's going to be really hard to say: that would be wrong. Because you are skewed. You are doing it for what you believe is some greater good. This is the same reason we are torturing people. We were blind to the fact that it was a really, really bad idea.
31:33Russ: So, let's talk about this corporate issue. You put it in a very interesting way in a recent essay: you talked about feudalism. And again, that might be a little over-dramatic, but why don't you describe what you mean by that and flesh out some of the points you were making earlier about how we depend on Facebook, Google, Amazon, etc. Guest: This is sort of an effect of the way technology is moving. We are moving toward less control over our stuff. We are putting more and more things in the cloud. And this really has to do with the cost of data storage versus the cost of data transport versus the ease of access. Now, it turns out that for most people it's easier to put their email on some cloud provider--let's say Gmail or Hotmail; to put their photos on Flickr or some other photo service. Russ: Dropbox. Guest: Google docs, address book somewhere, calendar somewhere. This is just a lot easier. Russ: It's fabulous. Guest: Oh, it is absolutely fabulous. All these things are being done for really good reasons. At the same time, we're losing control over our consumer devices. So, if you have an iPhone, you have a lot less control than you had over a Windows box. For an iPhone, all software has to be approved by Apple; it's only sold out of the Apple store. Updates are approved. If you have a Kindle you don't even have a choice about whether to load an update--it happens automatically. And the basic observation is that people--and I see this in my friends--pledge allegiance to a particular company. So I have a friend who has a Google phone. And she has her calendar on Google, and her email on Google, and her address book. And it's all fantastic. It works great. But she is essentially-- Russ: She's a vassal. Guest: She's a vassal of Google. Other people do the same thing with Apple. Microsoft is trying to get in that position, to be the one-stop shop for all of your stuff. And at that point, we are vassals--probably a good word--or serfs, depending on the way you want to take the metaphor--of these companies. And the reason I call it 'feudalism' is that those companies don't necessarily have your best interests at heart. In a lot of cases you are not even their customer. They are offering you this service because they want you as part of the product they sell to their customers. So they are enticing you into their system to give them all of this data and information that they can basically sell behind your back without your knowledge or consent. In theory you know, but ask most people: no one reads the agreement they are supposed to agree to. Russ: Never. Guest: And a lot of people don't actually know. And when I look at feudalism--there's a really interesting argument, which I'm not sure I buy, but I'm going to present it because I think it's really interesting. I was reading Rebecca MacKinnon's book, Consent of the Networked. She didn't use the feudal metaphor. What she pointed out is that, you know, back in the feudal era, the powerful--and these were essentially governments; these were lords--had a lot of rights over their vassals and serfs, but basically no actual responsibilities. They could renege on their agreements all the time. If you watch Game of Thrones, you see how it works. When anything bad happens, the serfs get completely pummeled. They have no rights and they are just collateral damage in these large battles that are happening.
And it took the Magna Carta for the people to get to say: Hey, you governments, you don't just get rights; you have responsibilities as well. And right now, with these large companies, these Apples, Googles, Facebooks, it's very much the same thing. They have rights but no responsibilities. They can do what they want. And your only recourse--and it's made deliberately hard to do--is to leave. If you wanted to leave Facebook, you can't take your data with you. All you can say is: Here, keep all the stuff I've given you to date; I'm going to walk away. Although, depending on what social group you are in, that's actually very hard--you don't get invited to parties any more, you lose all your friends. The network effect and the lock-in are so great that often being able to leave is more illusory than real. The social penalty is enormous. And of course, these companies count on that. That's part of the deal. That's why you try to be so big--so that it's so valuable for people to be there. And the idea is: Do we need something Magna Carta-like to force these corporations to have responsibilities? You see some of this in Europe: a right to inspect the data that they have about you; a right to be notified if it's being sold to a third party; the right to delete it; the right to take it with you when you leave. Now, I can make up more of these rights that sound plausible. Do we need to establish these rights against these new nexuses of power that have emerged, which are highly coercive, in some cases highly abusive, and in all cases potentially highly abusive? Russ: I don't know if I want to agree with the coercive part. Guest: Yeah. I guess [?]. I get that you wouldn't want to agree with that. Russ: I heard you hesitate there. But it's certainly true that there are large costs to leaving these fiefdoms that we've embedded ourselves in. We've chosen to embed ourselves. I think the biggest difference--the problem with the metaphor is that there weren't that many vassals who wore a t-shirt advertising how great their feudal lord was. The fact that people wear t-shirts and love being part of these groups suggests that, as you say, it's convenient; right now, it doesn't seem to be abusive. It may be, and it certainly has the potential to be--I agree with you there. And then I have this worry that if we go through the government, that issue you raised earlier is going to come to the fore, which is the ability of the players to manipulate that system. I prefer an end around. Let me just suggest a couple of ways we might get to a different world. One is that you could start a company that had a different set of default and opt-in options. Guest: Well, be careful. Lock-in is a big deal. Russ: Oh, I agree. Guest: You could start a competing Facebook, but if nobody's on it--we're back to most people don't care most of the time. Russ: Right. Guest: I'm not on Facebook, because I'm a freak. But this is increasingly hard. I remember, many years ago, I would have regular parties--I would send postcards to friends. And there was a time when I switched from postcards to email. Pretty much everyone had email. And there were a few people that didn't have email addresses back then, and I would have to remember to send them a postcard. And I would invariably forget. So those that weren't on that technology effectively stopped being invited to my stuff. Not on purpose. Russ: I understand. Guest: Because that's the way the world works.
These days, I'm noticing that I'm not being invited to things because I'm not on Facebook. There is a social penalty for not being on Facebook. DuckDuckGo.com, for instance, is a competing search engine, and their business model is: we do not collect your data. They are very niche--because we are back to: most people don't know the problem, most people don't care. And because they are so niche, they are getting less revenue; their search engine is less good. The powerful accrete power, and it's hard for an upstart to break in. It's not an easy competitive market like building chairs in an open-air marketplace. Russ: Right. But don't you think it would be easier for DuckDuckGo to gather customers if Google becomes more abusive? Guest: Maybe. It depends. It depends on whether it's publicized. If Google is in charge of getting people to read articles about how abusive Google is, then maybe not. There is a lot of stuff going on here. Russ: It's true. Guest: If you ratchet it up slowly, people aren't going to notice. There are a lot of ways to play this game. And there's sort of another thread we could pull in--that we as a species are getting so good at the science of psychological manipulation. We are putting people into brain scanners and showing them political and advertising messages, and measuring how good the results are. We are getting so good at manipulation that persuasion, whether it's political persuasion or economic persuasion--buy my product--has gotten to the point where it's almost an unfair trade practice. So I'm not sure we have a clean market here where the better product rises or the better candidate rises to the top. Russ: It's imperfect. Guest: It's a scarily imperfect market at this point.
41:44Russ: Doc Searls, in a recent podcast we did on his book, The Intention Economy-- Guest: Great book. I enjoyed reading it. Russ: Fascinating book. So, he wants to create a world where buyers signal their intentions and sellers compete for those intentions. He wants a world where we control our data--where that's the default rather than the opt-in. Given how worried you are, how concerned you are about these issues, and given how relatively unconcerned the average person is--I'm somewhere between you and the average person--how do we get there from here? Other than the regulatory strategy, which is not what Doc Searls is pursuing, I think. A different business model. Guest: Yeah, he's pursuing a market strategy. What he's saying is: Look, I can build technology that will empower individuals by putting them in ad hoc groups. It might [?] be very much like the buyers' clubs--those in Japan; the kind of thing Costco and companies like it started as--where buyers would aggregate and give themselves more power. And I think this is great. I think his idea that you can do this with technology is very clever. And if he can, it would be great. I would worry about a backlash. Think about Priceline--they had this idea when they started: we're going to let buyers name their own price. And that turned into: we're going to sell tickets on sale and pretend buyers name their own price. But maybe the technology wasn't there yet. Because what Searls is talking about is a more nuanced, clever technology to aggregate demand and to allow buyers to establish their contract requirements, so it's not just a one-sided here's-your-clickthrough agreement where you either click or you don't--there could actually be some negotiation. And this would happen by agents, automatically, so it would scale. There's a lot of great stuff here. So this certainly could work. I like hearing him speak; I'm cautiously optimistic about what he's doing. Here's a way that technology allows those with less power to get more power. But I think you are going to have the powerful fighting back. Let's take an easy instance--the airlines. The top four airlines say: Well, we're not going to use that system. Suddenly you can either fly on a minor airline that's inconvenient, that doesn't get you there; or you can't use it. So, if you have a small number of dominant players, they could fight back; they could somehow regulate this tool out of existence. And I don't know how they'd do it. We see this with small agriculture--big agriculture gets laws passed that really hurt small farmers, because they don't want the competition. Russ: It's not just agriculture. Large firms love regulations that raise the cost to all firms. Guest: That was just an example. Russ: Because the smaller ones have trouble competing. Guest: So, I would worry about the powerful fighting back, using both corporate and government tools. Here's a great example of how the Internet can change a power balance. But what remains to be seen, in all this talk about how different groups are using power, is where the balance ends up. Early on you talked about radio. Actually, radio started as completely decentralized, completely for the powerless; then it was co-opted by governments because of limited spectrum--and they had a whole bunch of reasons why they had to control the airwaves. In certain countries they controlled it even more; it became a propaganda tool. But it also was a tool of empowerment--bringing, you know, Radio Free Europe, bringing messages to people who didn't have them.
So, when you look back on the history of radio, you probably could trace how radio affected the power balance and where it ended up. We're at the early days of that same graph for the Internet, and where it's going to end up--I kind of want to say it's anybody's guess. Maybe I'm giving short shrift to how much we can predict, but it seems really hard to me. So these are the early days of thinking about this. But with all of these things, it's going to be back and forth: the powerful using their power to keep the status quo as much as possible, and the powerless being nimble, trying to do end runs and runarounds. It's really interesting to watch.
46:32Russ: I agree. To me, the other issue that's fascinating in how the balance is going to work out is how much people care about it. And I think that is even more important in how we look at terrorism and the security issues there. I'd like to use our last few minutes to talk about that, as you've said so many interesting and provocative things there. You've been very critical of how we have used our resources to fight terrorism. What are the major mistakes you think we've made there? And again, as you pointed out earlier, a lot of these mistakes are driven by fear and the preferences of the average American, who is not particularly worried about the abuse of power right now. That could change. It's starting to change. I think there's a bigger awareness of it. But the combination of fear and a trust of what the government does has engendered a lot of practices that seem to me--and you've pointed this out--not very effective. Guest: So, I'll talk about two major mistakes. I could spend an hour on this topic. The first one is that we exaggerate the threat. And in a lot of ways this is an effect of the psychology of terrorism--that it's big, it's spectacular. The media repeats it endlessly. And in our brains we think it's a much larger problem than it is. We don't say things like: well, every month a 9/11's worth of people die in car crashes in the United States. We don't say that pigs kill more people than terrorists every year. We believe terrorism is this huge problem that needs an inordinate amount of security and spending to mitigate. So I think that's the first thing we get badly wrong. The second is that we worry about the specifics of what happened rather than the generalities of what could happen. So, we worry about terrorists taking over airplanes with box cutters. I mean, right now we're worried about the finish lines of marathons. It's almost magical thinking, that we somehow have to secure the finish lines of marathons in this country--because that's what the terrorists did last time, and obviously that's the place to worry about. We see this in airplane security. Think of the history. We take away guns and bombs; they use box cutters. We take away box cutters; they put a bomb in their shoes. We screen shoes; they use liquids. We take away liquids; they put a bomb in their underwear. We put in full-body scanners; they are going to do something else. Again, this is an overly specific focus on the details of the plot rather than the broad generalities. Those are the two major mistakes. Russ: I will say--I did fly yesterday, and I got to keep my belt on. It was a big day. It's one of the strangest--it's this strange form of theater, that if I take off my belt and my shoes for 30 seconds and let them pass through a metal detector, I'm somehow going to be safer. They should just have an incantation. You made a fabulous point in the aftermath of the Boston bombing that resonates with a lot of themes in this program, which is: after every event like this--and it's true of natural disasters as well, but it's particularly true of terrorism--there are always signs that were missed. And then there are recriminations--the phrase you used, which I love, is: People complain, why didn't we connect the dots? What's your answer? Guest: Well, that it's a crappy metaphor. The connect-the-dots metaphor--you know what a connect-the-dots picture looks like. There are a bunch of dots; they are all numbered; you connect them and you've got a picture of a duck. But that's not the way intelligence works.
Intelligence is: you have a million pieces of information; they are unnumbered; you don't know if any of them mean anything; and you are supposed to pick a terrorist plot out of it. It's a very different analysis. And getting back to psychology, there is something called hindsight bias. We as people over-estimate how obvious something was after the fact--that's the best way of putting it. After the home team wins the football game, we all say: Well, it was obvious that they would win, and we list all the reasons. And if the home team had lost the football game, we would have had the opposite conversation: It's obvious they would lose, and here are all the reasons. Well, it turns out it's only obvious in hindsight. Once you know the story, it's easy to pick out the pieces that make a good story. But before the fact it is extremely difficult. And this is important: things that are perfectly reasonable to do at the time might seem irresponsible in hindsight. And that is a bias. Russ: And there's no way to avoid it. Guest: Connecting the dots--it's not connecting the dots. It's finding a needle in a haystack. That's the correct metaphor.
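Schneier's haystack point can be made quantitative with a standard base-rate calculation. The numbers below are invented purely for illustration, but they show why even an implausibly good detector, applied to a whole population, buries the real plots under false alarms:

    # Base-rate sketch of the needle-in-a-haystack problem.
    # Every number here is hypothetical, chosen only to show the effect.
    population = 300_000_000       # people whose data is scanned
    real_plotters = 10             # actual needles in the haystack
    true_positive_rate = 0.99      # detector flags 99% of real plotters
    false_positive_rate = 0.001    # and 0.1% of innocent people

    flagged_real = real_plotters * true_positive_rate
    flagged_innocent = (population - real_plotters) * false_positive_rate

    # Probability that any given flag points at a real plotter:
    precision = flagged_real / (flagged_real + flagged_innocent)
    print(f"innocent people flagged: {flagged_innocent:,.0f}")  # ~300,000
    print(f"chance a flag is real: {precision:.4%}")            # ~0.0033%

Under these assumptions, roughly 300,000 innocent people get flagged for every ten real plotters, so virtually every lead investigators chase is a false alarm--which is why "more data" alone does not find the needle.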
51:50Russ: So, given that problem--I think you pointed out that the number of people on some watch list was 700,000--the amount of resources that it would take-- Guest: Right. We can't actually watch them all. Russ: We could, but we would be very, very poor as a society and as individuals. What's the right way to think about how to respond to that reality? Because the natural impulse I think a lot of people have--which I think is wrong--is: We just have to fix it. We need to reduce the risk. We need to, like you say: now we just have to secure marathon finish lines, that's all; and have body scanners; we'll just cordon off Boston at the next Boston marathon, and to get into the city you'll just have to go through the scanner and da, da, da, da. And if you do that, besides the quality of life dropping off and the worries about surveillance and the abuse of power, it's really, really, really expensive. So, what do you recommend should be our response as citizens and voters to this problem--which, as you correctly point out, is somewhat if not greatly exaggerated, and our ability to stop it is also exaggerated? Guest: I think we have to accept the risk. We accept the risk of getting in a car. It's really hard to find an activity where driving there isn't the most dangerous part of the activity. Maybe sky diving. By far the taxi ride to the airport is the most dangerous part of a plane ride. The drive to the marathon is the most dangerous part of the day. So we are able to accept risks. We tend to accept risks that are normal parts of our life. Terrorism is rare and spectacular; plane crashes are rare and spectacular. So we exaggerate them. The proper response is to accept that life entails risk. And that's okay. That's not bad; that's not a failing; that's not something to fix. That is part of being in the world. For many years in our country we have recognized that the price of liberty is the possibility of crime. We deliberately reduce police power because we have a better society for it, even though the occasional criminal gets through. Those sorts of tradeoffs, those sorts of acceptances, become harder as we live in a world where risk systematically gets removed. Where medical science, where product safety--where all of these things reduce risk, suddenly we look at our residual risk and we are aghast. What do you mean, we haven't fixed terrorism? We have warning signs on ladders, for heaven's sake. Right? We don't allow children to swim in pools unattended any more. We know better. What do you mean we can't fix terrorism? Go fix it. That's a perfectly reasonable reaction in a society that has just gotten rid of risk after risk after risk. Here's another one; just get rid of it. Can't I take a medication to get rid of this risk like I do with all the other ones? We need technology to save us. Russ: The other side would argue--and I'm not sympathetic to this, but it is possibly true--the other side would say: well, look, it's true we spend all this money, we've implemented these last-war[?] efforts, fighting the last war--the box cutters, the underwear, the liquids, etc. But look how well it's worked. And the only reason your strategy of acceptance is okay is because it has been only a few thousand people over the last decade and a half. If it were every month, then it would be a serious problem. But the reason it's not every month is because we've stopped all these plots along the way. What's your answer to that? Guest: There are a couple of things.
One: car crashes happen every month, and we don't treat them as a serious problem. So it's not obvious that terrorism every month would be treated as a serious problem. It's possible we as a society would be more accepting of it, as with car crashes, if it happened every month, because it would be normal. It would be weird, but that's possible. The other thing is: because terrorism is so rare, it's really hard to evaluate the argument: we haven't had any terrorist attacks, therefore things are good. With something like crime, which might happen several times a night, we can notice, when we introduce a security countermeasure, whether it increases or reduces the crime rate. With something like meteor strikes against the planet, which happen, what, once every 800 years, 1000 years--it's really hard to judge whether your security works, because there aren't enough instances. I remember, a couple of years after 9/11, then-Attorney General Ashcroft was in my home city of Minneapolis giving a speech. And one of the things he said was: It's been 2 years since 9/11 and there haven't been any more terrorist attacks; and that's proof my policies are working. And I was listening to him, and I thought: Well, there were no terrorist attacks in the two years before 9/11, and you didn't have any policies. What does that prove? It proves that terrorist attacks are really, really rare. So, you can't judge rare events on frequency of incidence, because there isn't enough data to plot any meaningful trends. Again, this is a subtle argument to make to a layman. What Ashcroft said, everyone probably [?]--wow, he's right. Russ: Sounds good. Guest: We expected a terrorist attack every week and there wasn't one--what great things he's done. But it's not necessarily true. Russ: But you have to be honest: The Federal Bureau of Investigation (FBI) and the Bush Administration and the Obama Administration would all tell you about all the plots and conspiracies that they've stopped; and therefore the price of security is eternal vigilance--we need to spend all this money on surveillance, etc. Do you think we've stopped any of these? Are they real? Guest: You know, most of them aren't real. There were a couple that were real. Remember the car full of explosives in Times Square--that was stopped by a street vendor who said: That car shouldn't be smoking that way; I'd better call the police. The bomb wouldn't have exploded--I think it was a faulty bomb--but I think it was a real attack. Most of the plots--and if you remember, the guys who were going to blow up the Brooklyn Bridge, or topple the Sears Tower, or make the fuel pipelines at JFK Airport explode--most of those, and there are some really good websites that look at all of these plots, were not real. The plotters had no idea what they were doing; they had no access to weaponry; they couldn't have pulled it off. It was mostly either FBI informants or FBI undercover agents goading the plotters and then arresting them. And I worry--because this stuff plays really well on television--that we are actually creating terrorists to make it look like there is more of a risk than there actually is. John Mueller--he's the person who has done the research-- Russ: He's a future guest on EconTalk. Guest: Yeah. I asked him about it. He's done some great work on this. Russ: We will talk about that.
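Schneier's argument about judging rare events can be put in simple probabilistic terms. Assuming, purely for illustration, that major attacks arrive at a rate of about one per decade, two quiet years are almost exactly what you would expect with no policy at all, so they carry almost no information about whether a policy works:

    import math

    # Poisson sketch: how informative are two attack-free years?
    # The base rate is an illustrative assumption, not an estimate.
    attacks_per_year = 0.1     # one major attack per decade, hypothetically
    years_observed = 2

    # Probability of zero attacks in the window, with no policy at all:
    p_zero = math.exp(-attacks_per_year * years_observed)
    print(f"P(no attacks in {years_observed} years, doing nothing) = {p_zero:.2f}")
    # ~0.82 -- 'no attacks since my policy began' is weak evidence either way.

This is the Ashcroft problem in one line: if silence is 82% likely even with no policy, observing silence cannot distinguish a working policy from no policy.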
59:00Russ: But your view, then, is that we should just live with it. Guest: It's kind of in the noise. It's not like, you know, product safety. It's not like climate change or [?] quality. Or car crashes. Or the serious stuff I'm worried about. Russ: Would you get rid of all airport security? Would you get rid of all wire-tapping surveillance? Infiltration? All that stuff? Guest: 'All' is a big word. I certainly would get rid of most. I mean, right now I think we are largely doing it wrong. The things you want to spend money on are investigation, intelligence, and emergency response. Focusing on tactics and targets is overly specific, and that money only makes sense if you guess the plot correctly. I mean, if you spend billions of dollars on the Transportation Security Administration (TSA), and the terrorists attack a marathon instead, you've wasted the money, because you guessed wrong. Russ: Well, that's a big issue. Guest: Spend on the stuff that doesn't require you to guess. Think of the way the liquid bombers were apprehended. They were arrested in their London apartments. They chose a plot deliberately designed to get around airport security, and they were arrested before they ever got to the airport, through traditional investigation--following the leads. And that's the sort of thing that works. Emergency response works great in all sorts of disasters, both natural and man-made; I think that's really important. But a lot of the overly specific measures I think are a mistake. Airplanes are an exception. I mean, the characteristics of an airplane require some extra security. The miracle of Boston is that the inverse square law is your friend. Right? The force of the explosion decays with the square of the distance. That's why so few people died. You put that same bomb on an airplane, and the characteristics of the plane mean the plane would crash. It's not the bomb that kills people. It's the fact that the plane falls out of the sky that kills people. So you always need some extra security because of the way a plane works. Russ: What fascinates me--and this is the point you make very well in a couple of your recent essays--is how difficult it is to rationally process how few people were killed in Boston. Guest: Yeah, yeah, yeah. Russ: And I say this with total--every death is a horrible tragedy to the dozens and hundreds of people whose lives it touched. And by the way, it drives me nuts when people say, when you ask about Boston, that it will never be the same. The Boston Marathon next year will never be the same. And that's not true. Most of us will be the same. The people whose lives will never be the same-- Guest: Unfortunately the Boston Marathon might not be the same for a long period of time, because the elites will panic. Russ: It depends. But my point is: for the people who lost their lives, for those families who were touched--their lives will never be the same. We're overdramatizing. It's absurd. It's bizarre. But the fact is that there is this tremendous emotional response that's distinct from a bad car crash. And part of it is because you choose to get in the car; and part of it is because the car serves another purpose. The idea that you are suddenly vulnerable in a way that you weren't vulnerable before is very difficult, I think, for all of us to accept. And as a result we ask for things and are willing to spend money and do things that are really not productive. And it's just very, very hard for us to just say, well, as you say, it's a small probability; we'll just live with it.
And instead we say: What can we do to stop it? Guest: Right. How can we make this better? I'm scared; make me not scared.
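For reference, the inverse square law Schneier invokes about the Boston bomb can be written out. Treating an open-air blast as radiating its energy E uniformly in all directions--a simplification, but it captures his point--the intensity at distance r is

    \[
    I(r) = \frac{E}{4\pi r^{2}}
    \qquad\Longrightarrow\qquad
    \frac{I(2r)}{I(r)} = \frac{1}{4}
    \]

Doubling your distance from the blast cuts the exposure to a quarter. In the open air of a marathon finish line, most bystanders were far enough away to be spared; in a sealed, pressurized airplane cabin there is no distance to put between the passengers and the bomb, and the aircraft itself fails--which is his argument for why airplanes warrant extra security that marathons do not.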