Intro. [Recording date: June 14, 2019.]
Russ Roberts: My guest is... Shoshana Zuboff.... Her latest book and the subject of today's episode is The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.... What is surveillance capitalism?
Shoshana Zuboff: Well, let's begin with a definition of surveillance capitalism. And, for that I'm going to provide a little bit of description for our listeners. One of the things I write about is ways in which surveillance capitalism diverges from the history of capitalism and some of its well-known elements. But, there is a very important way in which surveillance capitalism emulates this history, and that's where I'd like to begin. So, those who study capitalism have long described a process by which capitalism evolves, claiming things that live outside the market dynamic, and bringing them into the market dynamic, creating commodities that can be sold and purchased. So, famously, industrial capitalism, broadly speaking, claims nature for the market dynamic, reborn as real estate, as land, that can be sold and purchased. Well, surveillance capitalism emulates this pattern, but with something that is an unexpected and even dark twist. And that is: Surveillance capitalism claims private human experience for the market dynamic. And that private human experience is reinterpreted as a free source of raw material for translation into behavioral data. So, this is how it works. These behavioral data, some of those flows, are fed back to improve services and products. But there are other data streams that are hived off. And these are selected for their rich, predictive signals. And these parallel data streams are what I call 'behavioral surplus.' Because these are behavioral data that are more than what is required to improve services and products. These data streams, behavioral surplus with their rich behavioral signals, are then flowing into supply chains--think of pipes, think of conveyor belts--that converge these data streams with the new means of production. The production facilities, in this case, are computational. They are what we refer to as 'machine intelligence,' 'artificial intelligence.' So, now we have behavioral surplus with its rich predictive signals converging with advanced computational capabilities, production. And, of course, out of this comes: Products. What are these products? These are computational products. But the simplest way to describe them is that they are computational products that predict human behavior. I call them prediction products.
Russ Roberts: Give us an example of that.
Shoshana Zuboff: Well, the first and most famous, widely known, prediction product is what was called the click-through rate. The click-through rate, we think of it as confined to online targeted advertising. But, in fact, the click-through rate, when you just zoom out a little bit, it's easy to see that it is a computational fragment that is a prediction of human behavior, where people are likely to click in relation to certain kinds of ad content. So, the click-through rate was the first widely-successful prediction product. In a similar way, I draw an analogy to the invention of mass production a century ago. Where, the whole mass production logic of high volume and low unit cost--there were elements of this that, you know, existed in a variety of different organizational settings even in the late 19th century. But the whole comprehensive puzzle finally came together at the Ford Motor Company, early in the 20th century. And, of course, the first great, most famous, most successful product in mass production was the Model T Ford. And of course, looking back on it, it's obvious that mass production--mass production as an economic logic--was not limited to the Ford Motor Company or the fabrication of the Model T Ford. That was an economic logic that had legs. It could be applied to anything and over time it was applied to just about everything, from hospitals and schools to factory production. Okay. So, we have a similar situation now, where the click-through rate was the Model T of this era, if you will--the first wildly successful prediction product. But, surveillance capitalism is no more constrained to the online targeted advertising market than mass production was constrained to the Model T. So, that's an example of a prediction product. And, who buys these products? Well, these products are not fed back to the populations from which the raw material for their fabrication was initially derived. These products are sold to business customers who have an interest in what we will do, not only now but soon and later. So these are business customers who have an interest in our future behavior. And, they constitute a new kind of marketplace that trades exclusively in human futures: prediction products. And again, the first well-established markets in this vein were the online targeted advertising markets, which as you know have grown substantially, have produced a great deal of revenue for, especially, the two pioneering surveillance capitalists who pretty much own those markets. Still.
Russ Roberts: Google and Facebook.
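[Econlib Ed.: For readers who want a concrete picture of a 'prediction product' like the click-through rate, here is a minimal, hypothetical sketch in Python. All feature names and data are invented for illustration; real systems operate at vastly larger scale, but the logic is the same--behavioral signals in, a predicted probability of future behavior out.]

```python
# Minimal, hypothetical sketch of a click-through-rate (CTR) "prediction product."
# Behavioral surplus goes in; a predicted probability of a future click comes out.
# Feature names and data are invented for illustration only.
from sklearn.linear_model import LogisticRegression

# Each row is behavioral data for one past ad impression:
# [related_searches_last_week, pages_viewed_today, hour_of_day, prior_clicks_on_advertiser]
X = [
    [12, 40, 21, 3],  # heavy recent interest, evening browsing
    [0,   5,  9, 0],  # little signal
    [7,  22, 20, 1],  # moderate signal
    [1,   8, 10, 0],  # little signal
]
y = [1, 0, 1, 0]      # observed outcome: did the user click the ad?

model = LogisticRegression().fit(X, y)

# The "product" sold to the advertiser is this prediction of future behavior:
new_user = [[9, 30, 22, 2]]
print(f"Predicted click probability: {model.predict_proba(new_user)[0][1]:.2f}")
```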
Russ Roberts: I want to--I agree with some of your concerns in this book. It's a book of concern, I would argue. It's a little bit frightening. And I think there are things to be frightened about. But, on the surface, I want to at least start by pushing back. I'll agree with you some, later, on this. But, what's wrong with that? So, for example, because of the data that I provide, often unintentionally, to Google or Facebook or Twitter or whoever, they know things about me. They know what I search for; they know what I buy, perhaps. If it's Amazon they know a lot about what I buy. And so they are able to tailor what I see based on my behavior. They can sell the right to get access to my clicking to [?] folks. And when that comes across my screen I can choose to buy it or not. It can be annoying. I bought a watch this year, so I did a lot of searches for watches when I was still using Google. Now I use DuckDuckGo, but in those days I used Google, and so I started getting ads for watches, most of which I wasn't interested in. And occasionally, maybe I looked at one of them--I can't remember--but after I bought a watch, I kept getting the ads because they didn't know I'd bought the watch--either because they weren't paying attention; the algorithm doesn't know; or, I think I bought it actually in a face-to-face store. I can't remember, actually, but if I did, obviously that would make it harder for them to know about it. And that was annoying. But it wasn't scary. In fact, I could have been happy about it. Often when I travel, Google knows where I'm going, because my Gmail has the receipt for the airfare, and it will suggest places to go to when I get there. I find that a little bit creepy--and kind of cool, that they anticipate my experiences. But, I can choose to use it or not. So, what is frightening about this? Why is it a source of concern?
Shoshana Zuboff: Yeah. Well, I like the word 'concern.' You know, 'frightening' is perhaps not the word I would use, in the sense of, you know--frightening is when there is a knock on your door at 3 o'clock in the morning and the soldiers in jackboots are there to drag you away to the Gulag or the concentration camp. That's frightening. This is not that, Russ. This is a different form of power. And, as you know, I go to some lengths to distinguish this power from, you know, the kind of power that our world came face to face with in the 20th century when we confronted the totalitarian nightmares, both in Germany and in the Soviet Union. You know, totalitarianism is a form of power that rules by terror. It rules by--
Russ Roberts: But you paint a very dark picture of this turn in our economic activity.
Shoshana Zuboff: No, I do indeed, but my point here is that this is a little bit more--it's--grasping what is at stake here and what is of concern, you know, requires a different sensibility from pure fright.
Russ Roberts: Fair enough.
Shoshana Zuboff: Because, in the confrontation with totalitarian power, fright was essentially the means of social control. And that's not what's going on in this new world. The means of social control here has to do with dependency and identification and the foreclosure of alternatives. There are other channels here for social control; and the erosion of human agency. So, let's talk about that a little bit. What are the real causes for concern? So, I've just described to you a sequence and an economic logic, from the unilateral claiming of private human experience ultimately to be sold, quite profitably, in markets that trade in human futures. And, you point out that, you know, the first experience that all of us had of this new logic takes place pretty much in the online environment with targeted ads. And the funny thing about your description is that--we're speaking right now in the year 2019. If we were having this conversation in the year 2005, even--2004, 2005--I would bet you that our conversation would be different. Because, back then, you know, folks experiencing what you just described felt really, really uncomfortable with it. And disturbed. You will recall, because I know you study this, that in 2004 when Google came out with Gmail, which was a massive step forward in email functionality, you know, and storage--because stuff was going to be stored on the cloud and not on your computer, and so forth--ability to search within your emails; all kinds of things that were breakthroughs for the whole email domain. And then, suddenly, people were seeing ads, like the ones you are talking about, Russ. But they were ads that were triggered by something that you wrote in an email.
Russ Roberts: Yup.
Shoshana Zuboff: And very quickly, the ball of yarn unraveled, and it became clear that Google--and we haven't gotten into our economic comparatives yet; and we'll talk about that in a minute--but, Google was now using email content as a new source of raw material, as I've described, from which it would be able to scrape predictive signals. And these email messages--all the email content that we produce--were now feeding these supply chains of behavioral surplus. That's essentially what was happening. And of course all of this occurs--and this is where we compare it to the jackboots--all of this occurs remotely. All of this occurs through the medium of digital architectures. It's not, you know, somebody coming and threatening you with a gun. So, it's all happening in this remote, robotized medium. Nevertheless, the yarn unravels, and it quickly becomes clear that, 'These guys are scraping our emails for more behavioral surplus to target ads.' And people all over the world, as you will recall, Russ, were mobilized in outrage. Such was the outrage that an important California legislator immediately drew up legislation to outlaw this unilateral taking of private human experience. What I just wrote to my mom in an email. Okay. Now, what happened? The Google team mobilized. And it created a war room. And it was there, in the heat of that Gmail contest, that they first developed the strategies and the sequence of tactical operations that they would use over and over again across the last 15 years, Russ. And that operation goes like this. The first thing is what I call Step One: Incursion. In other words, they take what they want. They take your email; when it came to Street View [part of Google Maps], they take your house. They can get WiFi data out of your house, they take that, too. They just take what they want until somebody tries to stop them. When somebody tries to stop them, they go into Phase Two, which is called: Habituation. Habituation is, they try to draw out the process of contest for as long as possible. During that time, they are using backstage operations to try and stop any real impediment. But, what they are doing publicly is they are explaining, they are apologizing; they are saying, 'Well, we'll do this. We'll do that.' And they, you know, they may be fielding court cases. They may be fielding hundreds of court cases. They let those court cases, you know, drag on for as long as possible. And suddenly the weeks and months and years start drifting by. If, at the end of that time, there is still any kind of protest, they will make some superficial adaptations. In many cases, by the end of this habituation period, people can no longer remember why they were so upset in the first place. Because, this is what happens. Habituation sets in. You know, things that violated norms, that violated boundaries, that we greeted as outrageous--they become fixed things in our life. We're seeing that so much in the political domain now. You know, boundaries that could never be crossed. Norms that could never be violated. Now, when they are violated routinely, we grow numb. This is called psychic numbing. We grow numb to these outrages. And that is what allows habituation and normalization to set in.
Russ Roberts: Well, let me challenge that a little bit, or give a different, maybe, framework for it. Okay? And let you tell me why this is not representative. So, I am alarmed by some of this, particularly in the political realm; and I think, you know, the ability of Facebook, Twitter, etc. to control what I see and read is very concerning. And robots and other things that stir up anger and tribalism, I think, are not good for our country. And how we deal with that is a very tough question; and we are going to come to that, I hope, toward the end. But let me give you a different framework for this. So, my washing machine breaks. Now, a washing machine is an expensive thing to repair. I may need a new one. It may not be worth repairing. But let's say a guy--there's a new service available. And, what the guy says is, 'Here's the deal: I'm going to come to your house. I'm going to either fix your washing machine or give you a new one. No charge.' Wow! Fantastic. 'The only thing, though, is that when I'm in your house, I'm going to take some pictures of the inside of your house, and I'm going to learn about you. And I'm going to sell some of that information to other people to help send you targeted ads and other things.' So, they come into my house; they find other things that are broken. They may see that I love wine or that I like to read or that I'm a photographer. And so I start getting, in my mail, ads for cameras and ads for wine; I get special book offers over the Internet. And so on. And then they say, 'Oh, by the way, I'll also,' let's say, 'give you your choice of free wine every week. But to get that service, you are going to have to let me see who you mail stuff to.' Okay? And they start doing that. And eventually, yeah, I love inexpensive wine, or I love wine without charge; I love having my washing machine repaired with no charge or getting a new one. And these incursions into my privacy of taking a few pictures--you know--I don't leave everything out, obviously, when I know the washing machine guy is coming. But, they do take advantage of the fact that they wander through my house. And there is something deeply creepy about it, right? And, by the way--I should have made it even clearer: They don't really tell me that's what they're going to do. They just do it. So, I just think--
Shoshana Zuboff: Oh, okay. That's a big difference.
Russ Roberts: Right. Of course. I get--so it's like I'm getting a free washing machine. And then I notice I'm getting a lot of ads for cameras. And I think, 'I guess that washing machine guy used the chance to be in my house to notice I have a lot of photographs up. He knows I like photography.' Etc., etc. And so, I have a choice, right now, to use these services. Now, I agree it's really hard to give them up. I am habituated to them. I love them. There are many of them. Not all of them. Many of them I love, though. I love Waze [owned by Google--Econlib Ed.]. It's just, it's really improved the quality of my life. I really enjoy it. There are many other things I like about Gmail. It is a fantastic service. And the deal--and here's part of the problem--was that it was never an explicit deal. The implicit deal was, 'Well, by the way, while you are using these "free services"--they aren't really free, for lots of reasons--I'm going to use your data. I'm going to charge advertisers for the opportunity to get access to you.' Which means the price could be higher than it otherwise would be. But, I have a choice. And, if you ask me how I like it: Well, I don't like it that they take pictures of my house when they're in here. It is kind of creepy. But I do like the services a lot. And here's the key point: Nothing really horrible has happened, so I put up with it. And I think it's basically a good deal for me. And I would argue that, for young Americans in particular listening to this program--of which there are many--a lot of this concern about privacy and data--some of them don't even get it. But, for me, it bothers me. I find it alarming, particularly in the political realm. Because it could change how people vote, and what they do. But, is it really as frightening--again, as concerning? Isn't it mainly successful not because they have power but because I've given them that power? Because I like the deal?
Shoshana Zuboff: Heh, heh, heh. All right. Well, let's talk about two things. Because I need to lay out some terms of reference so I can really answer your question. The first thing I want to talk about is what puts the surveillance in surveillance capitalism. Why is it called 'surveillance capitalism'? And the second thing is: Let's talk about the economic imperatives here. Because our listeners, and especially our young people who are listening--if you are listening to this, listen up close. Because this is about your future. And you are the person that I'm most concerned about in this conversation. So, I want to talk about the economic imperatives. And I think that will give us a framework, Russ, for then coming back to your example and looking at it through some other lenses. Is that okay with you?
Russ Roberts: Sure. Go ahead.
Shoshana Zuboff: Okay. So, the first thing I want you to know is: Why is it called 'surveillance capitalism'? So, this takes us back to the early invention of surveillance capitalism. At Google. I don't need to go into all the details of this origin story, Russ, if you don't need me to. Suffice to say that, in the heat of financial emergency, during the dot-com meltdown in the early 21st century, the years 2000, 2001, Google, like most other startups in Silicon Valley, faced tremendous investor pressure. Startups were going bust left and right. Really smart people were losing their businesses. They could not monetize fast enough for the demands of the impatient capital represented by the venture capital firms that largely led investment in the Valley.
Russ Roberts: They weren't making any money. They [Google?--Econlib Ed.] had a search engine that was fabulous. They kept gaining market share. But they hadn't monetized anything. And it wasn't obvious that they could. And I think--carry on. Your story is exactly right.
Shoshana Zuboff: That's true. Don't forget that Amazon wasn't turning a profit for quite a while. But, you know--ultimately it became an extremely successful business.
Russ Roberts: [?] which helps. At this point I don't think they had much of any revenue.
Shoshana Zuboff: Well, they didn't. But they had very substantial plans on the table. There were different models being considered. There were, you know--there was an intense and creative discussion about how these problems were going to be tackled. So, it's not like they were just sitting back and twiddling their thumbs--you know, happy to be doing search and nothing else. There were a range of alternatives on the table. But they didn't have time to explore those alternatives. Because when the dot-com bubble burst, the pressure rose. And, even though it was widely understood that Google had the best search engine, and many people considered it to have, you know, the smartest founders, even under those conditions its very sophisticated venture backers were threatening to pull out. So, here was a situation where the founders had publicly rejected advertising. They regarded advertising as something that would disfigure search, disfigure the Internet in general. But now, in the heat of financial emergency, they essentially declared a state of exception. And, again, without going into a ton of detail, people in the company knew that there were collateral streams of behavioral data that were being produced when people searched and browsed. Data that was not being used for the improvement of the search engine or the creation of new services like spell-checking or translation and so forth. These data were sitting around, unorganized, on servers. These data were referred to as digital exhaust or data exhaust. A few people had been experimenting with the data and began to understand that it had a lot of predictive value in it. Anyway. Long story short. The state of exception induced the founders to now turn to these abandoned data logs. To mine them for their predictive signals. To compute them with their advanced computational capabilities, which even back then they referred to as their 'AI'. And to come up with the first major prediction product, which was the click-through rate. And that became the new model for the ad market. Until then the model was pretty continuous with the way advertising had been in the past, in the sense that advertisers were still picking where their ads went. And so, in that choice there was the continuity of still trying to align where your ad appears with the brand values of the company whose ad it is. Okay. So, now all that changes; and Google says, 'We've got a black box; we're not going to let you see inside the black box. But if you buy the product that comes out of the black box you're going to make more money, and so are we.' That was extremely successful. So successful, by the way, that with their IPO [Initial Public Offering] documents going public in 2004, we got to learn for the first time exactly what the impact of this new logic was. And the impact was a revenue increase of 3,590%, just during those years 2000-2004. All right. They quickly understood that these leftover data--this behavioral surplus--were available in all kinds of places hidden in the online environment. And, this was a very fertile time for patents coming out of Google. And, when you read some of those patents--and I discuss some of them in the book--it's very clear, you know, what the data scientists are saying, what they are celebrating. First of all, they are celebrating the fact that they can now hunt and capture behavioral surplus all over the Web. And they can use that surplus to learn things about "users" that folks did not intend to disclose.
They can also use those surplus data to aggregate and create inferences that give them insights into people that people did not intend to disclose. So, that's Number 1. Number 2 was, they celebrated the fact that they could do this with methods that were undetectable. Undetectable. Because they understood that if users knew what they were doing, there would be resistance. And resistance means friction. And friction slows this whole thing down. So, from the start, there was an intentional, explicit strategy of making sure that these methods were backstage methods: indecipherable and undetectable. In other words, designed to keep people in ignorance of what was really going on backstage. That's why the Google Gmail example is so interesting, because the backstage operation broke through quite quickly there. Whereas before that, they had managed to keep this stuff very, very hidden. Okay. So, right from the start, we are beginning here with the social relations of the one-way mirror: They can see us; we can't see them. They can see us; we can't see them seeing us. And so forth. All right. I'm putting this out there now because it's going to be very important to come back to this in a minute. So, I want to establish this. It's something that was baked into the cake--
Russ Roberts: Yeah, I agree--
Shoshana Zuboff: right from the beginning. Okay. So, now, we talked about the fact that these prediction products, beginning with the click-through rate, are sold into markets that trade in human futures. I'd like to talk about this for a moment. Because, you know, one way to look at what I've done in this book, especially in Parts I and II, is to reverse-engineer the competitive dynamics of these markets. If you've got businesses who are competing in selling the human future, what is the nature of that competition, and what kinds of imperatives emerge from those competitive dynamics? So, you've got to think about it this way: What are these businesses selling? They are selling certainty. They are not selling certainty about, you know, oil futures or pork bellies or whatever. They are selling certainty about future human behavior. They are trying to get as close as possible to being able to guarantee outcomes to their business customers.
Russ Roberts: Isn't that--but my question is: On the surface that's good for me. So--again, I'm sympathetic to your view. But, you could argue: This is no different from what's happened in advertising in the past. So, somebody had a genius idea to advertise beer, not, say, during daytime soap operas, but during sports events. That's because someone had the insight that people who watch sporting events are more likely to be beer drinkers than people who watch daytime soap operas. That's okay! There's nothing bad about that. In particular, there's something good about it. I'm watching the game and I see something I actually care about. And similarly, this idea--I think the confidence in Silicon Valley of their ability to predict human behavior is way overstated, but let's grant them the assumption that it's true. The fact that they know what I want--that should be, on the surface, a feature and not a bug. The fact that the advertisers who send me stuff are people whose stuff I want to buy--it's better than sending me ads I'm not interested in. So, I don't see why there's anything alarming about that. Now, I think there are alarming things. In particular, I think your insight that the product is no longer what I'm consuming--I'm consuming the search engine or the Gmail, but the profit is elsewhere. The real consumer, for profit purposes, is the advertiser. And that creates a gap, obviously. And the normal feedback loops of markets aren't there. So, that's disturbing. And yet, no one's actually disturbed except professors right now, in the commercial part of it, as far as I can tell. They are concerned about the political part, big time. But the fact that my searches and my online activity are profitable for Google--because they can sell someone else access to me--doesn't seem to harm me. And that's what I'm pushing you on. Where's the harm?
Shoshana Zuboff: Well, you're pushing me, but you are not letting me finish my points. So, I want to lay out the economic imperatives, because that gives me the framework for responding to this--your complaint that this is no big deal. And just as an aside, I would say: The very thought of comparing this to advertising earlier in the 20th century or advertising in the 19th century, you know--that's just silly. Because, yeah, of course. There is no argument here, Russ, that would say that persuasion is something new to the human experience. Persuasion is as old as humanity. There's nothing new about people wanting to persuade one another to do things they'd like them to do. Of course. Right? The situation here is utterly different, because the situation here revolves around companies that have amassed enormous amounts of capital. And this capital funds a digital architecture that has become increasingly ubiquitous and increasingly global. And this digital architecture is the basis for these huge asymmetries of data, information, knowledge, and ultimately the power that accrues to that knowledge. These are not just hidden persuaders. These are hidden persuaders with billions--billions in deep capital resources--who are funding and controlling a digital architecture and the data that flows through it. To such an extent that it is not, now, an exaggeration to say that the Internet is owned and operated by surveillance capital. So, you know, I just don't think that the analogies to historical advertising work here at all. We need to be able to recognize discontinuity when it's real. And here, historically, materially, there is a profound discontinuity. And that's where the concern begins. And, by the way: this is not an argument about digital technology. My argument is not about digital technology. And I know you read my book and so I know you appreciate that. Because, my argument is just the opposite. We entered the digital era looking for the resources, the support, the information, the individualization in the economic world, certainly in our roles as customers. And to a certain extent, you know, just in our everyday lives, trying to live effectively. Which has become extremely difficult over the past decades. As employment has become more competitive, and wages have stagnated or diminished. And families are under great pressure. Students are under great pressure. So, you know, we went to the Internet looking for these resources that would be the counterpoint--in a sense, the antidote--to the institutional pressures that we feel throughout our lives. And, we deserve that. We deserve digital technology. We deserve the individualization, the personalization. We deserve the data that allows us to live more effectively. We deserve the connection. We deserve the voice. All of that. And indeed, at a societal level, we deserve the "big data" that allows us to understand the patterns of disease and more quickly and effectively tackle chronic problems, like cancer and diabetes; chronic social problems; climate catastrophe. You know, the promise of the digital is something that we all deserve. My argument is that we deserve it without the strings that are attached--strings that begin with privacy, but go far beyond privacy. So, that's where the economic imperatives come in. And I'm going to try to do this quite quickly.
Simply to say, where I was before: when we look at the idea of markets in human futures, the competitive dynamics of those markets actually produce a set of predictable and now deeply institutionalized economic imperatives. Okay. The first one is pretty obvious. We know that if we are going to feed machine intelligence with data in order to come up with predictions, we need a lot of data. So, the first imperative has to do with extracting behavioral surplus at scale: We need economies of scale. You may remember that just last year, there was a Facebook memo that was leaked. And in that Facebook memo we got a little bit of a window into the production process--the new means of production at Facebook. And that's their AI [Artificial Intelligence] Hub. And, among other things, in that memo--by the time I saw this document, my book was complete. I think at the time I was finishing revisions and just in the concluding chapter. So, all of the things I've said to you were already, you know, conceptualized, written about, and so forth. But, here in this document, they describe: What do we do in the AI Hub? And, what they say is: We produce "predictions of user behavior." That's what they produce. And it talks about the trillions of data points that are ingested by the AI on a daily basis. And the AI's capacity to now produce 6 million predictions of behavior, per second. Six million predictions per second. So, when we are talking about economies of scale, this is a very serious business--to be able to, you know, have these flows on conveyor belts of trillions of data points, and to be able to produce 6 million predictions per second. These are serious economies of scale. Okay. In the second phase of competition, the insight was: Scale is essential, but it's not enough. We also need scope. We need varieties of data. In order to get varieties of data, we have to leave behind this online environment that we've been talking about, Russ. We need to get people out into the world. And we need to follow them into the world. We need to know where they are and where they are going. Who they are going with. What are they buying? What are they eating? What are they doing? Where are they driving? What are they doing in their car? We need to know as much as we can about the different environments in which they are operating--their homes, their automobiles, the city streets. So forth. So, this is now scope. Also, the more we can know about how they feel, the better we can predict their future behavior. So, 'We want their faces, because that gives us the ability to analyze the muscles in the face that actually produce very accurate affective predictions.' And we want to be able to see people--not just who is on the street, but how are they--what's the angle of their shoulders? Their posture? Their gait? All of these things become crucial sources of behavioral surplus. So, we give people little computers to carry in their pockets, to take with them out into the world. The apps on those computers are constantly streaming--and again, hidden, behind the one-way mirror. You download a diabetes app, it grabs your contacts. Some of them grab your camera; some of them grab your microphone; some of them grab your location. All of them stream those data to third parties--and most of those third parties are owned by Google and Facebook--and on to more third parties, and so forth. All right. So, now we have your little pocket computer.
And that is combined with sensors and cameras that are increasingly saturating public spaces, saturating our homes, our cars. These are the economies of scope. But in a third phase of competitive dynamics, there's a new insight. And that insight is: The most predictive behavioral data, the most predictive signals, are achieved when we can actually intervene in the state of play--when we can actually begin to actively tune and herd human behavior in the direction of those guaranteed outcomes that we seek. So, this is new. Because anybody in the business world has heard of economies of scale, has heard of economies of scope. But this is something new. And it mirrors what the data scientists describe in the world of the Internet of Things as the shift from monitoring to actuation. So the idea becomes that the technological architecture isn't just producing data flows about what's going on, but is actually enabling feedback loops, so that we can affect what's going on: We can not only monitor it, we can affect it. So that's a shift from monitoring to actuation. And that's what we're seeing here in these competitive dynamics. So, this is what I call economies of action.
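[Econlib Ed.: A toy sketch, in Python, of the 'monitoring to actuation' shift Zuboff describes above. Every name, number, and threshold here is invented for illustration; the point is only the shape of the loop--observe behavior, predict it, then intervene to steer it toward a paid-for outcome.]

```python
# Hypothetical sketch of the shift from monitoring to actuation:
# the system no longer just predicts behavior; it intervenes to steer it.
# All names and numbers are invented for illustration.

def predict_visit_probability(user_state: dict) -> float:
    """Monitoring: estimate how likely the user is to visit the sponsor."""
    score = 0.1
    if user_state["near_sponsor"]:
        score += 0.4
    if user_state["meal_time"]:
        score += 0.2
    return min(score, 1.0)

def choose_intervention(p_visit: float) -> str:
    """Actuation: pick a nudge that raises the odds of the 'guaranteed outcome.'"""
    if p_visit > 0.6:
        return "none needed"         # already likely to visit
    elif p_visit > 0.3:
        return "push_coupon"         # a small nudge
    else:
        return "reroute_suggestion"  # stronger herding, e.g. in-app routing

user = {"near_sponsor": True, "meal_time": False}
p = predict_visit_probability(user)
print(f"p(visit) = {p:.2f}; intervention = {choose_intervention(p)}")
# The loop closes when the intervention's observed effect is fed back in
# as new behavioral data, tuning the next round of nudges.
```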
Russ Roberts: But what's wrong with all this? Those are all great descriptions of how the online world has become profit streams for different aspects of our lives. And I'm 64 years old. I find it somewhat either unnerving or disturbing. I can imagine consequences of this that are very negative. But, right now, they aren't. Most people, I would argue, think this is great. And if I don't like it, I can turn off 'Location' on my phone--which I often do, because I'm 64 and I don't like it. I don't have Alexa in my house--for that reason: I don't like that it's listening. I'm not sure why I should care, but I don't like it so I don't do it. If I wanted to, I could not use a lot of these things and get by without them. Now, I would argue, I think that most of the people, again, listening, love these things. Not only do they not find them concerning: They like them. So, I'd like you to try to argue why they shouldn't. And, in particular, if you don't like them and if you think they are bad, you're going to have to come up with another way to get those things you think we deserve. Because somebody's got to pay for them. They are not free. Now, you are arguing essentially we've made a deal with the devil. And you might be right. And I'm worried about it. But, it's not obvious, I think, to most listeners that that's the case. So, try to convince them.
Shoshana Zuboff: Well, as I've said--so, I'm in the middle of describing economies of action and how these are achieved. Because economies of action are something new under the sun. This is, again, not the persuasion of old. This is persuasion, now, that has to be effected, actuated, through the medium of a digital architecture. So, this is a new kind of problem, a new kind of challenge. And this required some experimental work. And I have to hypothesize that most of the experimental work has happened hidden from the public. But some of the experimental work comes through to public view. So, let's talk about that for a moment. One prominent domain of this experimental work was at Facebook; and we got wind of this in 2012 when Facebook published what it called its 'massive-scale contagion experiment.' So 'massive scale' means at the scale of populations--a contagion experiment. And what they did was to see if they could use subliminal cues on Facebook pages to affect real-world behavior--in this case, getting more folks to go vote in the mid-term elections. When the news broke about that, again, there was a wave of outrage around the world. And Facebook went into its usual process--
Russ Roberts: 'Mea culpa'--
Shoshana Zuboff: making apologies, and so on and so forth. As they were doing that, the ink was drying on a second massive-scale contagion experiment. This one was designed--again, using subliminal cues on their pages--to see if they could make people feel happier or sadder. All right. In both cases, these studies were published in very prestigious scholarly journals; and when they were published, the researchers celebrated two facts. And this is where I want our listeners to remember the patents from the early 21st century that I described before, and the one-way mirror. So, now we're in 2012, 2013, and what the researchers celebrated in both of these scholarly articles were two facts. Number 1: We now know that we can use subliminal cues in the online environment to effect, to actuate, real-world behavior and feeling--emotion. That was Number 1. Number 2: We now know that we can do that in ways that completely bypass user awareness. Undetectable. Methods that are undetectable. Okay. So, these are experiments in economies of action that are happening, hiding in plain sight, at Facebook. Now, a few years later, we come to understand that Google--the other pioneer of surveillance capitalism--because, by the way, Google invented surveillance capitalism, but the first company that it migrated to, [?], was Facebook. It quickly became the default economic model for the tech sector, but by now it is spreading across the normal economy. So, we're seeing this economic logic in insurance and retail and health and education, finance--across many sectors--coming full circle now back to production. We can talk about that later if we need to. All right. So, now we see, in Google, also several years of experimenting with how to achieve economies of action. Google chose to bring its experimentation to the world through an augmented reality game called Pokemon Go. Pokemon Go was incubated in Google for many years. It was developed by a man named John Hanke, who, before that, had been the boss of Street View; and before that, Google Earth. He was the inventor of the satellite system Keyhole, which became the basis for Google Earth when Google bought it from the CIA [Central Intelligence Agency], and Hanke came with it to Google. He was someone who had a long history of rejecting the claims of folks in towns and cities who didn't like Street View cars coming through their neighborhoods and capturing their houses and their neighborhoods for Google Maps. Okay. So, this is John Hanke, for many years running his laboratory inside Google, called Niantic Labs, developing this augmented reality game. And then, at the very last minute, it was spun off from Google, brought to market as if Niantic Labs were an independent company--though, of course, its primary investor remained Google. Most people know Pokemon Go as a huge worldwide success. A lot of folks went out and looked for Pokemon creatures. Families did it, you know, getting out through the city, in the parks, on the streets, and so forth. What we learned eventually about Pokemon Go was that Pokemon Go was an experiment in economies of action. In this case, Niantic Labs was growing its own futures markets. It had establishments--well-known institutions from McDonald's to Starbucks, but also Joe's Pizza and the tire place in town--and these establishments were paying fees to Niantic Labs, analogous to online advertisers. But, in this case, they are not asking for click-through rates, which would be the online equivalent.
In the real world now, with real life, real bodies, they are asking for footfall. They want the actual bodies, with the real feet, on their floors, in their establishments. And what the game did was use the incentives intrinsic to the game to learn how to herd people through the city, to the establishments that were paying Niantic Labs for footfall--guaranteed outcomes. Now, none of this was known to the public when it was happening. It surfaced later, and came out initially in an FT [Financial Times] article where Hanke was interviewed. And, even today, the vast majority of people have no idea that Pokemon Go was monetized in this way. Okay. So, here we have a basis now for this highly capitalized experimental work, learning how to shift, shape, modify, tune, herd, direct human behavior in ways that are designed to be undetectable. Bypassing awareness. 'Bypassing awareness'--what does that mean? Well, psychologists talk about the fact that awareness is essential for what they call self-determination. You can't be a self-determining individual, an autonomous individual, without having awareness of your situation and of your own action. So, here we now have surveillance capital intervening in human behavior, at scale, learning how to shape human behavior at scale, in ways that are undetectable--and which are designed to bypass awareness. And, you know, you're right to say that people don't realize this, and maybe at the moment do not care. But a lot of the 'do not care' phenomenon is explained by the fact that these operations are specifically designed to bypass our awareness. To keep us in ignorance. While entertaining us with the game. Heh, heh, heh. Entertaining us with the Pokemon creatures, and so forth. Entertaining us on our Facebook pages. But indeed, these are systematic, highly capitalized operations that are a direct response to economic imperatives. This is not James Bond, with some, you know, Evil Empire cooking up this stuff. These are people who are now committed to an economic logic that requires these kinds of practices. You mentioned Waze a moment ago, and that it enhances your quality of life. And I'm sure it does. But it's also important for folks to know that Waze is a Google application that is part of a larger vision of a smart city--what Google used to call a Google City--and of how a smart city can function. And the Waze application, right now, has also recently clarified that they, too, have established their own futures markets. So, they've got McDonald's and other establishments that are paying them to send folks their way. And these dynamics are not transparent. They are not disclosed to drivers.
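[Econlib Ed.: A toy illustration, in Python, of the 'footfall' market Zuboff describes above: sponsors pay for real-world visits, and the game places its in-game rewards to herd players toward the paying establishments. All sponsor names, fees, and coordinates are invented for illustration.]

```python
# Hypothetical sketch of a "footfall" futures market: sponsors pay for
# real-world visits, and the game drops rewards to herd players their way.
# All names, fees, and coordinates are invented for illustration.

sponsors = {
    "BurgerChain #42": {"fee_per_visit": 0.50, "location": (40.75, -73.99)},
    "Joe's Pizza":     {"fee_per_visit": 0.15, "location": (40.73, -74.00)},
    "Tire Place":      {"fee_per_visit": 0.30, "location": (40.71, -73.98)},
}

def place_rare_creature(sponsors: dict) -> tuple:
    """Drop the most desirable in-game reward at the highest-paying sponsor."""
    name, info = max(sponsors.items(), key=lambda kv: kv[1]["fee_per_visit"])
    print(f"Rare creature spawned at {name} {info['location']} "
          f"(sponsor pays ${info['fee_per_visit']:.2f} per visitor)")
    return info["location"]

place_rare_creature(sponsors)
# Players follow the game's incentives to the sponsor's door; the sponsor
# gets "guaranteed outcomes" (bodies in the store). None of this market
# is visible to the player holding the phone.
```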
Russ Roberts: But, when I see the ad for McDonald's, I kind of figure. You know. I kind of get it.
Shoshana Zuboff: Waze is priding itself on its ability to gather data about drivers that go far beyond, you know, where you are on the highway right now in your commute.
Russ Roberts: Sure. They have ambitious goals, like you talk about. Many of them are going to be good for human beings, and some of them probably are going to be good for Waze and Google and not so good for human beings. You know, when we think about--
Shoshana Zuboff: Right. So, let's--
Russ Roberts: Well, I want to--
Shoshana Zuboff: So let's go back to your fundamental challenge to me, which is: Why is this a cause for concern? So, the causes for concern here: First of all, you see this progression from the subliminal cues and the remote herding, now, to the actual application in something like Waze. These are the mechanisms and methodologies that accompany something like Alphabet/Google/Sidewalk Labs. This is intrinsic to their vision of a smart city. The idea that now these computational conclusions, these computational analyses, replace the frictionful, messy, often-conflictful back-and-forth of municipal governance. And that, you know, we run cities in this move[?] frictionless way by having these immense data flows, the trillions of data points, the millions of predictions per second. And we compute; and we can herd populations through the cities, through the towns. We can tune and shape and modify in ways that maximize the outcomes that we seek. And, of course, the problem is that this is being prosecuted under the aegis of private capital--specifically private surveillance capital. There is no democracy here. There is no self-determination here. There is no shared citizen solidarity, governance, or democratic power here. This is a completely different kind of future, and a different kind of solution for our future, that is profoundly anti-democratic. So, let me come back to the concerns. 'What are your concerns, Shoshana? Why are you concerned?' When we put all the pieces together here, I've said to you earlier: This is not soldiers in jackboots coming to tear you from your bed in the dark of night. This is a different kind of power. And it's a power that is profoundly anti-democratic. And it erodes democracy from below and from above. It erodes democracy from below because its own economic imperatives produce a requirement for economies of action: behavior modification of populations, at scale, all of it[?] mediated by digital architecture, which is now hijacked by this economic logic. And this digital architecture is the means through which power operates remotely. But this is a direct assault on human autonomy, on human agency. And it wasn't that long ago that within our own government there was a clear understanding that these kinds of methods and mechanisms, by impinging on human autonomy, violate individual sovereignty and are a threat to freedom. And actually, all of our thinking about the fabric of a democratic society and what is required for democracy--all of our thinking turns on the idea that we have people who have fundamental agency, who can be self-determining. But now we are looking at global architectures that are aimed at those human capabilities. Okay.
Russ Roberts: Well, let me--hang on--
Shoshana Zuboff: How does it--
Russ Roberts: Shoshana, hang on one sec.
Shoshana Zuboff: Sure.
Russ Roberts: I want to take this in a--we don't have a lot of time left, and I want to make sure we talk about something we haven't talked about. Which is: It's all interesting. Could be true. And, as I said, I'm somewhat worried about it. I will observe that the democratic process that currently runs our cities is not terribly successful. I think that's worth mentioning. But I certainly don't want to replace it with a corporate-run alternative, without competition. Certainly one of the challenges of the profit motive in the realm we are talking about is that competition is what usually protects consumers from the rapacious aspect of the profit motive. And without that competition we are vulnerable. And I think there is a serious concern there in these areas that's real. But, now the question, the harder question, is: Let's say you are right. Now what? What do you want to do about it? Do you have an idea--other than the fact that you don't like the profit motive? And I agree: it is consuming; it is out there; it is encouraging the monetization of all kinds of things; it's the way of the world in modernity. Should we stop it? Should we create a foundation that runs our search engine? Should we create a utility that is run by government that oversees these organizations? Should we break them up? Or should we just have people who write books and have podcasts who encourage people to look for alternatives that are less disturbing? Which would be another way that we've solved these problems historically. What are your thoughts?
Shoshana Zuboff: Well, look. The big picture here--and the other angle through which to really understand the threats to democracy--is that we're about to enter the 3rd decade of the 21st century. And our expectation was that in this digital future we would be enjoying the democratization of knowledge, and all the emancipatory potential of the digital. In fact--and this relates to your theme of competition--we are entering this 3rd decade with a social pattern marked by asymmetries of knowledge and power that actually harken back to a pre-modern kind of societal pattern. A pre-Gutenberg societal pattern. Now, we have a very strange situation on our hands. Nearly all of the world's information has shifted from analog to digital. And yet we have only a handful of institutions who are even capable of computing the vast amounts of data that exist. And, in that very short list, they are all privately owned surveillance capitalists. So, when we talk about, you know, the trillions of data points per day, and the six million predictions per second, we are talking about asymmetries of knowledge that are intolerable for a democracy. Okay. So, are we going to fix this by breaking the companies up? Are we going to fix this by imposing privacy law? Are we going to fix this by creating government-run utilities? These are all amazingly important questions. My view is this: People say, 'Oh, gosh, you know, I learn about this and I feel so depressed and I feel helpless and I feel resigned.' And, you know, 'How are we ever going to fix this?' I feel differently. I feel extremely optimistic about our situation. And the reason is that we haven't even tried to fix it yet. Surveillance capitalism is 19 years old. During those 19 years it has essentially been unimpeded by law. It has had a free run. I have a section in Chapter 11 of my book where I ask the question: How do they get away with it? And I answer it with 16 reasons that are analyzed in depth in the book. The point is that key among those reasons is that these operations have been so unprecedented, so hidden--our ignorance has been so comprehensive--that we haven't created the kind of law or regulatory frameworks that would tame these operations and temper their destructive aspects--you know, temper this capitalism to the real demands of a flourishing democratic society. So, if we were going to talk about law, what kind of law would we talk about? Well, privacy law is incredibly important here. And in privacy law, people begin with principles of data ownership, data portability, data extensibility. The problem here is that while we may get ownership of and access to the kinds of data that we give these platforms, we are not going to get access or ownership to the kinds of data that they produce within their production processes. We're not getting access to those trillions of data points. We're not getting access to those 6 million predictions per second. So, privacy law is a [?] that doesn't take us far enough. Antitrust law: There are many, many grounds on which surveillance capitalists are also ruthless capitalists. And there are serious problems of monopoly and anti-competitive behavior. And we need to get serious about these dynamics. And there is a lot more discussion today, as you know, Russ, about sort of awakening the sleeping giant of antitrust law and pointing it at the tech sector.
My concern is that, to a certain extent, antitrust law is designed to respond to the harms that we encountered in the late 19th and 20th centuries, and not to the new and unprecedented mechanisms and methods that we've been discussing, associated with surveillance capitalism.
Russ Roberts: Yeah, I agree. I don't think it's designed for it, and I don't think it gets to the heart of the problem. Which, to me, is the property-rights problem. But there's obviously different ways of looking at it.
Shoshana Zuboff: Yeah. So, you know--and what we don't want to do is break up--for example, break up Facebook, break up Google--and end up with 4 or 5 smaller surveillance capitalists, which will simply create more competitive opportunity in the field. So, increasing competition through more surveillance capitalism isn't going to solve the problems that we've been talking about. So, here--
Russ Roberts: Well, it might, if--it's possible--which is, to take an observation from Arnold Kling: You have a lot of smart friends. You probably have more than I do. I have a couple. But, you have a bunch. You've been at Berkman[?]. You have a lot of talented people there, in Silicon Valley, who are uneasy about the state of things. Why won't they start a Google or a Facebook that doesn't have these characteristics from the beginning--as, like, DuckDuckGo at least claims to do? And collect the data for good reasons; make them public; don't hide behind the black box. So: get donations rather than profits. Or get users to pay fees rather than monetizing their behavioral surplus. Wouldn't that work?
Shoshana Zuboff: Well, this is what I'm saying, Russ: if we break up companies but do not challenge, with law and alternative regulatory frameworks, the fundamental mechanisms and methods I've been describing, then we leave the field open for a more intensified competition among surveillance capitalists--and for new surveillance capitalist entrants. Because we haven't confronted these mechanisms and methods. So, the next step in my reasoning is that beginning to think freshly, for the 21st century and for these unprecedented conditions, about what law and regulation might look like then opens the space for the competitive solutions that we desperately need. So, let me give you an example of what I'm talking about. I describe the sequence that begins with claiming private human experience as free raw material and ends with the dynamics of human futures markets. I think there are opportunities to intervene at the front end and at the back end that would make a substantial difference, and open up the field for the kinds of competitors who want to redirect our trajectory toward the digital future in a way that produces the good outcomes that we seek without the costs that we've been discussing. So, for example, if we got real about saying, 'You are not allowed to take my experience'--for example: you may know that a couple of years ago there was a multi-stakeholder process that was hosted by the Commerce Department. You had the NGOs [Non-Governmental Organizations], you had the companies, you had the government, trying to agree on facial recognition. And the talks broke down, because the companies insisted that they should be allowed to have cameras and sensors on the streets that can take what they want, translate it into their facial recognition software, and have our faces. They insist that they have that right. And the government did not fight them on that. So, I want to say that that is fundamentally incorrect. That the companies have no right to my face. And that I have a right to walk on the street without my face being taken without my knowledge--certainly without my permission--and used in whatever way they choose. So, it is right at the beginning of this process that we say, 'No, you can't simply take people's experience, and you can't do it in a way that is hidden and deprives them of decision rights.' Okay. At the back end, we can say that we outlaw markets that trade exclusively in human futures. Why not? Because everything that I have described to you arises from the competitive dynamics of these markets. So: we say we do not allow markets that trade in slavery; we say we outlaw markets that trade in human organs. Why not say we outlaw markets that trade in human futures? Because these markets--and you said something really important before, Russ, and we didn't really come back to it--these are not the markets that Schumpeter--when Schumpeter talked about creative destruction, which, as you know, has become a sort of [?] for all of this activity--he talked about creative destruction as a small and tragic consequence of the [?] creative process. Creative destruction was the unfortunate consequence of what he called the creative response. And the creative response was supposed to be an economic mutation that really moved the dial of economic history. And his standards for that were very clear.
His standards were that you can tell an economic mutation from just another innovation because it is such a profound breakthrough that it really benefits all of society. It lifts all boats. It raises the standard of living for the great majority of people. This is not how surveillance capitalism operates. Its profits circulate in a very narrow domain: the companies, their shareholders, and the business customers who operate in and gain value from these futures markets. But, these are not profits that circulate back into the economy. These are not profits that [?] the middle class, or that help us fund our public education system, or anything else. These markets' revenues are essentially parasitic, because they are based on taking raw material from us without asking.
Russ Roberts: Well, we do get something in return--
Shoshana Zuboff: We do. We do get something in return--
Russ Roberts: And that return goes to almost every single person--rich and poor. They all watch YouTube. They all are using Waze. They are all on email. They are all using the Internet in their pocket. Everybody's got a smartphone, rich and poor: almost everyone. It's kind of extraordinary. So it's a complicated thing. And I agree with you that we need some new ways of thinking about it, and--
Shoshana Zuboff: But this is--what you are saying, Russ, is by design. That's the whole point. It's by design. The whole idea of free was by design, in order to establish invariable, dependable supply chains--
Russ Roberts: yep--
Shoshana Zuboff: of behavioral surplus. So, you know, for example, Android. When they developed Android at Google, there were a bunch of people at Google who said, 'Great. Now we finally have something we can sell with a hefty margin, and we can finally compete with Apple.' But other minds prevailed. And those minds said, 'No, no, no. Just the opposite. If we can give this away, let's give it away. Because this is going to be our most powerful supply-chain interface. We'll claim that it's the mobility revolution, and this is going to be the way that we stream data from all over the place. This is going to free us from the desktop.'
Russ Roberts: Yeah. But if we didn't like all that streaming, we wouldn't use the free phone. That's the only point I want to make--
Shoshana Zuboff: But that's not true, Russ--
Russ Roberts: No?
Shoshana Zuboff: because we don't know about it. This is the fundamental issue. You know, I read about research published in the American Journal of Medicine investigating a bunch of health-related applications--specifically, in this case, diabetes applications that are approved by the FDA [Food and Drug Administration], because the FDA now actually approves certain applications. And they discovered--and this requires forensic analysis; people don't know this because it's designed so that they can't know it--that every single diabetes application they reviewed was, first of all, streaming data to third parties that had nothing to do with the health domain. And again, many of those domains--the majority of those domains--are owned by Google and Facebook. But these applications are also doing other things the second you download them. They are doing things like taking your contact list; in some cases they then use that list to contact your contacts, and take their contact lists in turn. Many of them commandeer the microphone and the camera; they learn about the other applications on your phone, your messages, your email. This is happening through these innocent--so-called innocent--diabetes apps. No one knows that these things are going on.
Shoshana Zuboff: Right. Let me just--I'm going to have to go, but I would like to end on this one note: I appreciate what you're saying, but I don't think the research bears you out. When you look at the research on users going back as early as 2003, 2004, 2005--survey research, other kinds of participant research--when people learn about these backstage operations, historically, they are appalled.
Russ Roberts: Yep.
Shoshana Zuboff: They are outraged. And they don't want anything to do with them.
Russ Roberts: Yeah. And Facebook lost a lot of users this year. And maybe they'll lose more. I don't use it. I don't recommend that people use it. I encourage people to do other things. It's a good point. I think we need more of that, probably.
Shoshana Zuboff: So, yeah. But then, you know, despite those reactions, folks keep using it.
Russ Roberts: Yeah. They like it.
Shoshana Zuboff: And so, you know, the companies have pointed to this over the years and said, 'See'--kind of like what you're saying--'people really like this. There's nothing wrong with what we're doing, because people continue to participate.' Again, this is part of my sixteen reasons for how they get away with it. So, this contrast between how people feel and how they behave has been referred to as the Privacy Paradox. But, in fact, my argument is that it's not a paradox. It's not a paradox because we know what we want, and we know what we reject; but we are living in a world where the alternatives have been systematically foreclosed. So I'm in a situation now where I want to get my kids' grades from the school; I want to get my health results from my doctor's office; I want to organize dinner with family and friends at a restaurant--and just for these basic operations of daily effectiveness, I am required to march through the same channels that are surveillance capitalism's supply chains--
Russ Roberts: Yep--
Shoshana Zuboff: where hidden operations are working on my action, and are scraping my experience for predictive signals. And this is ubiquitous. So, we are increasingly, you know, in this world of no exit. And, from an economic point of view, from a business point of view, from a competitive point of view, it's hard not to see this as some kind of giant market failure. Because, in fact, the disconnect between supply and demand is, to me, a better explanation than calling it a Privacy Paradox. It's not a paradox. It's a disconnect. What people want is not aligned with what's on offer. And so, my view is that if we actually got serious about regulations aimed right at surveillance capitalism, that opens up the space for a new kind of competitor to come into the space, form alliances, create a new ecosystem that really takes us on a different path to the future. And that gets us back to the kind of thing that Schumpeter talked about: What is entailed for a healthy, flourishing capitalism, such that we can have the concept of a market democracy and it can make some kind of sense?