Ian Leslie on Being Human in the Age of AI
Jan 9 2023

When OpenAI launched its conversational chatbot this past November, author Ian Leslie was struck by the humanness of the computer's dialogue. Then he realized that he had it exactly backward: In an age that favors the formulaic and generic over the ambiguous, complex, and unexpected, it's no wonder that computers can sound eerily lifelike. Leslie tells EconTalk host Russ Roberts that we should worry less about the lifelike nature of AI and more that human beings are becoming more robotic and predictable. Leslie bolsters his argument with evidence from music and movies. The conversation includes a discussion of the role of education in wearing down the mind's rougher, but more interesting and more authentic, edges, as well as how we might strive to be more human in the age of AI.

RELATED EPISODE
Arnold Kling on Twitter, FTX, and ChatGPT
Economist and author Arnold Kling talks with EconTalk host Russ Roberts about the recent drama in the tech world--Elon Musk's acquisition of Twitter, the collapse of FTX, and the appearance of ChatGPT. Underlying topics discussed include the potential for price...
EXPLORE MORE
Related EPISODE
Ian Leslie on Curiosity
Why are some people incurious? Is curiosity a teachable thing? And why, if all knowledge can be googled, is curiosity now the domain of a small elite? Listen as Ian Leslie, author of Curious, talks with EconTalk host Russ Roberts about why curiosity is...
EXPLORE MORE
Explore audio transcript, further reading that will help you delve deeper into this week’s episode, and vigorous conversations in the form of our comments section below.

READER COMMENTS

VP
Jan 9 2023 at 12:26pm

As I started listening, I was immediately reminded of the EconTalk episode with Charlan Nemeth, “In Defense of Troublemakers.” If everyone is using ChatGPT, will there be fewer contrarians to provide those different perspectives? Or the alternative, will it give the dissenters an advantage? If we all start sounding the same, AI certainly feeds into our desire to want to fit in with the rest of humanity, but maybe less so in regard to our desire to be unique.

Second point has to do with the rules of grammar and correctness. No doubt we need those rules, but we also need to know why those rules are there. I don’t think there is a better discussion of this than in Joseph Williams’s book on style. Highly recommended.

Finally, this: “English usage is sometimes more than mere taste, judgement, and education–sometimes it’s sheer luck, like getting across the street.” – E.B. White

Postscript: This post was not written using AI. 🙂

[Nice observations, VP! I’ve made your link to the Nemeth podcast episode clickable; and I’ve also added it to our Delve Deeper section. Thank you. –Econlib Ed.]

Dr. Duru
Jan 10 2023 at 11:19am

ChatGPT has a “temperature” parameter (and others, I am sure) that adjusts the amount of “randomness” in the generative AI’s results. So, “creativity” can come from the amount of noise you are willing to tolerate from the algorithm. The current version available to the public apparently does not expose this temperature parameter. It will be VERY interesting to see what happens once people en masse start experimenting with the amount of “creativity” in generative AI.
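[The "temperature" knob Dr. Duru describes is commonly implemented as temperature-scaled softmax sampling over the model's raw token scores. A minimal, generic sketch (not OpenAI's actual implementation; the function and variable names here are illustrative):

```python
import math
import random


def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Pick an index from raw model scores (logits).

    Low temperature sharpens the distribution (more predictable output);
    high temperature flattens it (more "creative"/random output).
    """
    if temperature <= 0:
        # Greedy decoding: always pick the highest-scoring token.
        return max(range(len(logits)), key=lambda i: logits[i])

    # Scale scores by temperature, then softmax (subtract max for stability).
    scaled = [x / temperature for x in logits]
    peak = max(scaled)
    exps = [math.exp(x - peak) for x in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]

    # Draw one index according to those probabilities.
    r = rng.random()
    cumulative = 0.0
    for i, p in enumerate(probs):
        cumulative += p
        if r < cumulative:
            return i
    return len(probs) - 1


# At temperature 0 the most likely token always wins:
print(sample_with_temperature([2.0, 1.0, 0.5], temperature=0))  # → 0
```

The intuition matches the comment: as temperature rises, lower-scoring (less "expected") tokens get sampled more often, which reads as more noise or more creativity depending on your tolerance. --Econlib Ed.]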

noah bolour
Jan 9 2023 at 2:36pm

I enjoyed this episode greatly and found the general premise thought provoking.

As Mr. Roberts and Mr. Leslie started talking about music, I found myself pacing across the room in thought and disagreement.

First, while it’s definitely true that pop songs have become more harmonically simple, I believe the reasons for this are the exact opposite of what you both put forward, namely that the key change and bridge had themselves become such a formulaic exercise. I believe it has been a reaction against that. Staying on the topic of harmony, I think a case can be made that the chords themselves have become weirder and more unconventional, while the progressions have become simpler.

Second, while Mr. Leslie mentioned texture in passing, neither of you brought up the topic of rhythm. Pop music has become much more complex rhythmically and texturally. A beautiful example of this is Beyonce’s new album. In the first song, there is a masterfully executed change in the rhythmic emphasis about halfway through the song. It is not only technically impressive but also extremely expressive. It takes you to a different world, the same way a nifty chord change does. We hear things like this throughout pop music these days.

In sum, I think the locus of expression in pop music has changed, rather than the music becoming less expressive.

Anyways, thanks for another great episode.

David Schatsky
Jan 10 2023 at 9:00am

It seems to me that since antiquity creative outputs have generally conformed to “a grammar”: proportions of buildings, designs of columns; musical forms like concertos, symphonies, fugues; painting modes. Teaching those grammars makes good sense. Really good artists can stand out within a form. Occasionally a brilliant artist will break the mold and define a new one. Not sure it’s fair to call education in the grammar of writing or anything else “factory farming.”

Dr. Duru
Jan 10 2023 at 11:28am

I also loved this episode. But (see how I started this sentence =smile=) I still couldn’t help thinking that parts of this conversation sounded like the classic agita of an older generation lamenting the lack of [insert complaint here] of the current generation. I do get your point, and I am also fascinated to hear there is data behind the claim that pop music uses less change in key. However, I am less worried about the subjective elements of the critique of today’s pop music. I seem to recall that each new wave of popular music has been accompanied by some kind of formula. There is always a “sound” that people rush to emulate because it’s the latest thing: the Motown sound, disco, the Prince sound. Sometimes, it is particular music producers who have a sound that becomes popular.

Thanks to this podcast, I am thinking more about how rules hinder creativity and make us easier to predict and analyze. In a small way, I like to think I am playing games with certain ad monetization and digital marketing algorithms by refusing to be predictable. Still, I definitely see how it is possible to create a self-reinforcing loop where an algorithm figures out a key insight that is next used to drive behavior and the increasing predictability of that behavior in turn makes for better and better algorithms. Yikes!

Daniel DeKlotz
Jan 10 2023 at 3:41pm

Thanks for another great episode.

I have to quibble with the music portion, not on a counter-example basis, but generally, the argument that we’re seeing reduced richness in structure and content because bridges and key changes are declining in popularity seems to really be over-indexed on a couple of specific structure elements. It’s almost exactly like saying “kids these days don’t write creatively because they’re ditching our beloved 5-point essay format” (to take an example from the episode). I miss bridges too, but I’d also suggest that there’s a lot of structural creativity on the rise that gets missed by measuring the decline of previously popular structural elements. It would be like discounting the richness of Beethoven’s compositions because he’s not writing canons or fugues like Bach used to. If you really want to capture the changes in structural content of music, you’d also want to include rising trends. To take one example, the 2010’s saw the rise of “the drop”, which wasn’t really a thing before that.

Shalom Freedman
Jan 11 2023 at 2:27am

The main idea of Ian Leslie in this conversation is that the increasing creative power displayed by AI has led the most popular human creators to become more machine-like, more AI-like, operating on the instruction of simple algorithms. He focuses primarily on popular music and movies and gives evidence that the most successful human creations in this area have become simpler and less emotionally complex and rich. He does not, however, do this in an all-inclusive way, and recognizes that there is still first-rate original work being done by human beings, if somewhat at the fringes.

Russ Roberts does not object to this thesis but raises a couple of points which seemed to me to realistically contend with any complacency about the AI-human relationship. He suggests that the situation is not static and that the AI whose creative efforts may seem simplistic now will learn from the more complex and richer creations. As interesting creators know how to surprise, the AI will learn from those surprises to make surprises of their own.

In other words, it seems to me (and I may be wrong) that what Russ is really suggesting is that eventually the AI will be able to equal and even surpass human creators. The poetry and songwriting, essay-writing, film-making, and other creative work done now by AI may not yet be at the highest level, but eventually it will come. Many AI creations have, after all, already reached the level at which their consumers believed them created by humans. And this brings us to the question of questions regarding AI making humanity obsolescent and leaving us good reason for discouragement and loss of self-respect.

Here I am afraid that Russ’ answer about our nonetheless somehow being true to ourselves and our own humanity was for me not a truly reassuring one. If there is another kind of ‘thing’ superior to us in so many ways what will happen to our self-respect, our dignity, and our feeling of being meaningful? What some feared the confrontation with an alien civilization superior in power and intelligence would do to us, may be done by these ‘creations’ of our own.

Ian
Jan 11 2023 at 6:13am

Fun episode.

I teach in an undergraduate humanities programme. I don’t think that the problem is teaching rules for writing; I think it only comes if we assess rigidly according to those rules: for example, I’ve heard of essays being robo-marked, which sounds truly terrible. But in general, the cohort at large benefits from being given some structural ideas of how to write, and the most creative will rapidly learn to work around them/break them/etc. In modern poker, you learn to play according to game theory, and only then can you truly understand how/where to best deviate in order to exploit unusual players. Rules are an aid to creativity, not a barrier.
Superhero movies are an absolute disaster. I recall in the wake of the financial crash, some actors/directors bemoaning how hard it was becoming to make interesting mid-budget movies. I mostly rolled my eyes at the time, but it turns out they were right.
I think that you mostly talked yourself away from your starting point by the end, to a point I found myself wanting to make: humanity is not simply creativity. When we speak to someone grieving, or sick, for example, I think it’s easy to struggle to reach for some unique expression that no one else will say, but I’ve come to the realisation that what really matters is less what we say and more that we say what we say with true feeling and compassion. I think that point has more generality: in a media/publishing/even academic world where we focus on what’s new and unique, we prize creativity, but we can easily overlook other valuable attributes of execution and feeling and engagement.

AtlasShrugged69
Jan 11 2023 at 10:34am

To get a bit technical with music theory, the reason the key change works in Every Breath You Take is because it takes an originally major scale (let’s say G) and turns it into G minor. So the main chord progression in G for most of the song is 1(G) – 6m(Em) – 4(C) – 5(D), and when the bridge hits, the key changes to G minor (or B flat major), of which the 4, 5, and 6 chords are E flat, F, and G minor, the last of which gets resolved nicely into major at the end of the bridge to bring us back to the root key.
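[The diatonic triads the commenter cites can be checked mechanically: build the major scale from a whole/half-step pattern, stack thirds on each degree, and classify each triad by its intervals. A small illustrative sketch (note names use flats only, for simplicity; the function name is our own):

```python
# Pitch classes, flats-only spelling for simplicity.
NOTES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]
MAJOR_STEPS = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets of the major scale


def diatonic_triads(root):
    """Return the seven diatonic triads of a major key, e.g. 'Gm' = G minor."""
    start = NOTES.index(root)
    pcs = [(start + s) % 12 for s in MAJOR_STEPS]   # scale pitch classes
    names = [NOTES[p] for p in pcs]
    triads = []
    for d in range(7):
        third = (pcs[(d + 2) % 7] - pcs[d]) % 12    # interval to the 3rd
        fifth = (pcs[(d + 4) % 7] - pcs[d]) % 12    # interval to the 5th
        # Major third (4 semitones) => major; diminished fifth (6) => dim;
        # otherwise minor.
        quality = "" if third == 4 else ("dim" if fifth == 6 else "m")
        triads.append(names[d] + quality)
    return triads


# In B flat major, degrees 4, 5, and 6 are Eb, F, and Gm -- as the comment says:
print(diatonic_triads("Bb"))  # → ['Bb', 'Cm', 'Dm', 'Eb', 'F', 'Gm', 'Adim']
```

Indexing the result at degrees 4–6 (positions 3–5) recovers exactly the Eb, F, G-minor chords described above. --Econlib Ed.]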

As Leslie touched upon, the main reason modern songs are less likely to have a bridge stems from modern methods of composition: when working in a Digital Audio Workstation (commonly abbreviated as DAW: Pro Tools, Logic, FL Studio, etc.), one just keeps adding tracks and copying and pasting in a very linear manner. Most songs before the year 2000 were likely written by just a songwriter and their instrument. So, if you’re strumming the same chords for the verse, then chorus, then verse, then chorus again, you need something to break up the monotony. Enter: the bridge. Today you can just add a cornucopia of bells and whistles (literally) to break up the repetitive nature of the standard song format. To take the contra-example, try playing any modern pop song on just a guitar or piano. I guarantee it’ll sound a lot less catchy without the mastering, drum fills, sweeps, etc.

Just for fun I went and looked at the most played songs on YouTube for all time. They are enjoyable, catchy, but ultimately forgettable – and I think they prove Leslie’s point: Scrolling through that list, the songs that I consider truly great, or a particular favorite of mine, maybe 2 out of 100? The songs I turn to when I am going through a tough time, or need inspiration to hit a new PR on Bench Press, are songs which don’t get 1 billion plays, and certainly would NOT be created by an algorithm whose basis for writing the song is simply ‘maximize appeal broadly’… Music that gives life meaning SHOULD break my heart, bring me to tears, and inspire belief in something extraordinary. Until an AI creates something of that magnitude, I don’t even consider them a worthy adversary. But IF that day comes…

I’ll just have to create something even better 🙂

Daryl Brown
Jan 16 2023 at 1:02am

I agree with Dr Duru’s ‘kids these days’ point, but I’d frame it as a selection bias. Each generation of human beings is, by definition, dominated by unremarkable people creating unimpressive products of work that are quickly forgotten. We thus see all the contemporary mediocrity but only recall the great outliers from the past, giving the impression of decline.

On a separate point, I agree with Ian that rules are important and we must discern if/when to break them. In his fence analogy, G.K. Chesterton did not simply command the fence be left alone. He encouraged the decision maker not to break a fence until after understanding its purpose. Similarly, my music theory professor inspired us to learn the rules, apply them, and then break them judiciously. First understanding rules, then working within and without them, is the way we offer a unique contribution to the world.

Casey c
Jan 30 2023 at 12:26pm

I asked ChatGPT to give me the unofficial song of the summer for every year between 1995 and 2019 and whether it had a bridge. It fits very well with this podcast:
1995-2008: 100% had a bridge
2009-2020: 50% had a bridge



DELVE DEEPER



AUDIO TRANSCRIPT
Podcast Episode Highlights
0:37

Intro. [Recording date: December 22, 2022.]

Russ Roberts: Today is December 22nd, 2022 and my guest is author Ian Leslie. This is his third appearance on EconTalk. He was last here in May 2022 talking about his book, Curious. Our topic for today is being human in the face of artificial intelligence, based on a provocative essay of yours at your Substack. Ian's Substack is The Ruffian. I heartily recommend it. This essay is called "The Struggle To Be Human." Ian, welcome back to EconTalk.

Ian Leslie: It is great to be here. Thank you, Russ.

1:12

Russ Roberts: So, this piece that you wrote was provoked by some of the responses that people are getting from ChatGPT [Generative Pre-trained Transformer] and how human they seem. And, it's engendered a lot of anxiety about whether humans are going to be replaced, even more thoroughly, by computers, artificial intelligence, and so on.

But, you had a very different take. And, your take, I think is the most interesting one I've seen, which is that the real question isn't whether the machines are going to imitate us, but how we are already imitating machines. And, how would you introduce that idea?

Ian Leslie: Yeah, so I was struck and amazed, as so many people were, by just how good ChatGPT's imitations of being human were. It was doing all these wonderful things and I was seeing them being passed around on Twitter; and of course, we're only seeing the really good ones. We don't see the rubbish ones. But even so, that--the really good ones are really good. They're really impressive.

There was one which was--the silliest ones were the ones that appealed to me naturally. So, somebody said, 'How do you get a peanut butter sandwich out of a VCR [videocassette recorder], but I want it in the style of the King James Bible?' And, he got this beautiful Biblical-sounding passage explaining how to do that.

Another person asked it to explain Thomas Schelling's theory of nuclear deterrence, but in a sonnet, if you please. And, ChatGPT delivered this wonderfully formed sonnet, a Shakespearean sonnet, giving a pretty good explanation of Schelling's deterrence theory.

So, many others like this, including some student essays. I saw a few academics posting things like, 'Wow. I gave it this prompt and the response I got was as good as some of my students.'

Now, this is where I started to really think about this question, because if you follow some of those academics and then look further down the thread, they would start to say things like, 'But you know what: I mean, I wouldn't necessarily give it a good mark, because it sounds a little bit like the response my students give when they don't really know anything about the topic and they're just sort of winging it.'

And, then there was some interesting discussions around that, including one from a guy who teaches writing. And, he said, 'Look, this type of thing is very familiar. The kinds of pieces that GPT writes are the kinds of pieces that students often write and get decent grades for.' And, that's not a coincidence because essentially we've taught them--we taught many of them--that good writing means following a series of rules and that an essay should have five-part structure. So, instead of helping them to understand the importance of structure and the many ways you can approach structure and the subtleties of that question, now, we tend to say, 'Five points.' That's what you want to make in an essay. The student goes, 'Okay, I can follow that rule.' Instead of helping them to understand what it means to really nail or at least give your writing depth and originality and interest, we say, 'Here are the five principles you need to follow. Here's how long a paragraph should be. Here's how your sentence should be. Here's where the prepositions go or don't go.'

And, we're basically programming them. We're giving them very simple programs, simple algorithms to follow.

And, the result is we often get very bland, quite shallow responses back. So, it isn't actually any wonder that ChatGPT can then produce these essays because they're basically kind of following a similar process. That ChatGPT has a huge amount of training data to go on, so it does much more quickly.

And so, we should be alarmed by it, but not because it's on the verge of being a kind of super-intelligent consciousness, but because of the way that we've trained ourselves to write algorithmic essays.

And so, I thought that was a really interesting discussion. And, then the more I thought about it, the more I thought, 'Well, actually the same principle applies to different industries.' I listen to music a lot. I'm sure that a lot of your listeners do. And, you see this debate playing out in music because the streaming services--where a lot of us listen to most of our music--are very algorithmically driven, and they tend to incentivize musicians to create songs and tracks that fit a certain formula because they know that formula works. Right? So, there's effectively a set of rules imposed and you either meet that standard or you don't.

And, I'm simplifying hugely, of course. But, that tends to mean that musicians then write to that algorithm because they know if they don't then they pay a penalty for it. Because complexity and surprise and originality are not necessarily what the algorithm is going to recognize and put to the top of the queue or to put in a playlist.

And so, you get what some people call the robot aesthetic: everyone writing to a formula, and whatever the trend is now, it gets absolutely amplified and everything goes in one direction--a kind of herd.

7:05

Russ Roberts: Yeah. We'll talk about music in some depth because you have a rather extraordinary piece of data in there that blew my mind, and we'll come to that in a minute.

But, I think this idea that school--education--whether it's teaching people how to write or how to respond to various prompts has, I'll call it an industrial flavor to it. It's not designed to encourage creativity or customization. It's, if anything, the opposite. It's designed--as one CEO [Chief Executive Officer] complained to me about his HR [Human Resources] department: They're really interested in rounding off everyone's edges. And, he liked people with sharp, jagged edges because they were the ones who changed the world and changed the company; whereas the pretty good, standardized one-size-fits-all people that get stamped out by an industrial process, while pleasant, are not going to be the game changers.

And, I think when we think about the educational process, your observations about how writing gets taught in most high schools, at least in America, it's very formulaic. There's a topic sentence and there's this and there's that, and there's a bunch of other rules. And, I would always try to get my kids to start sentences with 'and' or 'but,' and I was told that that's against the rules. And, I tried to explain that it is, but when you break the rules in those really kind of trivial ways, the ear gets a certain effect that it doesn't get if it's the same thing over and over again. Or: Don't repeat the same adjective multiple times in the same sentence without realizing that actually that can create a sonorous poetic effect that has an impact on the ear.

So, there's a whole bunch of style rules like that: Never repeat the same adjective twice within a paragraph. Don't start a sentence with a conjunction. Make sure there is a topic sentence first. Make sure that a paragraph has four sentences; and dah, dah, dah. Make sure the whole essay is six paragraphs. All those things are--it's factory farming. It's factory writing. It's factory conformity.

And, it's not useless. But I think, just if nothing else, one thing I think listeners can take from this conversation--and I promise you there are much more interesting things to come--but, if nothing else, one thing you can take out of this conversation is how algorithmic our educational system in many parts of the West has become.

And why that is, is not unrelated I think to the incentives that you talk about on Spotify: It's the safe approach. It has a much lower upside. Best of all, it has a very low downside. You're okay. You're above the line. You can write a basic essay.

And, that's good, by the way. I don't want to--basic, bland communication is sometimes very much required. But, for people who have great talent, whose humanity and creativity have the opportunity to really flourish, you're continually rounding off those edges. And, I think that as a starting point for how we think about artificial intelligence is very important.

Ian Leslie: Yeah, because obviously, this trend--if you like, if that's what it is--has been going on a lot longer than we--so, for at least 100 years, we've been debating how formulaic, how rule-bound education should be and comparing schools to factories and so on.

But, what the advance--this terrifyingly or thrillingly quick advance of AI, however you see it--has done is thrown into sharp relief these questions. It has added an urgency to these questions I think because what is now very evident is that we are making it easier for the machines to imitate us.

We are almost saying, 'Okay, you are coming this way. Well, we'll come. No, don't worry, we'll come towards you. Don't worry. We'll make ourselves easier to model. We'll make what we do easier to be subject to algorithmic imitation. We'll start to imitate you, so that you can imitate us better.'

Now, there's been lots of benefits from this approach.

You say there were good reasons for the mass scale industrialization of education and so on. But, it comes at a price and at some point maybe the bargain--maybe we come out on the losing side of this bargain.

So, I think thinking about how to resist its seemingly unstoppable march, it just seems more urgent. When you see things like ChatGPT, you go, 'Oh, gosh, right. Okay, so that's where we are now.'

Oh, by the way, just quickly on schools: Since I wrote that, I happened to be in my son's school and I saw on the wall a 'How to be kind' poster--it was seven rules for how to be kind, and what being kind means. And, it's well meaning, obviously, but again it sort of reduces kindness to an algorithm. You follow these seven points; you're never mean to someone. If you see someone else being mean to someone, you say something. Go and see a teacher if you need to. All very sensible, but at the same time it's just this idea that you can just checkbox off kindness and say, 'Well, here's the algorithm you need to follow and you'll be fine'--it doesn't feel like it's completely human.

12:58

Russ Roberts: Yeah. And, I think back to a conversation--I can't remember if this ended up on EconTalk or not--but I was on a panel forever ago with Megan McArdle. I know Megan's been on EconTalk: I just can't remember if this conversation was in EconTalk or the conversation was somewhere else. I think somewhere else. But, what Megan pointed out was that as robots and other things get better at skills that humans do now, humans will succeed by their ability to do things that robots can't do.

So, a human being--it could be that there will someday be a box: you will put your hands into the box and it will give you a manicure. Maybe that box exists now. I don't know. But, many people go to get a manicure because they want the human connection with another person; and therefore, the best manicurists will not be the ones who do the best job on your nails, but they'll be the best ones who listen: the best ones who know how to show that they're paying attention and that treat you as a human being.

Now, a robot can imitate those things. And, we've talked many times in this program, whether that's, quote, "the same thing" or not; whether that robot will have consciousness. I'm a skeptic on that.

But, what your observation with respect to writing points out--and communication--is that it would be wise for people to not focus on being like everybody else, because that will soon have little or no value.

It had some value 25 years ago. It had a lot of value 100 years ago. But, 20 years from now, being able to write an unimaginative email will not distinguish you.

And so, that's a bar that you'll need to exceed, either with creativity or other things that are going to go along with it. And, I think it has a lot of implications for how we educate our children and what we think about investing in as human beings going forward.

Ian Leslie: Yeah. It's going to get less and less valuable and more and more perilous to be generic in any way--to be a generic writer, or to be a generic person, a generic thinker. Because, the machines are very good at analyzing--effectively, the kind of average, the generic model of this form of thought, this form of expression, whatever. There will be a much higher premium on cultivating your own distinctive, inimitable voice in whatever form that takes. Whether that could be literally your voice if you're a singer.

If you're a writer, of course, it's very important. I see huge amounts of generic writing out there by competent writers. But, good luck being a generic writer in 5, 10, 20 years. It's going to get much harder. You need to kind of dig deeper and really work out what it is about you: whether it's what you know about that nobody else knows about or what you care about or how you express yourself, whatever it is. Your voice is now really the most valuable thing about you.

Russ Roberts: And, to be honest, that's not so easy. I mean, it's nice for us to say so, but it's hard--

Ian Leslie: It is hard--

Russ Roberts: Almost by definition, most people are going to be generic, so it does raise--most of us are very close to average, so it's--

Ian Leslie: I agree, I agree. And, writing forces you to confront that every day, because your first draft is always--mine is, anyway--generic. Kind of bland. I kind of write what other people are saying, what other people are thinking; and then it's a process of hacking that away and reinventing, and finally you go, 'Okay, well, that's a little bit more me and a little bit more different.' But, yes, it is hard.

And, I think it's interesting to compare--we talk about musicians and other artists--because they really struggle with this all the time. As Miles Davis said, 'Man, sometimes it takes a long time to sound like yourself.' They really confront this struggle head on.

But, I think it's not just artists who are now having to do this. I think lots more of us will have to go, 'Well, hang on. What does it mean to be myself? How can I be more me?'

Russ Roberts: I want to turn to music, but before I do, I want to mention one example that I think really makes your point in a pre-artificial intelligence era: which is that when I was in the classroom full-time, 20 and 30 years ago, I would be asked to write a lot of letters of recommendation often for economics students or other social science majors who are going to go to law school. And, they took my Econ class and they got a good grade and they wanted me to write a letter for them.

And so, I'd ask them--I said, 'I'm happy to do that. Can you give me the essay that you're writing for your applications, so that I can get to know you a little bit?' And they were all the same. And, I don't mean similar. They were all the same. They were all about a terrible loss, a family loss, a personal--a death, almost always a death of--

Ian Leslie: The trauma plot--

Russ Roberts: A trauma. What'd you call it?

Ian Leslie: The trauma plot.

Russ Roberts: Yeah, the trauma plot.

Ian Leslie: You see in movies and TV all the time; and now in real life.

Russ Roberts: And so, what these students would do would be about: they would tell a story of a terrible tragedy and how they overcame it.

And, the second thing they would write about is how, when they were asked about their goals, it was almost always the same: 'I want to end world poverty.' And, sometimes I'd call them in before I wrote the essay--my essay, my recommendation for them; and I'd say--on the save-the-world thing, the end-world-poverty thing--I'd say, 'Is that really what you're planning to do?' 'Oh, no, no. Not at all, but they say that's what helps.'

And, it was obvious to me that there was a book out there--I didn't have that book--but that said, 'When you write about a challenge you face, use death.' And, they were all, as if written by artificial intelligence. They had followed an algorithm, a formulaic approach.

And, the advantage of that--again, I think it's important to mention this--just like we're talking about with ChatGPT, which in its current form is formulaic, though it may become more distinctive going forward--the advantage of that is: You don't mess up. You stay inside the guardrails and you've got a good chance of getting into law school if you've got other things on your resumé that look good.

When I'd get kids who were struggling--when I thought they were going to be challenged to get into the law school they were aiming for--I'd say, 'Don't write the essay that everybody else is writing. Take a chance. You're probably going to get thrown into the garbage anyway, so there's no harm in taking a chance. You're a long shot. Write a wild essay. Write something incredibly personal. Write something that's nothing like anything else they're going to see--because then you'll stand out.'

And, that's very scary and most people can't do it. But, my point is that your observation is a long time coming. It's a very common phenomenon.

Ian Leslie: Oh, yeah. Yeah. Yeah. And, it is a long time coming. But, yeah, as I say: it's been coming down the tracks really, this confrontation I think we're having with what it means to be human, for a long time. But, yeah, I just think that the swiftness with which AI is now able to imitate us is making us go, 'Well, hang on a minute. Are we really giving up too much of ourselves here for short-term benefits?' And, 'Is there a long-term cost--maybe quite a disastrous cost--to making ourselves so easy to replicate.' But, yeah, you see it everywhere. You see the same question playing out across so many different domains.

20:59

Russ Roberts: So, let's turn to music, because this really blew me away. People have known for a long time that a lot of pop songs are three chords: C, F, and G, if you're playing guitar at home. Sometimes, they throw an A minor in there and it adds a little more poignance. But, three or four chords are the basis for most successful pop songs. But, you have an observation about the way songs are structured, which is shocking to me: which is about a change in key and the bridge in a song. So, for those who aren't musically inclined, say a little bit about what those are first and then tell us what the observation is, the empirical finding.

Ian Leslie: Okay. So, the overall kind of observation is that pop songs have been getting generally simpler--harmonically, musically simpler; and also, lyrically simpler, by the way. Which is a point I didn't make in the piece, but there's data for that elsewhere. So, they've been getting simpler and more, kind of, predictable--if you like--on average, generally. So, there are always lots of exceptions to this and people always point them out when I make this point, but it's true. There seems to be a kind of general flattening and simplifying of the structure of pop songs.

Now, there's a few kind of bits of evidence for this, but the ones I write about in the piece, which were sort of interesting to me, are: There's an amazing kind of chart, which shows the decline in key changes in pop songs over the last, sort of, 15, 20 years. So, a key--if you are in a song, a piece of music, it's your kind of harmonic environment. You kind of feel comfortable. When you move away from the home chord, the one you start with, there's a little tug that makes you want to come back to it. And so, it's difficult to explain, but you kind of feel you're in a controlled harmonic environment.

Now--so you can change the chords without changing the key. When you change the key, suddenly, you move into a slightly different world musically and also, a slightly different world emotionally because often, there's a kind of change within the song in the singer's perspective on what's going on.

So, one of the examples I give in the piece is "Every Breath You Take," by The Police. Now, in the verse--and kind of, I guess, the chorus--you're in the same key. So he's just--he's rolling that riff: doo-doo-doo-doo-do-do-do-do. And, he's going to fairly, yeah.

Russ Roberts: [sound effects, slightly horn-like]--

Ian Leslie: Yeah, and, then he goes, "Oh, can't you see?" And, he's kind of changing the chords there, but it's still within the same key. "You belong to me." And, it's beautiful and it's lovely; but it feels quite self-contained. It's a little kind of emotionally self-possessed. And you sense there's some kind of hidden yearning or anger or kind of pain there.

And, then he switches to a different key in the bridge. The bridge is kind of like the middle section of the pop song. It's neither the verse nor the chorus. And, he goes, "Since you've been gone"--sorry, I'm not singing it right. "Since you're gone, I've been lost without a trace. Doo-doo-doo-de-doo-de-de-doo-de--see your face." And, it's like a sort of outpouring of pain; and he's almost kind of being really, really honest for the first time, and saying, 'It's been really, really awful since you've been gone and that's why I'm feeling like this.'

So, it's a good example, because you're seeing both what the bridge can do--kind of give a slightly different perspective on the story of the song. And, also, the way that musically, the harmonies work with the emotional sentiment of the song. Like, so, he changes key, which puts you in a different harmonic environment, and he changes his kind of emotional approach. He's much more kind of open and direct and passionate at that point. And, then he kind of returns to the calmer verse. And, it's one of the many things that make this a great song.

Russ Roberts: I've always thought of the bridge as sort of like a chance to clear your throat. It's for the speaker of the song to kind of step aside for a moment and maybe deliver some lines.

Ian Leslie: Yeah, change your perspective.

Russ Roberts: Change of perspective.

And then, because of the bridge, we long to come back to the chorus or the refrain.

But, your data is that it's not just that bridges and changes of key--which is what the bridge represented--are less common. They've basically gone to zero among the most popular songs that are recorded these days.

Ian Leslie: Yeah. There don't seem to be many changes of key left. And certainly the bridge--I haven't actually seen data on that, but you can read about it, and not just in my piece--the bridge has basically disappeared from modern pop songs.

Now, these two things don't necessarily go together. You can have a key change without a bridge and vice versa. But generally, there's been a decline in both. And, yeah, the key changes are much, much less common than they were, even kind of 15, 20 years ago.

And, I think that has to be at least partly because of this writing to the algorithm: writing simpler songs that get to the point quicker and then stay there. Right? Because that's the other thing: Songs are also becoming a little bit shorter, and they get to the hook or the chorus much more quickly, because they know that everybody's trigger-happy and about to click onto the next track.

And so, because of that, and because the streaming services' algorithms then respond to that, you get this kind of ratchet effect: people start writing to it. And, you just get this general, kind of, shortening and simplification of song structures.

The other reason that song structures have become less complex is that people write using Pro Tools, or other editing and songwriting software, where all the instrument tracks are laid on top of one another. And really, that's how you start to think as a songwriter: you start to think kind of vertically, like, 'What can I layer in here?'

So, you get complexity in terms of the texture, the layering of the song, but not necessarily the narrative of the song--like, 'Where am I going from here? How am I telling this story?'

And really, all those things--the bridge or the intro in those kinds of older songs, key changes within a song--they're all ways of conveying more complex emotions. You know, so that joy is tinged with a little bit of regret, or anger comes with a little bit of sweetness.

Another song I talk about a little bit in the piece is "No Reply," by the Beatles--John Lennon song--where it's a pretty miserable song about being ignored and kind of betrayed by his girlfriend. But, then in the bridge, it switches. The mood switches and he says, 'If I were you, I'd realize that I--'--it is a kind of defiance song, but there's also a lot of kind of almost optimism there. It's almost like he's saying to himself, 'Do you know what? I'm pretty good. I'm worth more than this.' And, you get that change of perspective you talked about.

The Beatles are very good at that. So, you see an explosion in harmonic complexity after the Beatles arrive on the scene, because other bands start doing the same. But, that's dropped off. There's less harmonic complexity, and therefore, I think, less emotional complexity.

And, the person who talked very well about that, whom I quote in the piece, is Sting. He says, 'For me, I'm very sad to see the disappearance of the bridge, because for me the bridge is therapy.' I thought that was a great phrase.

Russ Roberts: Well, my favorite example--I'll just have to mention this and then we'll move to the actual data for a second--is in the Cole Porter song, "Every Time We Say Goodbye." The song starts:

Every time we say goodbye, I cry a little.
Every time we say goodbye, I die a little.

And, then it goes--and I think is the greatest moment in American popular song:

There's no love song finer, but how strange the change from major to minor
Every time we say goodbye.

So, it's using the idea of changing the key from major to minor: the major, the happiness of being together; the minor, the sadness of parting. But, he uses it in the melody.

Ian Leslie: It's incredible.

Russ Roberts: It's so great.

Ian Leslie: Yeah. Yes.

30:16

Russ Roberts: Anyway, what you show in the piece is that when the Beatles were in their heyday, about 30% of the Top 100 Billboard songs had a key change. And, today, there were many years of the last 10 where it was zero. There was not a single song in the Top 100 with a key change. Right?

Ian Leslie: Right. And, it's interesting because you can see it. It kind of goes up and down a bit over the last 30, 40 years, and you can almost trace what's happening in pop music. So, it goes up after 1963 as the Beatles kind of arrive on the scene and everyone goes, 'Oh, right. So, we could write more interesting pop songs.' Right? And, then it kind of fluctuates a bit. It goes down when punk arrives. Everybody starts thrashing things and saying, 'Let's get back to basics here.'

And, then actually at the end of the 1990s, the early 2000s, there's a kind of peak there, because Nirvana were writing very complex pop songs, as were Radiohead, obviously. But, yeah, in the 2000s, it really starts to trail off and go down almost to zero.

And, yeah, I think the cost of that is you're seeing less musical harmonic innovation.

And you're also--and this is the point Sting was making in his interview--you're seeing a kind of narrowing or flattening of emotional affect as well. As he was saying, the reason he said, 'The bridge is therapy,' he meant: These songs, they go round and round and they really catch and you get into them, and then they roll into the next track on the playlist. But they don't show me a way out. It's almost like a thought in somebody's head that's going round and round and round, causing anxiety without finding relief.

And, the bridge, for him, musically, he said, 'That's what it used to do. It used to give me this different perspective within the song and make me feel this, kind of, there's hope or there's a different point of view going on.'

Russ Roberts: And, you do point out there are exceptions, even today, of course.

What you're saying--and let me try to put a positive spin on it; you may not agree with this: So, we finally figured out what people like and we're giving it to them. What they like is simple. They don't like all that complexity. They don't want the key change. They don't want multiple thoughts in one song. They want one thing, and it could just be a good beat. It's just good techno-pop, or it's good hip-hop, or it's great to dance to, and I don't need all that other stuff. And so, the Top 100 don't have that.

And, yet out in the right-hand tail, there are still plenty of creative people doing creative things, because of technology and the way distribution is set up now; there are songs on Spotify that are fabulously creative. It's just that they don't get served up by your algorithm very often, and it's harder for people to find them.

Ian Leslie: There are. I mean, but if you care about the quality of our mass culture--yes, there are all these endless niches and we can always find what we're looking for. We can find incredible complexity and intelligence and variety out there. There seems a huge amount, more than ever, in terms of just sheer volume. But, if you're looking at the mass and the average, and you say, 'Well, the average quality of our popular songs--well, it's not that it's gone down. It's become simpler and less emotionally complex,' well, I think that is a legitimate thing to worry about.

I don't want to be someone who is saying, 'Everything is rubbish out there.' It's not true. And, lots of the songs that I'm talking about, the very kind of simple ones that get to the chorus quickly and don't change key are incredibly good songs. Right? They're incredibly catchy songs and brilliantly kind of orchestrated and produced. I don't want to underrate the skill that's involved in doing that. I'm just pointing out that there's been this overall decline in complexity and therefore, emotional complexity as well as musical complexity. And, that reflects what's going on in the way we consume our music and that's related to the machines and the way we kind of respond to them.

So, yeah. The thing is, we can give people what they want, but as you know as an economist, and as anybody who's worked in marketing or consumer goods knows, you give people what they want until they don't want it anymore. Because, actually, people don't know what they want. Suddenly, they find another thing that they want; and so, it can get stale. And, I don't think it's good overall for the music industry to be so formulaic, because I think people will find other ways to find the surprise and the originality and the innovation that they ultimately want.

35:16

Russ Roberts: Well, let's go to movies, which was your next observation. You start off by making the point--which I think is again quite profound--that through most of the history of the industry, there was a great deal of uncertainty about what would succeed and what would not succeed. Movies that were thought unlikely to make it turned out to be blockbusters. Movies that everyone was sure would be a huge hit failed and flopped. But, that formula has gotten a little simpler and clearer now.

Ian Leslie: Yeah, so there's that famous maxim from William Goldman who said, 'Nobody knows anything in Hollywood.' And, that's just not true anymore, which is not necessarily a good thing. The studios essentially know that putting your money behind the next installment of a franchise that viewers are familiar with is a much, much better bet than coming up with a new idea. The studios know that if they bet on intellectual property that already has some sort of reach and is already lodged in the minds of the public, then people find it much easier to get their heads around it when they come to the cinema.

And, the top movies are absolutely dominated, more than ever before, by franchises; and in particular, of course, by Marvel and these kinds of cinematic universes. Which, again--I want to add a caveat to my more general point--many of them are brilliantly made, incredibly skillfully produced, acted, directed; the technology is incredible. These are not, you know--it's not rubbish. But it's not doing what cinema did in the 1970s, 1980s, and 1990s, which is show, I think, a much richer span of human experience, whether that's kind of falling in love or out of love.

And, Martin Scorsese talked about this a few years ago. He made what I think were pretty mild comments about how, 'Okay, Marvel is fine, but it's not really doing what I want someone to do,' which is--

Russ Roberts: He's a horrible person, Ian.

Ian Leslie: Yeah. Well, he must be a horrible--

Russ Roberts: He doesn't like Spider-Man. He's bad.

Ian Leslie: Yeah, he's bad. He's out of touch and he's looking backwards. And, people made these furious defenses of Marvel. I mean, Marvel and DC--they can look after themselves. They're going to go on being huge entertainment juggernauts.

So, I actually think he's clearly right. I mean, look at the biggest, the best movies--if we look at The Godfather: a blockbuster, very successful, and this incredible, searching examination of human nature. I just think it's ridiculous to argue that an Avengers movie has that depth of understanding of humanity, or that it's communing with us in quite the same way.

Russ Roberts: Yeah. I think I recently mentioned on the program how much I enjoyed Top Gun: Maverick.

Ian Leslie: I loved it. I loved Top Gun: Maverick.

Russ Roberts: Right. So, I watched it on an airplane on my phone, which has got to be the least cinematic way to watch. I mean, it's a movie designed for IMAX [Image Maximum]. And yet, I found it so remarkably entertaining. But that's all it is. It's just entertaining. It's formulaic to the point of absurdity; and it is utterly delightful.

And, I think of these--I'm going to let my inner snob come out for a minute. I'm not into censorship, and I don't think we ought to be subsidizing artistic movies or anything. But I do think, to your point: to me, great art is about the human heart in conflict with itself, which is a line of Faulkner's. And, that's not what is successful in our popular culture these days. What's popular is escape. And, escape is great. I love it. I consume it myself. Nothing wrong with it. It doesn't test you. It doesn't make you think. It doesn't cause you to examine what you're doing with your life. It's just two hours of escape and it's very pleasant. And, sometimes, that's all you want. But that's all you get. It's a little sad because there are--

Ian Leslie: Exactly--

Russ Roberts: great opportunities out there. And, you could argue: Well, Hollywood is doing this, but there are these incredible mini-series on Netflix, Amazon Prime, Hulu, etc. I would just point to The Bear on Hulu, which I think--if it doesn't go beyond the first season--is a masterpiece. I loved the eight episodes. It's extraordinary.

There is great storytelling still going on that is about the things we're talking about, what it is to be human. But, it is fascinating that by far the most successful popular entertainments in both music and movies are ones with nothing challenging about them. And, I don't mean challenging in the avant-garde sense of 'I had no idea what was going on in that movie,' or 'I couldn't figure out that artwork.' I'm just saying they don't quite make you question. They don't raise ideas in your head.

These other movies are just bliss. They're like taking drugs. They're like drinking. Which is sometimes fun. We all know that. Escape is fun. But, I keep thinking of Soma in Brave New World. It's like, 'Here, take this. Go for two hours over here. And, just sit down in your corner with your screen and you'll be okay.' And, that's [?], because there's something sad about that.

Ian Leslie: Yeah, yeah. Yeah, they are, by definition, generic. I mean, they've created their own--they have their own genre. They have their own universe. But, everything is within it--you are following a very complex set of rules, a complex formula for making a hit movie. And, like you say, sometimes that's what you want. But if that's all you get, I think that leads to a narrowing and thinning of the texture of your life and our lives.

Yeah, on movies: one of the inspirations for my piece was a recent article in "The Atlantic" by Derek Thompson, where he talked about it as the Moneyball-ization of Hollywood. Moneyball was about baseball, and actually he starts off discussing baseball--a discussion which went over my head a bit because I don't know much about baseball--but, essentially he was arguing that baseball has actually gotten more boring, I think, with the rise of analytics and [?]-and-run tactics and stuff. And that, essentially, the same thing is happening: data analytics are now much more influential in Hollywood in terms of working out what we're going to make and also how we're going to make it, who we're going to cast, and so on.

And, you can see there are huge advantages to that, and huge efficiencies to be made. But, at the same time, there is a cost to be paid. And, I think it's fine--we should point that out. There's a cost to be paid in terms of unpredictability. The data will always tell you what worked. It can't always tell you what will work or what might work.

And so, yeah, there is a cost to be paid in terms of predictability and lack of surprise--an excess of formula. And, I think, a lack of emotional or spiritual depth.

Russ Roberts: Yeah. Before I forget, I want to put in a plug for William Goldman, who you mentioned earlier. He is the screenwriter of The Princess Bride. He's the screenwriter of Butch Cassidy and the Sundance Kid, and other successful movies. And, he wrote a memoir about that experience called Adventures in the Screen Trade--I think that's the name of it. I will put a link up to it. It is magnificent--if you're interested in movies at all, you will enjoy it tremendously.

I happened to stumble on another book he wrote called The Season, where he attended, I think, every Broadway play or musical that opened in a single year of Hollywood--excuse me, of Broadway. And, many of those plays and musicals have been lost in the miasmas of time, but he's still incredibly fun to read. He's such a thoughtful and interesting writer. He has another memoir, which I don't think I've read, but I just want to recommend him. He writes very thoughtfully on these subjects.

Ian Leslie: Yeah. Adventures in the Screen Trade, I know, is a great book. Yeah. Terrific.

43:54

Russ Roberts: Okay, so here are these two old guys. I'm older than you, but I'm pretty [?]. But, here are these two old guys complaining about popular culture. Your piece is not a tirade against popular culture at all, or a screed. It's a thoughtful observation about where we are: algorithms have taken some of the uncertainty out of the two most popular forms of popular entertainment, music and movies. And as a result, there's a certain sameness, a certain formulaic aspect to them. And, we could debate--as we kind of hinted at here--whether that's a good thing or a bad thing. I mean, I don't want to--I'm a real--I'm a big snob, so I think Top Gun: Maverick is a horrible movie, but I will admit I enjoyed it.

So, is there anything else to say, here? I mean, other than to make the observation that ChatGPT writes a pretty good movie plot. I've seen a few of those on Twitter already. And, if we all get accustomed to watching certain kinds of formulaic movies, human beings won't be very necessary for writing those.

[SPOILER ALERT] In fact, some of the best lines of Top Gun: Maverick--for example, 'Don't give me that look.' 'It's the only look I've got.' Those kind of lines, it's fun. And, to see Tom Cruise deliver it is always entertaining, even if he looks somewhat formulaic himself. He's undergone some algorithms to keep himself looking young and pristine.

Is there anything--you know, so what? So, there'll be fewer human beings writing movies because AI will be able to write the movies. They'll be able to write the dialogue. They'll even be able to figure out ways to surprise us because they'll learn how to do that. And, should we bemoan this really? Unless you're planning to be a screenwriter--there's a lesson here for sure--is there anything else to say?

Ian Leslie: Well, only what I've already said, which is: if you're relaxed about the fact that movies are becoming essentially less human, that there's kind of less of the richness and complexity of the human experience captured in a popular entertainment--if you are okay with that, then that's fine. I'm not so much. I would rather popular entertainment was as rich and satisfying and complex as we know it can be at its very best. And, sometimes still is, but certainly has been over the last sort of 50 years--right?--in music and the movies.

And, I think the kind of inevitability argument--there's always this kind of sense of, 'Ah, but it's just the way things are going. There's no point in--' Is it [?]? And, actually, it's naïve, and it's sort of illogical. It doesn't make sense. You know. And, it takes away our agency, our human agency, where we go, 'Well, hang on a minute. It's up to us here. We don't have to just roll over and let this happen.'

The other area--I don't know how much you want to get into this--but the other area I look at briefly in my piece is politics and that kind of public discourse. And, the way that, even there, you can see a kind of roboticized mode of discourse--particularly on social media, of course. Just look at people on Twitter: I'm like, a bot could very easily imitate you. Sometimes, you'll see a controversial post and you'll look at all the reactions to it, and the people are all saying the same thing. And, sometimes the point they're making is a perfectly good one, but it's just remarkable how everybody responds with the same point, using the same phrase, the same words.

And, again, I'm just struck by the fact that it's not necessarily that bots are getting so much better at imitating us. It's just that the first thing that's happening is that we are becoming more bot-like. We're kind of making it easier for them.

And, sometimes, on Twitter--and our political discourse more generally--I do feel like we're becoming more binary, more algorithmic, more kind of everybody has to fit into this box or the other, otherwise, the system just doesn't understand you.

Russ Roberts: Yeah. I think it's worth thinking about the deeper question, what it means to be a human being. I think about my conversations on WhatsApp with my wife, say, or family members. I have certain icons, emoticons, I like to use in response to things that they post. And, I'm sure after a while people know--my kids and my wife know--what I'm going to respond with. They could do it for me. And, hey, I could get a bot to respond for me, so I wouldn't even have to read those posts from my kids and wife. I could just be happy putting my thumbs up or my little party popper or my little happy face with the three hearts. And--

Ian Leslie: Yeah, I think it's a great point. Yeah.

Russ Roberts: I think if we're not careful--and I love WhatsApp, by the way. I think it's an extraordinarily nice way to interact quickly, and with a little bit of emotion, with the people I love and can't be with every minute of the day. But, if that's all we are, it starts to remind me of the Nozick experience machine, where you hook yourself up to a machine and you experience what feels like reality--that you're doing whatever you pre-programmed it to do. You can pre-program it so you become a great rock star, or you cure cancer, or you become President, or you win a Nobel Prize--whatever your dream is. You win the World Cup representing Argentina wearing number 10.

And, you experience all the emotions of those experiences, but they're not real. But you can't tell that. They feel real to you.

And similarly, I could WhatsApp back and forth with my wife for 10 years, or my kids; and perhaps they're just using a bot, not really reading what I say; and it feels real. And you could argue that's okay. But, I think deep down we all know it's not okay. That what--

Ian Leslie: It's not okay! It's not okay. Though, is it? Yeah.

Russ Roberts: It's not. And, I think it's not who we are.

Ian Leslie: But, I mean, but I don't want to sound too doleful about it because I'm not. I see it as more like a bracing challenge to us. Right?

And, it's great that you've taken it right down to the level of our kind of everyday lives. I think that's exactly right. When you get an email from somebody, maybe it's a lovely, very human email from a nice person and at the bottom you get a series of options from Gmail saying: Do you want to say, 'Great job?'--

Russ Roberts: 'Me, too'--

Ian Leslie: Or whatever it is. 'Me, too!' And it's, like, 'Okay.'

Now, the interesting thing about that kind of thing is, like, maybe you were just about to reply with something equally generic. So, maybe you want to think twice about that, put a bit more thought into it, and be a bit more human.

And so, that's the way I kind of see this whole question, really. Not as, 'Oh, it's terrible. We're so boring and we might as well just outsource everything to the machines.'

I see it as: these brilliant machines, brilliant technology, throwing the question back to us, going, 'Okay, so what have you got? You do better.' It should be a raising of the bar. They're raising the bar for us and vice versa: by being more human, we make it more difficult for them to imitate us, so we raise the bar for them. And, that's a more, kind of, virtuous loop than the one that I think we're in right now.

52:16

Russ Roberts: Yeah. It's a fascinating thing to think about, though, which I had not until this moment. Somebody recently sent me a Keynes/Hayek rap video written by ChatGPT. So, John Papola and I wrote, quote, "the real one," about a decade ago--a couple of them. And, of course, the more of those we write, and the cleverer they are and the more human they are, the more material AI has to rip us off. You know. So, the version somebody sent me--or posted on Twitter--it's pretty good. It looks like they stole a couple of lines. They have literally stolen a couple of themes from us.

And, the part I think we have to confront is that I didn't think that would happen in my lifetime.

I didn't think there would be poetry or song lyrics or anything remotely as clever as what I'm seeing in this incredibly embryonic phase of this revolution. And, as we try to match them and outdo them, they do learn from us and they will get better. And, that raises the bar even higher.

And, I think the more important point--now, if I think about it: I've written many poems for my wife on her birthday or on Mother's Day. Or my dad: when we'd come visit him, he'd write dozens of poems and put them all around the house for us, for each kid and our kids. And they were lovely. And they moved us.

And, if you told me, 'Oh, yeah, well actually he paid somebody to write them for him,' or 'He outsourced it to an artificial intelligence agent,' I would be sad. They might be better, but they would not be his.

And, to me it's not so much that these technologies challenge us to be cleverer or more innovative. It's: They remind us that they want--we want--reality. I think you're not an avatar. I think you're actually Ian Leslie. I read your essay. I think you wrote it yourself. But, if you said to me, 'Actually, I didn't write that essay. I asked ChatGPT to write an essay on the implications of ChatGPT for music. And, it was so clever: it came up with this thing.' And, I might praise you. You might get honor and glory for it even though it was written by a machine. But it wouldn't be you.

And, for me, a lot of what this is telling me--what I'm taking away from this conversation and your essay--is not so much we have to be better than the machines. We just have to be us.

I think authenticity is--it's going to be harder and harder to tell what's authentic. And, that to me means that a lot of these back-and-forths will not be as emotionally powerful as they used to be, because we'll assume, correctly, that many of them were not written by a human being. They were written by a robot, some AI.

And so, to me it's going to put more of a premium on us having a drink together. And that's okay. I think it's reminding us that that is what really matters. Not how clever our words are, not how creative, but that they're ours.

Ian Leslie: Beautifully put, and I really couldn't put it any better. As you say, it's not about being better. It's about how can we be more us? How can we be more human? And, really, my essay is just a call for us to actually once again return to that question. It's been asked for a long time. But, in the light of what's happening now, yeah, I think we need to ask it with more urgency than ever before.

Russ Roberts: My guest today is Ian Leslie. His Substack is The Ruffian. And, thanks for being part of EconTalk.

Ian Leslie: Thank you so much, Russ.

