Leonard Wong on Honesty and Ethics in the Military
Apr 27 2015

Leonard Wong of the Strategic Studies Institute at the U.S. Army War College talks with EconTalk host Russ Roberts about honesty in the military. Based on a recent co-authored paper, Wong argues that the paperwork and training burden on U.S. military officers requires dishonesty--it is simply impossible to comply with all the requirements. This creates a tension for an institution that prides itself on honesty, trust, and integrity. The conversation closes with suggestions for how the military might reform the compliance and requirement process.


READER COMMENTS

Gene Boyer
Apr 27 2015 at 8:28am

Russ, "bath salts" is a generic term for synthetic drugs designed to circumvent drug laws. The only relation to actual bath salts is that they often look similar. http://www.m.webmd.com/a-to-z-guides/features/bath-salts-drug-dangers

Brant Gunther
Apr 27 2015 at 10:56am

As an Army officer and voracious consumer of EconTalk, I especially appreciated this discussion. I will try to truncate my comments, as this is the topic of many weekend gripe sessions in the backyard.

I feel I have a unique vantage point on this topic, as I was an ROTC cadet and commissioned before 9/11. I was a company grade infantry officer for eight years during the Global War on Terror, and I am now a reserve component field grade logistics officer. I currently own a residential construction company. Navigating Army regulations is exponentially more onerous than navigating OSHA/EPA/building codes.

My peers and I have always felt that the requirements placed in our rucksack were almost impossible to complete given the time and resources provided. We were often told by our superiors that we were being paid to “figure it out”. The consequences for not completing mandatory training/events were detrimental to a young officer’s career. We simply had to prioritize and take a risk.

Many times we felt that serving in a combat zone was a break from the “garrison” requirements of an Army on a war footing but not deployed. The stress of preparing to deploy, spending fleeting time with family, and accomplishing the ancillary requirements at home station was overwhelming.

After spending considerable time in the inactive reserve, I joined the reserve component as a drilling member. What I didn’t appreciate while on active duty was that nearly all the requirements are the same for active and reserve units. The reserve units have a little over 10% of the time to accomplish them.

@Dr. Wong, was there much mention in your discussions with other officers of the increasing “AR 350-1” requirements as our force changes?

@Dr. Roberts, as per SOP, another outstanding podcast.

Jason W
Apr 27 2015 at 11:40am

This is great; it applies to large corporations too.
A lot to think about here. Thank you.

Jim Kennedy
Apr 27 2015 at 12:53pm

Fantastic talk. Lots of great things to learn here. The application goes far beyond the military. I was talking to a retired public school elementary teacher who had taught special needs children for 39 years. The paperwork went from a couple of pages per child to about 80 pages per child! (She was subject to that; it was not her choice.) The health care industry is going from ICD-9 billing codes to ICD-10 codes. Instead of one ICD-9 code for stitches on a cut finger, you have to specify which hand and finger. The intent is more accurate data. In all likelihood the doctor will just pick a hand and finger and always use that (a lot of right index fingers will get “fixed”). A false sense of accuracy.

Great podcast. Thanks everyone.

Frederick Davies
Apr 27 2015 at 3:54pm

That was a great talk. While listening to it, I was strongly reminded of a book I read some time ago which chronicles a similar “wrong military culture created by bureaucracy” and the failures that flowed from it:

The Rules of the Game, by Andrew Gordon (1996)

I hope the US Army sorts this out, because last time this “organically-created culture” resulted in the loss of several thousand young men’s lives.

FD

David Reed
Apr 27 2015 at 4:00pm

Great episode.

Dr. Wong said imposing more requirements than can possibly be met results from good intentions. But in my experience in civilian Federal government, impossible requirements are often imposed knowingly and intentionally. The official who imposes the requirement can claim to have done something about the issue at hand, and if she knows that those subject to the requirement will report compliance even though it is impossible, then she can be confident her sham solution will not be exposed.

Cowboy Prof
Apr 27 2015 at 4:50pm

Russ should consider playing baseball for his beloved Chicago Cubs because every week he loads up the bases and hits a grand slam!

I loved this episode because of its widespread application. Actually, my one comment might be that you missed some of the opportunities to see how widespread this phenomenon is. While it definitely applies to military culture, and as Jason notes above to corporate culture, it also applies to government writ large. As more and more rules are put into place, it becomes more and more difficult to both know and adhere to those rules, making it more likely that those rules will be ignored or undermined.

The interesting part of the podcast that I thought was underplayed was how more rules are added to deal with rules that are being avoided, which in turn creates an incentive to create a parallel system of governance. It is as if increasing centralization creates a more market-oriented informal sector — i.e., markets tend to win out! The “informal interpretation” of all these rules (which is in reality an avoidance of the rules) really devolves down to local interpretations of context and certain negotiations based upon the best information available, which kinda sounds like a market.

Trent
Apr 27 2015 at 6:46pm

Cowboy Prof,

Before Russ falls off his chair, it’s his beloved Boston Red Sox. But I agree with you on your point – another podcast that’s a hit!

John Doe
Apr 27 2015 at 7:37pm

This was a great podcast and excellently presented, but deep down I cannot help but disagree with Col. Wong’s conclusions. He is certainly correct that prioritization has occurred and that direct answers are neither desired nor forthcoming from the officers. As many other great comments point out, my mind was running through all the other bureaucracies that place unmeetable requirements that don’t enhance real outcomes. This applies to schools at all levels, local, state and federal government, charities, private firms, and homeowner associations.

But I would not call this anything new. I question whether things are worse than in the ’80s, when Col. Wong graduated from West Point. As I have aged and gained wisdom, I see more that is fubar, but it is with age that one can distinguish directions that are individual whims from logical, well-thought-out orders. It is hard to imagine a time when a chain of command did not defer to individuals in the chain to decide how to accomplish their goals.

That is not to say Col. Wong did not hit the nail on the head when he described the electronic signature as too easy to request. But it does not take a four-star general to figure out that if you really wanted to know whether all your enlisted men went to bath salts training, you would require a sign-in sheet, list the instructor, and have a verifiable exit exam. My point is that the brass does not want to know whether the men were trained; they want plausible deniability.

The second part, which is more the reality, is that good officers at any level know who in their command is meeting their goals; the detailed rules are mainly used to take formal action against those who are not assets to their commanding officer. This is no different in the non-military world. There is a required social role of saying you’re following all the rules, but no real mechanism to ensure they are followed. I acknowledge that the more askew the rules are from the mission, the more difficult the job.

The hand-wringing about white-lie erosion seemed plausible, but it was not supported with evidence. There are too many examples of the need for nuanced responses to keep projects on track. It is not that there could not be erosion of public trust, but I would say today’s military is head and shoulders above the level of trust at the end of the Vietnam War.

I do not believe significant change can occur with programs like Six Sigma or others, because in bureaucracies the need to improve is weaker than the lack of incentives to track or transform, not to mention all the previous programs that have failed to change the institution.

I was also disappointed that Col. Wong thought professionalism made Army officers different. These are men who take great risks, but to think they act materially differently from other people when facing incentives seems to wish for a solution rather than to find one.

Finally, I commend Col. Wong for his courage. I have disagreed with some of the methods, but the strength to write the truth in the face of bureaucracy is no small feat. I do believe that the ebb and flow of truth and lies washes back and forth; Col. Wong’s efforts will likely, at least temporarily, push the tide in the right direction.

Antonio Rafael Garcia
Apr 27 2015 at 11:15pm

Political strictures build up within an organization, accumulating like barnacles on a ship and slowing it down in both responsiveness and efficacy.

Mancur Olson, anyone?

And yes, it’s the same way in the US Navy as well… even in the Navy’s nuclear propulsion program where I served. It’s maddening and sad in terms of both emergent culture and actual operations.

ToddR
Apr 28 2015 at 9:01am

Excellent show. The phenomenon discussed applies to civilian/private sector business as well.

Gerard Decatrel
Apr 28 2015 at 10:13am

In training, they should give examples of the types of successes that were achieved due to detailed and correct storyboards as well as failures that could have been avoided if storyboards had been done properly.

When the announcement of a new requirement at an “all officer’s meeting” was openly met with groans, I vividly remember saying that someone should total up how much time all our requirements take and forward it up the chain. The C.O. gave me a look that said “you poor, poor fool . . . ”

When requirements are created to respond to a particular problem (for example, drunk driving), the requirements should be reduced if specific results are achieved. This allows units that are able to achieve the results more efficiently to avoid the requirement.

There should be a group whose only job is to streamline and eliminate obsolete orders. Congress should do the same.

Carl Pearson
Apr 28 2015 at 12:53pm

My own experience is in the Navy, and we are certainly no better. Indeed, to echo the gentleman earlier commenting on the nuclear service: I served in the organization overseeing that community, and the rules were definitely the religion at headquarters. Often, I worry, this dogma was detrimental not just to operational concerns but also to genuine attention to the terrific technology “under the hood”, and thus to the safety of the crew and proper stewardship of the taxpayers’ equipment.

Hopefully, Leonard’s work will survive the Pentagon’s spin machine to make it out to all of the services.

I also see similar problems in my work now in academia. Some in the scientific community might imagine that these issues don’t affect us: after all, we have a similar pretense of higher moral standards. But I see a disturbing parallel in how universities treat researchers and instructors (perhaps in ways beyond their control, particularly at flagship state schools), and in the trickle-down effects on publication, peer review, and teaching that corrupt the actual work.

Tony p.
Apr 28 2015 at 7:47pm

EconTalk is and has remained awesome for so many years. This specific podcast is another example. Thank you, Russ and Col. Wong.

Mateo
Apr 29 2015 at 12:33am

Great show. So many important implications. One that is important to me is that too many rules mean they can neither be understood nor followed.

“Useless laws weaken the necessary laws” – Montesquieu

Andrew
Apr 29 2015 at 3:56am

One of your best podcasts, Russ. Mr. Wong is the kind of person who doesn’t get heard often enough (and I know him from only this podcast).

Todd Mora
Apr 29 2015 at 8:31am

Russ,

Another great podcast! Anyone who works in public education, social services, healthcare, or another highly bureaucratic organization knows all too well about having to manage overwhelming demands. This occurs when the person creating the requirement has no responsibility for actually doing the work.

Shawn Barnhart
Apr 29 2015 at 12:27pm

It’s certainly tempting to blame this on information technology to a large degree.

Historically, the elements of an organization originating reporting requirements were limited in their mandates by their own ability to collect, organize, and process the results. To be sure, “the home office” often developed extensive bureaucracies to mitigate this, but when physical documents were the only option, there were still practical limitations of space, time, and personnel to collate and distill this information.

I think those who mandate and consume information-technology-derived reporting develop an “ease of use” bias — to them, adding an additional field to a report or a column in a spreadsheet is trivial, which masks the time and complexity required of those who provide the information to obtain and enter each additional element of data.

SaveyourSelf
Apr 30 2015 at 4:56am

It seems Leonard Wong has discovered that the US Military has some Monopoly-style outcomes [like low quality, low quantity, high price, and absent or negative incentives to change]. He does not use those terms, but he gives enough examples and analogies that I think his concerns fit that spectrum. His goal is noble—remove that which rewards dishonesty from the military. And his solution—get every level of the military hierarchy to admit there is a problem—is likely to help, but unlikely to solve the problem satisfactorily.

The solutions to the problems created by a Monopoly are 1) competition, 2) diffusion of knowledge [honesty, transparency, common language, etc.], 3) voluntary participation, and 4) Justice.

With regards to those solutions, the US Military has a lot going for it. 1) For the warriors close to real or potential fronts, there exists real competition. 2) Each member of the military has access to all the information present in the American economy. 3) It is an “all-volunteer service.” 4) The Military has a long history of equal application of Justice to all its members—called the “Uniform Code of Military Justice”. Despite all that, the US Military still has Monopoly-style problems. How come?

First, during times of peace, there is little true competition. True competition is necessary to determine which ideas are useful and which are crap. Without real competition, the ideas are all crap or all useful, depending on whether you are making the decisions or receiving them. Second, distance matters. During wartime, Generals live in the field with their soldiers, which is ideal because it allows them to physically see the outcomes of their decisions. Their knowledge is local in that the decision maker and those affected by a decision are in close proximity. During times of peace, however, those same Generals live and work in Washington DC, which is why the Pentagon is such an absurd concept. Third, once enlisted, the activities of soldiers are no longer voluntary. Soldiers can literally be executed for failure to obey directives. The “all-volunteer Army” is not so voluntary after all. Fourth, knowledge is often hoarded. Transparency is frequently abandoned behind a “need” for operational secrecy.

So how to improve the current US Military? Probably the simplest way is to make obedience to any and all orders during peacetime voluntary. Thus every order would have to carry with it its own justification for existence. Stupid orders would be ignored/abandoned/discarded/rejected. They would fade away quickly, along with the people who gave them. Another simple solution is to increase peacetime competition. Have more war-games with foreign militaries. Break the domestic Military up into small, independent groups which compete with each other on a regular basis. Publish the results of those war-games for everyone to see. Finally, flatten the military hierarchy. Bring the upper-tier decision makers closer to the front lines. Have fewer upper-tier decision makers. Get out of Washington DC.

SaveyourSelf
Apr 30 2015 at 5:22am

Cowboy Prof wrote, “The interesting part of the podcast that I thought was underplayed was how more rules are added to deal with rules that are being avoided, which in turn creates an incentive to create a parallel system of governance. It is as if increasing centralization creates a more market-oriented informal sector.”

  • Thanks, Cowboy. I’ve never thought of it quite that way. Your description reminds me of some advice I got when I was young from my father along the lines of “never piss off a secretary. The informal secretary network is the most efficient way to get work done. Piss off a secretary, and life becomes very slow and very painful.”

John Doe, “…the detailed rules are mainly used to take formal action on those who are not assets to their commanding officer.”

  • That has a strong ring of truth to it. It leaves me feeling cold and vulnerable though. I don’t think I would like to live in the world you are describing.

Gandydancer
May 6 2015 at 6:02pm

SaveyourSelf writes: “[Wong’s] solution—get every level of the military hierarchy to admit there is a problem—is likely to help…” Sorry, no chance, no way. The military is as top-down an organization as exists, and blaming the lower levels AT ALL for dishonesty when dishonesty is what is rewarded is a cop-out. The only solution to required dishonesty is to not require it, and there’s fat chance of that when the brass in charge are those who advanced by being the most effectively dishonest. Think of who got to be in charge of the institutions responsible for vetting the value of derivatives — those who could inculcate the most irresponsibility in their subordinates. Same phenomenon.

P.S. Did Wong actually get training on “bath salts”, or did he just hear about it from individuals who hadn’t taken it in or didn’t communicate what they’d learned? “Bath salts” entered my consciousness when the allegedly hopped-up guy in Florida kung-fu’d the cops and ate a homeless guy’s face before being shot to death, but neither Roberts nor Wong seems to have encountered accounts of that event.

I like the observation upthread that all these bogus requirements are bureaucratically useful in getting rid of disfavored subordinates by selective audit.

The main takeaway perhaps should be that the honor culture of the military is at least partially a myth (and likely to remain so), but perhaps a useful one. But it may be doomed to get less and less plausible. Anyway, the whole country is succumbing to organizational sclerosis and I don’t know that the downward spiral in military effectiveness will beat any other factor to the crash.

Krzysztof Ostaszewski
May 9 2015 at 12:11pm

One of the best EconTalks ever. Absolutely outstanding. Excellent. Summarizes the essence of the changes that happened in American culture over the last quarter century.

Tarun Abhichandani
May 15 2015 at 3:44pm

Thanks for a great talk. Would it be possible to share the paper authored by Leonard Wong and Stephen J. Gerras? I am not able to access the link outside of the US – it is blocked.

Thanks a bunch,
Tarun

Dan T
May 16 2015 at 5:39pm

Regarding negligent discharges – I was a non-commissioned officer (NCO) in Afghanistan. While preparing for a patrol, one junior servicemember had a negligent discharge (ND) within our base. Two things were true regarding this ND. First, it is a serious matter. Someone could have been seriously injured, and fortunately no one was. Weapon control is one of the first things hammered into us. Second, if this had gone through formal disciplinary channels, this servicemember would have faced punishment disproportionate to what was warranted.

The ND occurred near our command hut. In about 10 seconds every officer and senior NCO was at the area asking questions. I gave an obvious lie to this group. I stuck with that story while being questioned from every angle about the incident. Everyone knew what I was doing. And that was the story that stuck. Later that night the servicemember had some corrective instruction in the proper use of his weapon. He also “volunteered” to fill 1000 sandbags over the course of the next week. He never had a problem with ND’s again.

Brian
May 20 2015 at 6:17pm

I remember being upset at the rules of engagement issued to our soldiers in the field that changed under Obama. Our own police forces did not operate under limitations even that strict in hostile situations. I was so upset at how the rules of engagement were endangering our soldiers’ lives in combat zones with hostile populations.

Then a Marine started telling me war stories. He basically told me that they ignore the rules of engagement and do what is safe, and the officers take care of the paperwork. Basically, someone with a cellphone on a roof or on a dirt bike is a recon unit. The rules of engagement said you could not fire on them. They would do it anyway, because the only people who did those things were enemy soldiers. So anytime they saw behavior that experience said was a prelude to a firefight, they would shoot first and the officer would take care of the paperwork.

So when you have rules and regulations that needlessly put you and your mates at greater risk, you are not going to follow them. Any decent, moral person who cared about their fellow Americans’ lives would do the same. However, it does become a slippery slope, because the justification for lying on paperwork to keep the men in your unit safe can then be used in less serious cases.

The same Marine told me that on many nights they would see Afghan soldiers bring little 9- and 10-year-old boys into their camp at dusk. They would then hear the little boys’ screams all night as they were raped. They would then have to go on patrol with those soldiers the next morning and exercise extreme self-control, because all they wanted to do was kill the Afghan soldiers they knew were involved in the rapes.

So I think the military brass has a lot bigger morality problems on its hands than lying about paperwork being filled out correctly.

AUDIO TRANSCRIPT

 

0:33 Intro. [Recording date: April 15, 2015.] Russ: Leonard Wong, along with co-author Stephen Gerras, is the author of "Lying to Ourselves: Dishonesty in the Army Profession," which is our topic for today. Lenny, welcome to EconTalk. Guest: Thanks, Russ. Russ: I'm going to start with a quote from the summary of the paper. It says the following:
This study found that many Army officers, after repeated exposure to the overwhelming demands and the associated need to put their honor on the line to verify compliance, have become ethically numb. As a result, an officer's signature and word have become tools to maneuver through the Army bureaucracy rather than being symbols of integrity and honesty. Sadly, much of the deception that occurs in the profession of arms is encouraged and sanctioned by the military institution as subordinates are forced to prioritize which requirements will actually be done to standard and which will only be reported as done to standard. As a result, untruthfulness is surprisingly common in the U.S. military even though members of the profession are loath to admit it.
And the summary closes:
The Army profession rests upon a bedrock of trust. This monograph attempts to bolster that trust by calling attention to the deleterious culture the Army has inadvertently created.
So, before we delve into the paper, I'd like to get some of the background. How did it come about and where did the evidence for the claims that you make in the summary--where does that come from? Guest: Well, the study started actually almost a decade ago. I did a different study looking at innovation in the army's junior officers. And at that point I went around with a team of officers; we tried to catalog every single requirement we put on company commanders. Those are junior officers who lead about 150 people. And what we discovered is that there's just too many requirements that can be literally accomplished in a given time. So, we said, if you add up all the requirements, it exceeds the amount of time that company commanders have to execute them. And so that was the end of that. But always in the back of my mind has always been--so, what happens? They can't do it all. Russ: How do you make 25 hours go into 24? Guest: Exactly. So, how do we deal with all this? So that's always been in the back of my mind. But it wasn't the primary focus of that study. And so then the question became: So what do we do? But then I was also reflecting personally, in that the army, like the rest of American society has become an audit culture--in other words, we measure everything and we audit a lot of things. So I am subjected to a lot of things like: Have you read the Information Awareness 1900 Board Permission before you get on your computer? And I always put: Yes, I have. When I really haven't. Russ: You're the only one. It says, by checking this box you agree that you have read all of it. Guest: Exactly right. Or, a firm in England that they offered free WiFi, and they put out--this is an experiment they did, and part of the agreement was, 'I agree to give over my firstborn.' And everyone signed it. So it's very common. It's not like the army is so weird. We're just like society. And the army has created that. And so what I discovered was: Are we kidding ourselves? And so then my colleague Stephen Gerras and I said, 'You know, we should really go around and ask people how they are dealing with all of these requirements.' And so we interviewed captains, which are junior officers, at Fort Benning and Fort Lee. We interviewed majors, which are mid-range officers, at Fort Leavenworth, Kansas. And then colonels and lieutenant colonels here at the Army War College in Carlisle, Pennsylvania. Then we also went to the Pentagon. And listened to all those people. And it was very eye-opening, and very disheartening. And yet there's a lot of goodness out there, too. Russ: And how many people did you interview? Roughly. Guest: It wasn't really interviews. It was discussions. And so, we didn't sit down and force people. But roughly, I'd say about 120. Russ: And when you talk about requirements, I think in everyday English we would call them rules and regulations. These are a mix of compliance forms that people have to fill out that attest to having accomplished some task or having something on hand. Give us a measure of the range of paperwork and requirements that we are talking about. Guest: It could be--we call it 'directed training' is where this whole thing starts. It goes beyond that. But it starts with directed training. So, directed training would be everything from you need to be qualified on your weapon to you need to sit through a class on sexual assault, or you need to view an online course on the dangers of bath salts. Russ: Did you say 'bath salts'? Guest: Bath salts. 
Russ: Is that a joke or is that serious? Guest: That's serious. Russ: What's the worry there? Guest: Because the use in America are--being afflicted with, being enamored of bath salts, using those as artificial stimulants. Russ: Okay. Guest: Someone said, 'We've got a lot of young people in the army. They need mandatory training--make it training requirement--that they have to have annual training on bath salts. Or human trafficking, or the dangers of, you know, cyber-awareness. Or anything. The problem is the army is--we like to create requirements. We're great idea-generators. And so more and more of these requirements get placed on those at the bottom. But none are ever taken away. Russ: Yeah. So this includes things that are directly relative to combat, things that relate to the culture of the unit and the organization itself. It's also, having read the paper, it's also things like supplies on hand, what you do with cash you were given. So these are both verifying that things happened plus keeping track of stuff, right? Guest: Exactly. It's a wide range. You have your administrative things, but you also have things related to the mission, related to--are you taking care of your soldiers? And so, it's the system trying to do what leaders normally do, but then it just becomes overwhelming to those on the bottom. And so what happens is that those on the bottom are left making decisions: Which ones am I going to do? Or, which ones am I just going to say I did, because I can't do them all?
7:22 Russ: And give me your overall impression. Again, it's by definition anecdotal. You didn't survey 25% of the U.S. military. Guest: Right. Russ: You talked to a bunch of folks. How representative are the stories you talked about in the paper of the 120, first of all? And then, how representative do you think those 120 are of the military as a whole? Guest: I think it's fairly representative. Actually I think it's very representative. And the reason I think that is because after the paper came out--now[?] we're getting into a different topic here, but it was very interesting. Because at first, exactly as predicted, it really riled some people. Because who wants to be told, 'We routinely evade the truth'? Nobody wants to be told that. But then as time went by, then suddenly people started to not deceive themselves and say, 'Yes, we've fallen into this, just like the rest of America falls into it. But we hold ourselves to a higher standard, and we're letting that standard go.' So then people just began talking about it. And now it's being pushed not by the bottom, but now it's being pushed by those on the top. And so, is it representative? I think it's a pretty good indicator it is representative, because now it's being pushed by the highest levels of leadership, not just those on the bottom. Russ: And I guess in some empirical sense the fact that compliance, 100% compliance, is impossible; and yet you routinely discover that--you routinely find that in the reporting--suggests that something has gone wrong. Guest: Right. If compliance is impossible then all these good stats can't be--something is wrong. Russ: I find it fascinating--you talk in the paper about, that most of the people, when you originally raised the possibility that there's this dishonesty going on, they are very hostile to that. They bristle, they push their defensive. But then it emerges that, 'Well,...'. So, talk about that process, how that would come about in the conversations. Guest: Well, like I said, these were discussions. They really weren't interviews. It would be a group of 8 people or 10 people, and we'd talk about how many requirements there were out there, and there's so many. And they'd all agree with that. And I'd go, 'So, do you guys ever lie?' And right then, you could hear a pin drop. Because who in the world would ever say, 'Of course I do'? And that's what I was trying--and they'd say, 'Of course not.' Especially army officers. Army officers are very moral people, because the business that the army is in, you cannot have people that you don't trust. And so then-- Russ: Lenny, you're a former army officer yourself. Guest: Yes. Russ: That's important. Right. You're not like an academic who has never been in the army. Guest: I'd like to say I'm like you, Russ. Russ: Sorry. Go ahead. Guest: So, army officers have a moral identity of extremely high because trust is critical to soldiers and trust is critical to society. The whole profession is built on trust. And so, for me to say, 'Do you guys ever lie?' it's like--'Why are you asking that?' And then we start saying, 'So how do you deal with all these requirements?' And then they start saying, 'Well, you've got to learn to prioritize.' What does that mean? So then we'd push it further and further, and it came down to--literally I had some people put their head in their hands and say, 'Okay, fine, I lie.' But nobody wants to say it. Because what we do is we cover it up with euphemisms. Or we offer rationalizations. 
And what's going on is there's something that Ann Tenbrunsel out of Notre Dame came up with, a term called 'ethical fading.' And what ethical fading is, is when the moral implications, the clear 'this is right and wrong,' when all that starts fading into the background and it stops being an ethical decision--it stops being an ethical dilemma--and then turns into an administrative decision or a way to do business. And so when we click on 'Agree' that we just read the 16 pages of lawyer-speak when we really didn't, we don't think of that as lying. That's just what everyone does--it's the way you do business. That's the way you get by[?] free WiFi. And it's not an ethical dilemma.
11:50 Russ: Nobody's hurt by it; in fact, you give a lot of examples in the paper which long-time listeners will recognize as classic cases of self-deception, an issue we talk a lot about here on EconTalk. Which is: Well, I didn't do it for me; I did it for the troops. Talk about some of the examples, some of the rationalizations and why that's problematic. Because, you could argue they are just [?]--this is staple stuff in war movies and TV shows--the commander [?] stand up. Guest: The persona of an army officer is a selfless servant. And usually army officers view themselves as focused on two things: the mission and the troops. And everything we do is geared for the mission and the troops. And so, if you put down a deluge of requirements then you are harassing the troops and we can't get the mission done. So, my job as a leader is to form a shield around the soldiers and say, 'You know what? We're not going to spend time on what I call dumb requirements.' So I'm going to say that we did them. And then we'll move on and do the important things. So right there, there's a crack in the ethical framework. But we don't view it as lying. We just view it as protecting soldiers. Or, for a really stark example, I had one captain tell me that, you know, an IED, an Improvised Explosive Device, went off, injured both lieutenants; and you are supposed to fill out a report saying how big was the blast, how far away were the people from the blast, so you could measure traumatic brain injury. And he said, 'You know, I falsified that because I didn't want to leave that unit without leaders. Because they would [?] evacuate them if I put the real numbers in there and I didn't want to have my soldiers without leaders.' And so, he said, 'So I told a lie on that one.' Other officers said, 'You know, we couldn't get hot showers for our soldiers in Afghanistan. The only way you get them is to get this money, and finagle it through the system, and it wasn't right but we got hot showers for the soldiers.' So those are--you could sympathize, you could empathize, but what it does, it creates a culture where ethical fading kicks in and you start becoming numb. And then the dangers of all that start kicking in. Russ: The word I liked that you used, that they would use, is 'prioritizing.' Or 'triage' would be another word, where I can't do everything, so I have to make a choice. And to hear them tell it, naturally they err on the side of the troops: If I have to make a prioritizing decision, I'm going to make sure they get the hot showers; I'm going to make sure that the lieutenants are still in place. But that of course leads to potentially destructive decisions in other situations, where rationalization then is letting people do what's in their own self-interest but masquerading as if it's for the good of the group. Guest: Right. Exactly. The downside--because when you first hear that, that all just makes total sense and we all just say to ourselves, 'Well, that makes sense, that you deal with it way. But what really happens-- Russ: And you could argue that they did the right thing. Guest: Exactly. Yeah. Yeah. Wouldn't anyone do that? Russ: They faced the dilemma. Guest: But when you really think about it--and what's nice is that, in the discussions that came up, they said, 'Well, we understand that, but here's what's going on.' 
I didn't come up with these, but they pointed out that when you start allowing that type of reasoning, that type of ethical fading, the question becomes then: who decides what is right and wrong? Because in some units, they would say--we have something in the army where if you have a negligent discharge--that's where your weapon goes off inadvertently in a combat situation, like when you are back on the base and you are not supposed to have rounds in the chamber and it goes off--some units would say: The rule is, report that. Because we can't have that happening. Some officers said, 'Well, we wouldn't report it. We would just cover it up.' Other officers would say, 'You've got to be kidding me. That's a breach of discipline. You have to report that. If you don't report that, something is seriously wrong.' And what that shows is, as soon as you start allowing these types of ethical fading decisions, the question becomes who decides which ones are allowed and which ones aren't. Because everyone becomes an expert then. Everyone gets to judge. And then there is no standard. The other thing, it literally puts you on that slippery slope--is that it suddenly becomes it's an example that other people will use to justify their decisions to not tell the truth. So then it heads down the slippery slope. And a third one is exactly what you pointed out: It masks those people who are making unethical decisions because of self-interest. Because you say, what's the difference between a person that did that for themselves because they wanted to look good, or did it for the troops or did it for the mission? But then looking at the big macro level, what it really does is it's the seed that starts feeding this hypocrisy in the profession, where we think we are so ethical, but in our day-to-day dealings we fall victim to this ethical fading. And that hypocrisy, we can't feed it because it just creates this false image of ourselves where we deceive ourselves. Russ: Yeah. I want to come back to that later because I think there's a whole bunch of interesting issues here related to how culture emerges in an organization like this, of this magnitude and size and tradition. There are a lot of currents running underneath the surface that are going to be affected by these changes. I just want to go back to the negligent discharge example for a second. So the people who would say, 'Oh, I covered that up'--they would justify it, I assume, by saying, 'I don't want this person below me to be punished.' Guest: Exactly. Russ: It was just bad luck, it was just an accident. Guest: It was an honest mistake. Russ: But it also, I assume, reflects badly on the officer when there's numerous negligent discharges under his command. So as a result, his own self-interest there is coming into play. Guest: It could be. Oftentimes it's not. I talked to some people saying the unit that was replacing them came in and one of them had a negligent discharge; but they said, 'You know what? We don't want to get these guys in trouble and they haven't even started yet. So let's not report it.' Russ: Yeah. Well, that's dangerous, obviously.
18:11 Russ: Let's talk about storyboarding for a minute. Explain what storyboarding is for those of us outside the military. Guest: A storyboard is--it used to be that when you came back from a mission, you would debrief somebody so they could add it to the files of intelligence. So every time the next person would go out, they wouldn't go out cold. They would know something about it. So that's evolved into using PowerPoint to show graphically what happened, to put in pictures, to put in narratives. So to create a storyboard of this is what happened when we went out the wire of the base, and this is who we ran into, and then this happened and this happened. And it's required in Iraq and Afghanistan for leaders who would go off base. And so what happened, though, is that started becoming an administrative burden. And so you'd have leaders going out and nothing significant in their mind would happen. And yet when they came back, they said, 'You need to create a storyboard.' And it would take up their time, and so there was a tendency for some officers to say, 'Okay, this burden is going to keep me from preparing for my next patrol. So, I'm going to use information I collected from the last storyboard I created.' Or 'I'm just not going to tell them about what happened, the insignificant things on this patrol that happened.' And so storyboarding became very--it's really a visceral thing to a lot of officers because a lot of people know exactly what I'm talking about when I say, 'So tell me about storyboards.' They know either-- Russ: They look down at their shoes. Guest: Yeah. Or others would say, 'Yeah, that's crazy that some people didn't tell the truth on those.' Because some people said, 'No, those storyboards must be--it's critical those things must be honest, because we're talking about intelligence, we're talking about--you know.' But the other people, it's like, 'There was such an admin burden and nobody cared. Who knew where they went because I never heard any feedback whether they were good or bad.' So, it was a good example of some people who said you'd never tell a lie with a storyboard and other people saying 'They don't go anywhere, don't worry about it.' Russ: It reminds me of a friend of mine in the sales business who would complain about his superior; he would always, he'd have to write up every sales call, every sales report. And he says, 'I'm trying to make sales, and I've got this burden of paperwork to keep me from doing more business.' So, I don't know how honest some of those reports were, but what strikes me about the storyboarding example in the military is, you know, when you talk about the fog of war, and you read about any classic battle--I always think about Lee at Gettysburg, where he only has the vaguest idea of what's going on. His cavalry is--what's his name--J.E.B. Stuart has gone off somewhere. That's his sort of social network and he's been stripped of that. So he's trying to piece together from all kinds of noise--he's trying to clear out the noise, trying to figure out what's actually going on in the battle. And similarly when we're trying to evaluate how something is going in this complex situation, say, in Iraq or Afghanistan, this is the fundamental information on the ground. It's what Friedrich Hayek called particulars of time and space and place. And it's very valuable. But of course if it's not accurate, and worse, if it's being cut and pasted from previous episodes, it's-- Guest: Exactly right. 
Russ: I've got to read this quote from the paper, from one of the participants, who's justifying being dishonest on this. He says:
Where do the story boards go? They're going to [a] magic storyboard heaven somewhere where there are billions of storyboards that are collected or logged somehow? After doing hundreds of storyboards, I honestly can't tell you where any of them go. I send them to my battalion level element who does something with them who then sends them to some other element who eventually puts them on a screen in front of somebody who then prints them out and shreds them? I don't know.
So that's the--down on the ground, that's his justification for: what's the point? It ends up in no-man's-land. There's no reason to be accurate. Guest: Right. And so the distance from the consequence of being unethical is so far that they don't see it as an ethical decision. And that contributes to the ethical fading. An important thing to note here, though, is that all these requirements to do something, they are all well-intentioned. Somebody someplace said, 'You know, we have a potential problem we need to solve, and the way we could do that is by creating a requirement.' And the problem is that we have so many people doing that, and no one saying, 'Stop. There's no appetite at the present for these good ideas,' that people down on the bottom, their shoulders start aching from carrying so many of these requirements. But they are all well-intentioned. There is no evil person, no dishonest person, no person with mal-intent trying to create a culture that does this. That's the irony. Russ: Yeah, and I have some contact--I just want to say, I spent a day at West Point talking to the Econ professors there and teaching a few classes, and I was just struck by how incredibly devoted the faculty was to the students' learning something. I wish I could say it was commonplace in the non-military institutions I've been in. It's a rarity, unfortunately. And they cared deeply about the outcomes. And I'm sure the people who have put many of these procedures in place, there's a good reason for every one of them. But there's not necessarily someone who is wondering about the entire, the universe of burdens that are being put there. Guest: Right. We forget that we are dealing with humans. Russ: That's [?] important I think to remember, is that, I think most people assume when regulations are put in place, whether it's the Environmental Protection Agency or the Occupation, Safety, and Health Act, OSHA, or any kind of law, any kind of legislation, that regulates behavior either as consumers or sellers--everybody, strangely to me, assumes that they are enforced 100%. And complied with. And one of the valuable lessons here is just to keep in mind that, you know, you mentioned earlier it's something of a microcosm for societies as a whole. There's no doubt that most of us are breaking some kind of law at some point in our lives. We're having this conversation on April 15th, which is Tax Day. Guest: That's a good point. Russ: And there have been studies--I don't know if they are done for humor or seriously, but I'm not surprised if they are true, serious reasons--where they ask tax experts to fill out somebody's tax return and maybe 8 people, and 7 of them will come up with totally different answers for that 1 set of tax facts. Guest: Right. Russ: Because the law is vague. And we're not talking about dishonesty. We're not talking about compliance, honesty. We're just talking about complexity. And I assume that's part of the story, too. It's not just: This is a pain in the neck; I'm just going to make this up. There are probably times when it's not obvious what the answer is, and they just check a box because you've got to check a box. Guest: Right. But even there, you are heading into rationalizing. Because it's so complex. And sometimes, though, it's not as complex as we want to say it is. We just say it's so complex. And the fastest way through it is just to check a box. 
I think a more blatant example that is happening just today is, if you look at Atlanta, we have teachers there that felt the pressure that you need to turn these schools around. And so they convinced themselves that maybe the best way to do that is through cheating. That's an exact example of well-intentioned rules, requirements, good people, and yet what happens when their ethics go unchecked. Because they convince themselves that they are not doing anything wrong. Russ: Yeah. They are doing it for the children. Guest: Yeah. Russ: Really is very dangerous.
26:37 Russ: I just want to read one more quote, which is really special. This is, again, where after some conversation people would concede that something wasn't quite right. And here's the quote:
Likewise, most former battalion commanders admitted that, in their roles as data receivers, many of the slides briefed to them showing 100 percent compliance or the responses given them for information requests were probably too optimistic or inaccurate. For example, one colonel described how his brigade commander needed to turn in his situation report on Friday, forcing the battalions to do theirs on Thursday, and therefore the companies submitted their data on Wednesday--necessitating the companies to describe events that had not even occurred yet. The end result was that, while the companies gave it their best shot, everyone including the battalion commander knew that the company reports were not accurate.
Guest: Yeah. And that's actually not that unusual. Because if it's due on Friday, everyone backs it off a day. And it just shows how quickly we could say, 'Well, that's a dumb requirement and so you don't need to be truthful in it.' The rationalizations come very quickly on that one. We also went to the Pentagon and talked to the receivers of information there. Now, there's people who got the reports and said, 'How much do you believe the reports?' And there they said, 'Well, you know they gave it their best shot but we know it's not always true.' And so we ask them and said, 'Well, how do you know that?' And they said, 'We used to be there. We were not born yesterday. We used to be there. We used to do the same thing.' And that's what became obviously. Like, what have we created? This façade of, I'll tell you the truth, I'll tell you what I think you need, you want me to say; I'll tell you the truth on the other things, but other things I think I'll fudge, I'll massage, I'll hand-wave. And then the people receiving it say 'I know that you did that, because I used to do that.' But we all go on our own ways, and the briefings happen, and everyone goes away happy from the meetings saying everything looks fine. Russ: I was reminded of the former Soviet Union where the joke was: We pretend to work and they pretend to pay us. But the real analogy there would be--and I've talked to people who lived in that society, in that culture. And there'd be a factory. And everybody understood, all the way up the line, that the output of that factory was not accurate as was listed in the reports. So the people on the line would be asked: 'How many nylon stockings did you create this week?' So, they'd lie, because they had to meet their quota. And they'd tell that to the foreman or the manager, who would pass it on up to the factory, who would then claim a certain number of units had been produced across the whole factory. That person knew it was a lie; that person would pass it up the chain to the Commissar of Stockings or whatever the person was. And everybody in the system understood that it was a lie. And it just kept going. Guest: But you know, at least in my research that I did here, nobody wants to call it a 'lie,' though. Because as soon as you bring up the word 'lie,' then suddenly the ethical hackles go up. But if you call it something else--'our best guess' or 'what they wanted' or 'feeding the beast'--then that's more acceptable. Because you talk to any army officer and say, 'Do you lie?' 'Of course we don't lie.' A more common example of this whole thing is, we do our officer evaluations. Usually officers get rated once a year. Russ: By whom? Who does the evaluating? Guest: Usually the rater, and then a senior rater. So, it's the boss, and then the boss's boss. But before the rating period starts, you are supposed to sit down with the rated person and say, 'Okay, let's go over your goals and what we expect of you.' And then you are supposed to sit down with that rated person every quarter. So every three months they are supposed to sit down. Well, that's really hard to do. And so what happens is, is that thousands of forms get turned in; and you have to initial every time you got counseled, from the beginning and each quarter. But nobody ever gets counseled, and yet initials get put there, signatures get put there, and dates even get put in--when I counseled the person. And it's become routine that--just put in a date as long as it's not on a weekend. 
Russ: Because that would show that it's a lie. Guest: That it's really not true. But everything else is acceptable. And so it's a very strong example of how we just say, this is the way you do it. Again, it's not because people are looking for self-advancement. It's just because to get it through the system you have to fill in these boxes and initial.
31:27 Russ: So let me push back against the worries here. Let me suggest that maybe this is not as big a deal as you are making them. Really, what is the big deal? So the forms get filled out incorrectly. Is it really--it's a paperwork-heavy system, which is a tradition in the army that goes back a long time. It's always been that way. Obviously people do the best they can under pressure. What's the big deal? Does it really affect a decision that somebody makes in combat? Does it really affect how things get allocated? This is just a bunch of background noise. Guest: Right. And that was always the fear in the back of my mind, saying: Is this just the way things should be? This is the way things have always been, and is this the way things will always be, and is there just an acceptable level of dishonesty that doesn't hurt the profession? But then what I started realizing is there are several factors that are kicking in that make it so this is not acceptable. First, we already went through the other ones--who gets to decide what is honest or what is dishonest, or where do I get to lie and I don't get to lie. We don't ever talk about that; we don't ever talk about the slippery slope of how my lie is now used as justification for someone else's lie. Or the fact that it covers up self-interested lying. But what's going on in today's army is that two things are critical. One is that we are undergoing a downsizing. And when you undergo a downsizing, there's pressure on keeping the best. And that pressure is felt by the force. And so suddenly competition goes up: zero defects start kicking in. And people don't want to look bad compared to their peers. And so, what that does, is it accentuates this drive toward perfection, this drive toward not being totally up front with the way things really are and then sending this masquerade. The other thing that's kicking in is, the army is very digital. It's become digitized, so we use--it's so easy right now, if I say, 'How many people have done this?'--now get out your ID card, digitally sign it, and then send it back. And so that becomes very easy to get compliance. And so what you see is our system looks for that compliance verification very quickly and so now today's officers are deluged with sign this, sign that, comply, verify compliance with this. And so, the numbing is getting much more accelerated than in the past. In the past we didn't sign that many things. We weren't asked to verify so many things, because we didn't have the technology. You put in that technology being asked to verify so many things so fast along with the competition, it's a lot of pressure on the bottom officers to fall into 'Well, I'll just tell them what they want to hear.' It's not getting better. It's getting worse. Russ: One of the reasons I wanted to talk about this on EconTalk is, again, in any organization, whether it's in the government or a private organization, it's always a challenge of implementation and follow-through, and monitoring, and compliance. And I think it's very tempting to just assume it's not relevant. A friend of mine was telling me that there was a new curriculum put into her school for math, and then of course the people in her school wanted to evaluate whether it was successful or not, if test scores go up or down. They just assumed that once the new teachers--and old teachers--had been trained in the new curriculum, that would be the curriculum. But some of the teachers didn't like the new curriculum and they just didn't bother to teach it. 
They used their old techniques--with the new book, with the new exercises. But that wasn't what they were used to. They didn't like the new stuff, so they were just, 'Oh, come on.' It's like when the New Math came along, a lot of older teachers just said, 'This doesn't make sense to me. And my students--of course I'm doing it for the students. I'm not going to confuse them with these new techniques; I'm going to use the old ones.' So when you go to evaluate efficacy or effectiveness, this is a real issue: stuff does get digitized, and we get these, I think, overblown claims for the power of data. But what if the data aren't accurate? Forget the biggest issue that we talk about on here a lot, which is the complexity issue--it's still going to be difficult to disentangle causal effects, I don't care how much data you have. In fact the problem sometimes even gets worse because you have more data, more types of data, and it's very difficult to know what's important and what you can ignore. But the other issue is, are the data accurate or not? And if they're systematically biased, of course you are not going to get accurate measures. You get people trumpeting the success of the program: 'Oh, we put this new thing in and we have 100% compliance.' But if it's not accurate, you are not going to measure what you think you are measuring. Guest: That's exactly it--the indicators not only aren't reflecting what is really going on, but are they even measuring what you think is going on anyway? And so one of the recommendations--we have three recommendations in the study. And the final one is that you need to lead truthfully. And that means, at the highest levels, maybe you can't rely on those statistics, you can't rely on those indicators, you can't rely on the briefing charts. Or you might have to say, we'd like 100% but maybe we'll be satisfied with 85%. But it also means that maybe you can't measure everything. Maybe sampling is the answer. Take a sample and see what's going on in the forest. Or audit. Or, like they used to do in the old days: Send down a leader. Have the leader go down and observe what's going on. But instead of relying on 'tell me that everything's fine'--sort of like when your kids come in and you ask, 'Did you clean your room?' Well, what do you want the kid to say? Another thing could be to go look in his room and see if he cleaned it. We need to return to the role of the leader. What we've done is we've allowed the system to substitute for leadership. But the role of the leader is to go down and check. The role of the leader is to make sure things are okay and not rely on indicators so much. But it also requires people on the ground, at the bottom, to tell the truth, and to say, 'You know what? I couldn't do 100%. I only got 89%. But I'm telling you the truth right now, and this is where it is.' We need the courage from the top, we need the courage from the bottom, to tell the truth. Russ: Let's talk about all three recommendations. So they are: Acknowledge the problem, exercise restraint, and lead truthfully. Let's talk about them. Guest: So, acknowledge the problem. This is a very hard thing to talk about, because none of us wants to say, 'Yeah, I don't always answer truthfully,' or 'Yes, I've been content with a lower standard because I just don't think that downloading songs from YouTube is unethical.' We convince ourselves. But if we acknowledge the problem--that I do that and have condoned it--then we start to address it.
So the first recommendation is: Look, we have to talk about this. As a profession in the army, we are just like our society, and we need to talk about how we've lowered our standards within our profession here. The second thing is: Exercise restraint. And this is at the top levels: every time there's a concern, high-level leaders cannot create another requirement for those throughout the force--another kind of training to undergo, another online course to take, another form to fill out--even though they are all well intentioned. We need to exercise restraint and start pulling things off the plate instead of continuing to pile things onto it. So that's 'exercise restraint.' And then leading truthfully--that's, at the top again, you need to be content with less than 100% all the time; and at the bottom, be willing to tell the truth and suffer the consequences that might come from saying, 'I didn't get 100% on that.'
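To make the sampling point concrete: the sketch below is illustrative only--it is not from Wong's study, and every number in it is hypothetical. It shows, in miniature, why a small random audit can tell a leader more than a universal self-reported sign-off.

```python
# Illustrative sketch only -- not from Wong's study; all numbers are hypothetical.
# Compares self-reported compliance (inflated under pressure to report "green")
# with what a leader would learn from auditing a small random sample of units.
import random

random.seed(0)

N_UNITS = 1000          # hypothetical number of reporting units
TRUE_COMPLIANCE = 0.62  # assumed true fraction that actually completed the requirement
AUDIT_SIZE = 50         # units a leader actually goes and checks

# Ground truth: did each unit really complete the requirement?
actually_done = [random.random() < TRUE_COMPLIANCE for _ in range(N_UNITS)]

# Self-report: units that did it report "done"; most that didn't sign off anyway.
self_report = [done or random.random() < 0.95 for done in actually_done]

# Audit: inspect a random sample instead of trusting the roll-up.
audited = random.sample(range(N_UNITS), AUDIT_SIZE)
audit_estimate = sum(actually_done[i] for i in audited) / AUDIT_SIZE

print(f"Self-reported compliance: {sum(self_report) / N_UNITS:.0%}")
print(f"Audit-sample estimate:    {audit_estimate:.0%} (true rate: {TRUE_COMPLIANCE:.0%})")
```

Run as written, the self-reported figure comes out near 98% regardless of the true rate, while the 50-unit spot check lands in the neighborhood of the actual 62%, within ordinary sampling error--which is the sense in which 'maybe sampling is the answer.'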
39:30 Russ: So, I found out about your paper from a member of the military who thought I would be interested in it but wasn't sure if it was a good topic for EconTalk. And yet, as listeners can tell, I'm fascinated by what you found. And it ties into so many different aspects of topics that we deal with here. Just an unusual application. One of those is how culture emerges. It's not something that we steer or create from the top down. It emerges from the bottom up. And you talk a lot in passing in the paper about just how those incentives are there for people to comply with a culture that is not fully honest and is actively dishonest at times. But the flip side of that is that changing a culture is very difficult. And once it has become standard operating procedure to fudge or explicitly lie about what's going on--well, just to take your recommendations, those are not going to fly by themselves. If you sent this out to every officer in the army, they are not all going to say, 'Yeah, I'm going to change. This is a mistake.' It might have some effect--I'm not saying it's a waste of time. But the question of how you get from here to there, how you get back some of that honesty and trust, is a very challenging one. And I wondered if you've thought about it. Guest: That's exactly right. Because you have to be up front with people. Because everyone has this drive to not be the one who stands up and says, 'All right, I'm going to tell the truth,' while the other 9 out of 10 people don't say anything. It requires everybody to jump on board. And so that was a big concern--that maybe nothing will happen, because the culture of appearing to be the person who complied with everything is so strong that we'll never change. But I've been extremely encouraged that this study has been embraced by the highest level of leadership in the army, down to the bottom. And that's what's required: it can't be just the top saying, 'Okay, we need to change,' with theories-in-use and theories espoused--we can't do that any more. And I was concerned that just the bottom would do it, but nothing would change at the top, and then you'd have a lot of casualties. But it doesn't seem to be that situation right now. What it seems to be is that it's being embraced by everybody. Everyone's working together saying, 'Okay, look, we can't let our profession go down this path. We need to take care of it now.' So I'm more encouraged. Russ: It seems to me the challenge here is there are two cultures side by side, one of which is, I think, in the head of an officer, 'well, there's my real leadership, which is what I do on the actual mission, what I do when I'm actually training my troops on how to deal with their weapons.' These are life-and-death issues; you don't fool around with those. Then there's the, 'I've got to keep the boss happy, I've got to fill out the paperwork. And sure, on that, I'm not going to really do it as well as I should, but I can't and it doesn't matter and it's not important'--all the rationalizations we've talked about. One of the things that intrigues me is this interaction between these two cultures. So you have, on the one hand, this culture of trust and integrity which has to be rock solid or the whole thing falls apart. And that's the one where, in the heat of battle, when there are life-and-death issues, there has to be 100% compliance, as much as is humanly possible. There are going to be errors, of course.
But the last thing you want in that situation is strategic negligence, strategic dishonesty. There has to be total honesty. It doesn't mean people don't make mistakes in the heat of battle; of course they do. But there has to be a belief that the people who are leading you are doing so with the highest level of intention of doing it correctly, and the people who are following are doing the same thing. And yet, at the same time, side by side, you have this other--you've got to compartmentalize it, it seems to me, as an officer. You've got to say, 'This is different. That's not me.' Guest: That's right. Russ: That's the part when they-- Guest: That's exactly it. I refer to that in the study: there's an alternate reality. And the army has two competing identities. One is the profession. That's the one you are talking about right there, about the trust--this is [?] of the troops. The other one is, the army is also a bureaucracy. And that is: we need the control, we need to make sure these things happen. And those two identities often clash. The problem is that we live most of our time in the bureaucracy. That's where your thought patterns and your culture get developed. And we kid ourselves if we think that when we shift over to combat, suddenly everything you've been living with--all the ethical fading and the rationalization--goes away, and suddenly I'm a different person living in this other identity of the army as a profession. So what this says is: we understand that what we need is the trust, but we need to extend that trust to all of the identities that we have. That, you know what, when we are not in combat, we don't want to be making ethical decisions that are wrong. We need to start exercising the same persona across all spectrums, all identities of this army. Russ: Well, Adam Smith in The Theory of Moral Sentiments wrote that 'Man naturally desires not only to be loved, but to be lovely.' We want people to respect us. We want them to honor us; we want to have a good reputation. And we want to be lovely--we want to be praiseworthy, we want to be worthy of respect, worthy of honor, worthy of the reputation of being a person of integrity. So it strikes me that in your initial interactions with your officers, when you confront them with the possibility that they're dishonest, of course they are going to respond by saying, 'Of course I'm not.' Because their self-image, which comes from that other side, the profession side-- Guest: Exactly-- Russ: has to overwhelmingly always be that 'I'm a person of integrity because I have the lives of 150 people in my hands.' And I wonder about how you push the honorable side--and people will self-deceive, relentlessly, as you saw. They will tell you that they didn't do anything wrong, that they had to do it, that 'I didn't have a choice.' And as you point out, they are in a tough situation. They in some sense really didn't have a choice: they had to at least--they couldn't do all the training. The question is whether they are going to lie about whether they did the training or not, and the answer is, sometimes they will. So the question is, how do you push loveliness into both parts of the experience rather than hoping it can be compartmentalized or bound up with self-deception?
And it strikes me--and I've talked to some members of the military about this in other areas, where they are struggling with compliance of various kinds--that one thing to lean on is what Adam Smith calls the impartial spectator: What would somebody say looking over my shoulder who wasn't me, who didn't have a stake in the matter? And I think that persona, that strategy, is what keeps the professional side of the army at the highest level. It's the understanding that there's a code of honor; there is a code of integrity. And I don't know if it's possible to keep them compartmentalized that way. And I think obviously you are trying to do something about that. Guest: Right. And I think we started our conversation with this--as I tried to point out, the whole army profession is based on a foundation of trust: trust within it, but also the trust of American society. And it will not be a profession if we lose the trust of American society. And so we can't ignore the fact that there's an erosion of that foundation on this dark side that none of us wants to talk about: this part of the army, the bureaucracy part, the getting-things-done part, the way to do business, letting things slip with the little white lies. We can't allow that to keep eroding the foundation. That's what we see--it's eroding. It's not like a giant crack in this foundation, but it's a slow erosion. And you know, if we come to grips with it, if we say to ourselves that we are not perfect, we are humans, we are susceptible to temptation--we are kidding ourselves if we don't think that--then I think the ethical foundation gets solidified, and we remove the compartmentalization that you are talking about. Because compartmentalization allows us to avoid the glare of the ethical spotlight. Because we push it off into that little dark box and say, 'ethics does not apply in that world,' and we have to say, 'yes, it does. Everywhere.' Russ: And another way to see this--and this is very true in capitalism generally--is, you want people... We had an interview with David Rose on this. He emphasizes the importance of not exploiting every opportunity. We want to live in a world where the people we are dealing with are not going to take advantage of us even if they can. We understand there's a temptation to. But I think the risk of the compartmentalization is that where you draw the boundary is going to change over time. Guest: Or by individual. Russ: Or by individual. They are going to have different places where they draw it: they are going to say, 'This is okay because everybody does it,' or 'this is okay because we are doing it for the troops,' when in fact-- Guest: Or we might be kidding ourselves. Russ: Yeah, exactly. And so it's much better to go in the other direction and relentlessly avoid dishonesty and be a whole person. Right? Guest: Right. Exactly right. Russ: You talk about hypocrisy: the real danger of hypocrisy is you fool yourself into thinking there's none. And then where that boundary is, is very malleable. Guest: And even--integrity means being one, being one person. And that's all this is: get rid of the compartmentalization. Be one person; and that's an ethical person. And so--but you know, we've convinced ourselves, because we listen to a lot of our professional self-talk about how we are above it. We should strive for that. We are above, but we are still human.
49:57 Russ: Now, there've been a number of scandals in the military in the last 5 or 10 years. Do you have any reason to think that there are more than there used to be? And I'm thinking, in general--and you concede this at one point in the paper--mandatory requirements that are impossible to comply with are an old story. It goes back a long time in the military. In some sense, there's nothing new under the sun. Some of your findings, maybe all of them, are the same as they might have been 50 or 100 years ago. Do you have any reason to think that it's getting worse, and that that has in turn led to some of the behavior that we see that is inconsistent with the army's self-image, people's image of themselves? Guest: It's hard to say if it's gotten worse. Because the army hasn't stayed the same. What we see with the all-volunteer army is a gradual shift towards a professional army. In other words, when you have a draft army, we are flat-out a reflection of society. But with a professional army, there is a conscious effort to police itself. And so, if it's gotten worse--I'm not sure that's the right word, but what's happened is we've allowed this erosion to occur, and we need a correction. We need a self-correction. And that's part of what professions do: they correct themselves, as opposed to somebody else coming in and saying, 'You guys are out of control.' And so all this study was, was an attempt to say, 'Look, as a profession we need to self-correct.' It's really hard to say whether it's worse than before. But in my mind I think there's been a steady increase in this erosion, and we're due for a self-correction. Russ: But do you think the reporting requirements are worse than they were 20 years ago? Guest: Yes. Because, first of all, the army is a cumulative organization. We never take anything off. In other words, even if the requirements aren't needed any more in a specific area, it's very difficult for the army to say, 'Let's get rid of that requirement.' The other reason is technology--it's so much easier now to verify through digitization than in the old days, when we had rosters or face-to-face contact. And so, you put those two things together and there are a lot more compliance-oriented issues going on in the army today. Russ: Yeah. The cost of communication has gone way down, right? You don't have to send a letter, you don't have to--you can send an email now. Which is so much easier. Guest: Right. All it takes is a staff officer sitting someplace saying, 'How many people did this?' And it becomes a requirement. So, it's so much easier to create requirements today than it used to be.
52:48 Russ: So, let's imagine two solutions here. Solution Number One is you tell all officers, 'You only have to comply with 80% of the stuff that you are told to do, and you figure out which is the most important 80%.' Or, the people at the top could say, 'Boy, we've really overburdened these folks. Let's cut the requirements down to 80% of what they are'--which seems like the better system, the better solution: to have the people at the top who are using the data say, 'You know, maybe we've asked for too much stuff; let's cut back.' Would there be any vague possibility that there would be a consensus about what could be gotten rid of? For example, if you say to me, 'The government spends too much money; you've got to cut 20%,' I could do that relatively easily, in ways that would be politically impossible but I think not so bad for the nation as a whole. I know what I could get rid of. That 20% would be easy; 50% is a little more challenging. But 20% I could do. Do you think the people at the top, the Pentagon and elsewhere, could figure out some of the requirements--you say they never take anything away. But they could[?]-- Guest: It's time. Russ: It's time-- Guest: [?] a way-- Russ: Would there be some consensus? Guest: Can it be done? It's possible. The encouraging thing is they are starting to do that now. In other words, I've had people from the Pentagon contact me saying, 'Look, back when you did your study 10 years ago, how did you come up with so many requirements and what methodology did you use?'--and I had to give that to them. So, right now there are people sitting down saying, 'What have we created? What have we put on the backs of company commanders? When you look at the totality of all the requirements we put on people, what is it?' So, they are trying to get a handle on it. Down at the bottom there are discussions going on everywhere about whether we can tell the truth on what we actually have complied with. And so you put the two of those together--the people at the top reducing the burden, the people at the bottom saying, 'Look, I'm going to tell you the truth. You might not like it, but I'm telling you the truth; that's more important.' You put those two together, and I think you'll start to hear cracks in the culture. Russ: Did some of your sessions include people from the bottom and the top? Guest: No. Russ: Because that could be an interesting-- Guest: I'll tell you what: If you want a silent session, yeah, put the bosses in there with the people. Who's going to say, in front of the bosses, 'Yeah, all the briefings we gave, we always gave it a green; it was really red'? Nobody would say that. Russ: But what I could imagine is a session with the bosses and the officers on what would be the most important things that they could honestly respond to. Guest: Yes. Right. Russ: So it seems to me one way to change the culture would be to convene a host of these where people would talk--maybe they have to wear masks. And I'm not kidding. Guest: Yeah, you could do it. Maybe online. Russ: Online. And say, 'These are the things I never tell the truth about, because of this, this, and this. Here's what I'd be willing to tell the truth about if you changed the way you did it.' Guest: Right. Right. That would be a way. But you know, there will never be a consensus. And that's what--the nice thing about the army is that it's built on leaders. It's going to require a leader to make a decision. Because you'll never get a consensus. So, somebody-- Russ: Right. Somebody's got to take a stand.
Guest: Exactly. Somebody's got to take a stand and say, 'I've got to risk my political capital by telling this political appointee--who said, "We've really got a problem with this in our society; you need to put this kind of training in the army"--that we'll do that next year. Or we'll do it every other year. Because right now we can't; it's not a good time.' Russ: It's never a good time. Guest: Right. And so it's going to require someone to expend some political capital.
56:36 Russ: So, you say it's been well received. Guest: Not from the very beginning. From the very beginning, I got a lot of pushback. Because people just read the headlines, where it said 'Army Officers Lie.' Russ: When you say a lot of pushback, is that 4 emails or 40? Guest: A lot of, 'Hey, you guys have really created a firestorm.' And so, not 40 emails. But enough. But then when people started reading the actual study, things changed. And then gradually the senior officers started pushing the study. And then things really changed. Russ: Talk about the U.S. Army War College. What is it exactly? It's not--is it a place where you go to study? To get a degree? Guest: Yeah. I'll tell you. In this profession, you have people entering as officers. They come in after graduating from college, ROTC (Reserve Officers' Training Corps), or West Point. And they go to a basic course that teaches them how to be an officer in whatever specialty. That usually lasts about 3 and a half months. Or actually--yeah, three and a half months. And then they move on from there; after about 5 years, they go to another course, usually about 6 months, that gets them ready for the next level of leadership. Then, when they hit about 10 years, they go to a year-long school at Fort Leavenworth, Kansas, that gets them ready for mid-level management. By the time they get here, they are 40-year-olds; they are lieutenant colonels and colonels, and they are ready to move on to the strategic level of leadership. And so, they get a Master's Degree here. And what it prepares them to do is start looking not just at military but also diplomatic, informational, and economic instruments of power, and national security issues. Russ: And what do you teach there? Guest: I'm a researcher. So I supplement the teaching here by looking at strategic issues. Russ: And do you have interaction with the students? Guest: Yes. I have interaction with the students. I give talks; and they come in and we bat things around. I'm always here to support them. But my primary role here is research professor. But I do a lot of talks. Russ: And how many faculty of either kind, research or teaching, are there? Guest: Let's see. For research, we've got about a dozen. And they cover everything from our Asia specialist to people who look at strategy. And on the teaching faculty, I don't exactly have a handle on it. I think it's like 250. Russ: How old is it? Guest: This is the second-oldest continuously manned post in the Army. The first one's West Point. So, Carlisle Barracks, Pennsylvania has been here for a while. The actual War College--you see early stages of it going back to the mid-1800s, but I think if you really look at it, it goes back to the 1900s; it started in Washington, D.C. and moved here. Russ: Do you ever bring in people from other armies outside the United States for training? Do they come there? Guest: I think right now we have 60 international officers in the class. Russ: Wow. Guest: So, we have a lot of international officers here. Russ: And your career path was--you were an officer. Guest: I graduated from West Point, and was in the Army for 20 years; and then I became a research professor here. Russ: And you went out and got a Ph.D. Guest: The Army sent me for a Ph.D. so I could teach leadership at West Point. Russ: Wow. And what's in your future? I'm glad you have one. When somebody sent me the paper, I thought, 'Well, this can't be online.'
The first thing I did when I received the paper was to google it to see if I could actually access it legally. Because I just sort of assumed that nobody would publish this publicly. Guest: No, that's [?]-- Russ: That's somewhat embarrassing. Guest: People are--I guess we are a profession. People are surprised that we are willing to police ourselves. And this is one of those--it was essentially a call that we need a self-correction. And that we don't need Congress to tell us to do this. We need to do it ourselves. And that's what professions do. So, this is just part of the army being a profession. It wasn't an underground publication. It was published by the Army War College. It wasn't a secret. It was out there. Russ: And we will put a link up to it, and hope it stays live for a while. What are you interested in next? Is this something you are going to stick with for a while? Guest: No, you know, I'm not an ethics person. I study organizations; my background is organizational behavior. So that's--you probably could hear that from my discussion. I look at the army as an organization, the army as an institution. What topic do I pick up next? Russ, if you've got a good one, just let me know. I'm always looking for what angle I'm coming in on next. I focus a lot on developing leaders, on whether we are producing the right type of leaders, on how we are interacting with society, and things like that.