India's Most Influential Business and Economy Magazine - A Planman Media Initiative 
Cover Story
How to build responsible business leaders
It’s not just character. We also need to avoid ‘moral overconfidence.’
Issue Date - 19/01/2012
Whenever we see examples of leaders who suffer an ethical or moral lapse, our knee-jerk response is to say there was a flaw in their character, and that deep down, they were basically bad people. That view is part of our larger tendency to sort the world into good people, who are stable, enduringly strong, and blessed with positive character, and bad people, who are inherently weak, frail, or malicious.

In this way of thinking, character is an immutable trait. It’s largely formed during childhood and adolescence, with parents playing a key role. People who adhere to this worldview believe character is something hidden inside each of us, waiting to be revealed during adversity or through careful testing.

In my view, this is really the wrong way to think about it.

Instead, I look at character development in the same way we consider the development of intelligence, wisdom, or subject expertise – as a lifelong process. It’s also not as binary as some observers like to think. The world is not divided into “good people” and “bad people.” Most people have the potential to behave well or poorly, all depending on the particular context.

Most people also overestimate their strength of character – a problem I’ve come to think of as “moral overconfidence.” Consider two famous experiments that highlighted how situation and context affect moral behaviour.

The first is called the Good Samaritan Experiment, and it was conducted by Darley and Batson at Princeton University. They took 67 divinity students and divided them into groups to deliver short talks to other students. Half the students would talk about career prospects for divinity school graduates. The other half would discuss the parable of the Good Samaritan. The speakers were given maps marking the location of the classroom where they would give the talk. Some were told they were running late; some were told they were on schedule; some were told they had extra time.

On the way to the distant classroom, the students encountered a man lying in a doorway, eyes closed, groaning. The question: would these students stop to help the injured man, or keep walking? Remember, these weren’t just any students—they were divinity students, who aimed to spend their careers helping people. Also remember, half of them were about to deliver a speech on the Good Samaritan who stopped by the road to help an injured stranger—the exact scenario that they were encountering.

Overall, just over 40% of the students stopped to offer aid. Among those who were running late, just 10% stopped. Of those who were preparing to talk about the Good Samaritan parable, 53% helped the man. The lesson most people draw from this: even among a group of students whom you might expect to possess above-average measures of “character,” a high number behaved poorly when placed in the moderately stressful situation of running late for an appointment.

The second study is the so-called Milgram experiments, conducted at Yale University in the early 1960s. Subjects were placed in the role of a teacher who was asked to help a learner – really a paid actor – memorise sets of word pairs. Teacher and learner were placed in separate rooms, but they could hear each other via a speaker. Each time the learner made a mistake, the teacher was instructed to turn a dial and administer an electric shock to the learner; with each successive mistake, the voltage was increased.

In reality there was no shock being administered, but as the voltage went up, the actor began screaming in pain and banging on the wall. Many of the teachers expressed discomfort giving the shocks, but they were told it was an essential part of the process, and they should continue. Despite their misgivings, 65% of the subjects kept administering the fake shocks to the maximum 450-volt level. Milgram’s research, which was motivated in part by his desire to understand why so many lower-level soldiers helped implement the Nazi Holocaust, offers powerful evidence that obedience to authority – in this case, the person in charge of the experiment – is a powerful force. People will do abhorrent things if they’re directed to and made to feel like they’re just following orders.

While both the Good Samaritan and Milgram experiments highlight humans’ propensity to engage in poor behaviour in specific circumstances, it’s worth pointing out that in each experiment, a fair proportion of people did the right thing, even under pressure. The trouble is, people wildly overestimate their own strength of character, and far more people assume that they’d be in this “good” group than really would be. When I teach MBA students or executives about the Milgram experiments, I always ask them to raise their hands if they think they’d have stopped delivering shocks, despite the admonition to continue. At least two-thirds of the people say they’d have had the courage to stop – and in some groups, 80% believe they’d have behaved well. In reality, the experiments suggest that only one-third would behave responsibly.

Behavioural finance has taught us that most people are far too confident about their investing prowess, and the same is true in ethics: most people exhibit moral overconfidence; they overestimate their own strength of character in the face of pressure or temptation.

Neither the Good Samaritan nor the Milgram experiments dealt with business situations, but in fact managers put employees in pressure-filled situations like these all the time. If you give people powerful economic incentives to behave in a certain way – and if they can make a lot of money in the short run – it can be very hard to resist that temptation. Over the last generation, many organizations have adopted very large “pay for performance” incentive compensation plans, and these can push people to act in ways that actually go against their normal sense of morality.

It’s not just CEOs with multimillion dollar pay packages who face these temptations. In the early 1990s, Sears Auto Centers decided to pay commissions to mechanics and service advisors based on the number of specific repairs – such as brake jobs – they performed in a given month. It was a cliff-type incentive – either you hit a certain number and you received the bonus, or you didn’t hit the number and received nothing. Guess what happened? By the last week of the month, every customer who brought a car to Sears was told they needed a brake job. Since it’s hard for consumers to know if this is true – they’re not experts – the mechanics reached their targets and received their bonuses. Sears made millions of dollars of unnecessary repairs (which it then lost in fines when regulators eventually caught on).

©Copyright 2008, Planman Media Pvt. Ltd. An Arindam Chaudhuri Initiative. With Intellectual Support from IIPM & Malay Chaudhuri.