
What I learned from my first Oxford case study

January 31, 2019

Michael Zhuang is an EMBA student at Saïd Business School. He is also principal of MZ Capital Management, a wealth advisory firm in Washington, DC. Here he writes about his experience of his first case study at Oxford and offers an insight into the consequences of decision-making.

Yesterday I had my first Oxford case study. It was an epic fail, and I want to share what I learned from it.

The case study was a mock group decision under time pressure. The decision was whether or not to withdraw from a car race due to cold weather. The group was supposed to come to a decision within 45 minutes.

There was rather detailed information about the payoffs and drawbacks of various scenarios: joining the race and finishing in the money (racing lingo for finishing among the top five), joining the race and suffering an engine failure, and withdrawing from the race outright.

Engine failure was a particularly large risk, since it had occurred a number of times in previous races and the mechanic had a hunch that it was due to cold weather. However, a quick scan of the scatter plot showed that those incidents had happened rather randomly across the temperature spectrum.

When the group meeting started, I suggested that we take a quick poll after reading the case. To get the ball rolling, I volunteered my choice: I would absolutely join the race. Other group members came in either strongly or marginally in favour of joining the race.

Noticing that we had too much consensus, we assigned two members as devil's advocates to argue the case for not joining the race.

They went into a detailed probability analysis, calculating the odds of each outcome and the expected payoff. The result did not change any minds, and in the end we reached a consensus to join the race.
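
For readers who haven't seen this kind of exercise, here is a minimal sketch of what such an expected-payoff calculation looks like. The probabilities and dollar amounts below are invented placeholders, not the figures from the case.

```python
# Hypothetical odds and payoffs, invented purely for illustration:
# the case's actual figures were different.
P_IN_THE_MONEY = 0.4    # finish among the top five
P_OUT_OF_MONEY = 0.3    # finish, but outside the top five
P_ENGINE_FAILURE = 0.3  # engine blows during the race

PAYOFFS = {
    "in_the_money": 50_000,     # prize money and sponsor bonus
    "out_of_money": -10_000,    # entry fee and costs, no prize
    "engine_failure": -40_000,  # repairs plus lost sponsorship
}

ev_race = (P_IN_THE_MONEY * PAYOFFS["in_the_money"]
           + P_OUT_OF_MONEY * PAYOFFS["out_of_money"]
           + P_ENGINE_FAILURE * PAYOFFS["engine_failure"])
ev_withdraw = -5_000  # forfeited entry fee if we pull out

print(f"EV(join the race) = {ev_race:+,.0f}")  # +5,000 with these numbers
print(f"EV(withdraw)      = {ev_withdraw:+,.0f}")
```

With made-up numbers like these, joining the race comes out ahead; the point is only the mechanics of the calculation, not the conclusion.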

So far so good, right? Or so it seemed. Alas, had we joined the race, the chance of engine failure would have been near 100%, based on historical data. With all the smarts in our group, we came to the wrong decision. The reason is that, even though engine failures occurred across the temperature spectrum, when the temperature was under 65 degrees they happened 100% of the time, and as the temperature rose, the odds of failure diminished. Of course, this information was not provided to us initially, otherwise we would have arrived at the right decision. All we needed to do was ask the professor for it, and yet none of us thought of that.
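
In hindsight, the trap is easy to demonstrate. The short Python sketch below uses invented race records, not the case's actual data, to show how a failures-only view can look random across temperatures while the full record, successes included, shows a 100% failure rate below 65 degrees.

```python
# Hypothetical race records: (temperature in °F, engine_failed).
# These numbers are invented for illustration; the real case data differed.
races = [
    (53, True), (57, True), (58, True), (63, True),    # every cold race failed
    (66, True), (67, False), (68, False), (70, True),  # warm races: mixed
    (72, False), (75, True), (76, False), (79, False),
    (80, False), (81, False),
]

# What we looked at: the temperatures of failures only.
# They span the whole range, so failure "looks" unrelated to temperature.
failure_temps = sorted(t for t, failed in races if failed)
print("Failure temperatures:", failure_temps)

# What we should have asked for: the failure *rate* at each temperature,
# i.e. conditioning on all races, not just the ones that failed.
def failure_rate(records):
    failures = sum(1 for _, failed in records if failed)
    return failures / len(records)

cold = [r for r in races if r[0] < 65]
warm = [r for r in races if r[0] >= 65]
print(f"Failure rate below 65°F: {failure_rate(cold):.0%}")   # 100%
print(f"Failure rate at 65°F and above: {failure_rate(warm):.0%}")
```

The first view was essentially what our handout showed us; the second was one question to the professor away.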

This epic fail reveals several inherent biases in decision making:

Confirmation bias: I am an entrepreneur and, as a natural risk taker, I am not someone who would shrink from a race just because of the chance of failure. I really wanted to join the race, and all the information I saw just confirmed my inclination. I never looked for information that ran counter to it.

Accessibility bias: We tend to make decisions based on information that is readily available and to ignore information that is not. All we needed to do was ask for the missing data, and yet we were satisfied with whatever was provided to us.

Anchoring: Though my intention in getting the ball rolling was good, I shouldn't have volunteered my strong opinion first. That killed the chance of a more diverse discussion, especially since I was the oldest member and appeared to be the group's leader.

Herd mentality: Once a consensus is formed, it's very hard to be the renegade. I could sense that some of the younger members of the group just wanted to go along.

In my job, I deal with my wealth management clients’ biases day in and day out, so I am very familiar with everything on that list. And yet I still fell for them.

At the end of the class, the professor revealed that the case was based on a real-life decision made by NASA officials and engineers, one that eventually sent seven American astronauts to their deaths, on national television, in the space shuttle Challenger mission.

Yes! Our biases can have deadly consequences!