It’s easy to assume that Jamie Dimon, the accomplished CEO of JPMorgan Chase, was simply stonewalling when he claimed that mounting trading losses in his bank were “blown out of proportion” — that he was unaware of how grave the situation really was. But I believe that Dimon literally didn’t perceive disaster unfolding before him. Consider that after failing to notice losses that would eventually mount from $2 billion to $5 billion, Dimon reportedly confided to his wife that he had “missed something bad.”
To understand Dimon’s blindness, let’s look at a quick history of the trading debacle.
In 2005, Dimon hired Ina Drew to head the company’s Chief Investment Office, the unit responsible for the bank’s risk exposure. In 2011, the company dropped its requirement to exit investment positions when losses exceeded $20 million. It appears that Dimon was not aware of this change, and had paid less and less attention to the unit after watching a stream of large profits roll in.
On April 4 of this year, Dimon read a short article in the Wall Street Journal about a JPMorgan trader in London, Bruno Iksil, who was making massive bets that exposed the bank to high levels of risk. Dimon likely would not have approved of the bets if he had known they were occurring.
At a meeting on April 8, Drew assured Dimon and the operating committee of JPMorgan that the trades were being well managed and would work out. She claimed the Wall Street Journal story was “blown out of proportion,” which led Dimon to refer publicly to the trades as a “complete tempest in a teapot.”
Yet large losses started to mount as a result of Iksil’s trades, and Dimon grew angry at himself “for failing to detect the group’s exposure,” according to the Journal. As the losses piled up, Drew’s group provided only summaries of the trades.
Finally, on April 30, Dimon demanded to see the specific trading positions. Upon viewing the entire set of complex transactions, he realized that a huge problem existed. As the second week of May began, Dimon realized, “The last thing I told the market — that it was a tempest in a teapot — was dead wrong,” the Journal reports. Dimon publicly disclosed the losses in a conference call on May 10, and he accepted Drew’s resignation soon after. JPMorgan Chase’s market value fell by $25 billion in the process — that is, by far more than the actual investment losses — and the firm’s reputation for integrity and smart management suffered a major blow.
Just four years earlier, in 2008, the European bank Société Générale realized that one of its traders had lost more than $7 billion through a series of fraudulent trades. In this case as well, the losses occurred without the necessary controls in place and without the awareness of senior executives.
Strikingly, the executives involved in these cases did not make the classic decision-making errors that have been so well documented in the fields of behavioral decision research, behavioral economics and behavioral finance. These fields focus on how decision makers fail to optimally integrate the available information to make a rational decision. Dimon and his colleagues, and the executives at Société Générale, committed a different kind of mistake: They failed to notice the absence of critical information. That is, they had enough hints that they should have realized important information was missing from the picture in front of them. In 2005, Professor Dolly Chugh of the Stern School of Business at New York University and I coined the term “bounded awareness” to refer to the systematic human tendency to fail to perceive and process important information that is easily available to us.
Many recent crises cannot be explained by the misuse of information, as has been so well described by leading books on decision making (notably, Dan Ariely’s 2008 book Predictably Irrational and Daniel Kahneman’s 2011 book Thinking, Fast and Slow). Rather, these crises occurred due to the failure of key professionals to notice important information in the first place.
As it turns out, this type of failure is quite common:
- Many NASA and Morton Thiokol managers failed to notice the obvious data suggesting it was too cold to launch the Space Shuttle Challenger in 1986.
- Many Arthur Andersen employees overlooked the fact that Enron’s financial reports were fraudulent.
- Many parties, from investors to regulators, failed to recognize that Bernard Madoff’s claimed returns were impossible.
- Many leaders in the Catholic Church and at Penn State turned a blind eye to the abuse suffered by children under their watch.
- And few people foresaw that the US housing market could collapse and trigger a global financial crisis.
Very smart people simply didn’t notice these brewing problems, despite evidence being readily available — but missing — in each of these cases. When organizations and systems appear to be performing well, when problems develop slowly over time, and when a variety of systematic lapses occur, even the best and the brightest simply do not notice gaps in information that would indicate a looming crisis.
I am currently writing a book, How Could I Miss That?, to be published by Simon and Schuster, which will examine the common failure to notice critical information due to bounded awareness. The book will document a decade of research showing that even successful people fail to notice the absence of critical and readily available information in their environment due to the human tendency to focus on a limited set of information. This work is still in its formative stages, and I welcome emails (mbazerman@hbs.edu) about how bounded awareness affects you and your organization and how you have created solutions to such problems.
NOTE: Originally published by Harvard Business Review in 2012. Copyright 2012 Harvard Business Review. All rights reserved. Reprinted by permission.
As Bazerman points out, many decisions are made without all of the facts. Even with a strong analytics framework to rely on, many executives and boards must still rely on reports or the information given to them by trusted senior managers. Read this blog post by Clark Abrahams for advice on getting the information that you need.