
Theories of the Financial Crisis: Misjudging Risk

The bankers – whose enormous salaries were earned on the strength of their skill at judging risk and making money – caused a financial cataclysm because they disastrously misjudged the riskiness of the complex financial instruments they created and sold.

This was the lesson I learned in the days immediately after the financial crisis – and it still explains a great deal of what happened. (Of course, there was also a good deal of outright fraud, along with perverse short-term incentives that let bankers profit exorbitantly from immediate gains regardless of how their investments turned out over the long term.)

Cognitive errors may have contributed to this misjudging of risk. Megan McArdle, for example, gave a compelling description of the different cognitive errors that contributed to the financial crisis – including the recency effect, which she describes:

People tend to overweight recent events in considering the probability of future events.  In 2001, I would have rated the risk of another big terrorist attack on the US in the next two years as pretty high.  Now I rate it as much lower.  Yet the probability of a major terrorist attack is not really very dependent on whether there has been a recent successful one; it’s much more dependent on things like the availability of suicidal terrorists, and their ability to formulate a clever plan.  My current assessment is not necessarily any more accurate than my 2001 assessment, but I nonetheless worry much less about terrorism than I did then.

These cognitive errors were so damaging because they were programmed into the risk-minimizing models that the “quants” created to divvy up mortgages and just about everything else that could be bought and sold.

Michael Osinski – one of the top “quants” – tried to claim some share of the blame in a recent New York magazine article. Osinski described how he had an inkling of the disaster ahead:

[T]he world I had helped create started falling apart. I hadn’t anticipated it, but at the same time, nothing about it surprised me.

Last month, my neighbor, a retired schoolteacher, offered to deliver my oysters into the city. He had lost half his savings, and his pension had been cut by 30 percent. The chain of events from my computer to this guy’s pension is lengthy and intricate. But it’s there, somewhere. Buried like a keel in the sand. If you dive deep enough, you’ll see it. To know that a dozen years of diligent work somehow soured, and instead of benefiting society unhinged it, is humbling. I was never a player, a big swinger. I was behind the scenes, inside the boxes. My hard work, in its time and place, merited a reward, but it also contributed to what has become a massive, ever-expanding failure.

Jordan Ellenberg described how these models that purported to minimize risk actually just compressed the risk into “one improbable but hideous situation,” in a manner similar to that of the 400-year-old sucker bet, the Martingale. For example, Wall Street bankers combined hundreds of mortgages into securities in the belief that while some of the mortgages might default, most would not. The more mortgages you combined, the safer the investment seemed – since only a small percentage of mortgages typically defaulted. Unless something went very wrong. Comparing Wall Street bets to the Martingale, Ellenberg described the bet Wall Street was making:

(0.99 × $100) + (0.01 × catastrophic outcome) = 0

In this estimate, Wall Street bankers treated the assets they traded collectively as worth $99 each – rather than the $50 each asset would be worth if judged individually.
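To make the arithmetic concrete, here is a minimal Python sketch – my own illustration, not Ellenberg’s – of how bundling works. The 1 percent default probability, $100 face value, and pool size are assumptions chosen to echo the equation above; the point is that a pool of independent mortgages and a pool whose mortgages fail together have the same expected value but very different worst cases.

```python
import random

# Illustrative assumptions only (not Ellenberg's actual figures): each mortgage
# pays $100 at face value and defaults, recovering nothing, with probability 1%.
FACE_VALUE = 100.0
DEFAULT_PROB = 0.01
N_MORTGAGES = 100
N_TRIALS = 50_000

def pool_payout(correlated: bool) -> float:
    """Payout of one security built from N_MORTGAGES mortgages, in one trial."""
    if correlated:
        # One shared shock: either every mortgage pays or every one defaults.
        defaults = N_MORTGAGES if random.random() < DEFAULT_PROB else 0
    else:
        # Independent defaults: each mortgage fails on its own 1% coin flip.
        defaults = sum(random.random() < DEFAULT_PROB for _ in range(N_MORTGAGES))
    return (N_MORTGAGES - defaults) * FACE_VALUE

def summarize(correlated: bool) -> None:
    payouts = [pool_payout(correlated) for _ in range(N_TRIALS)]
    label = "correlated" if correlated else "independent"
    average = sum(payouts) / N_TRIALS
    print(f"{label:>11}: average payout ${average:,.0f}, worst trial ${min(payouts):,.0f}")

if __name__ == "__main__":
    random.seed(0)
    summarize(correlated=False)  # many small, survivable losses
    summarize(correlated=True)   # same average, but a rare total wipeout
```

Run it and the two pools come out with nearly identical average payouts, but the independent pool loses only a few mortgages in any one trial, while the correlated pool occasionally pays out nothing at all – the risk has not been reduced, only compressed into the rare catastrophe.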

One of the few people who saw this misjudging of risk as the inevitable cause of a financial crisis was Nassim Nicholas Taleb, who wrote that Wall Street had consistently ignored the possibility of what he called “Black Swans” and what Ellenberg described as an “improbable but hideous situation.” Taleb has placed a great deal of blame on the mathematical models used by the quants and on the hubris of the bankers and traders who believed that they were creating wealth when they were instead building an elaborate house of cards.

While running his own hedge fund in the 1990s, Taleb turned his knowledge of his own lack of knowledge – and others’ – into enormous profits. His strategy meant losing a little money on 364 days of the year and making a fortune on the one remaining day. He bet on market volatility, which he understood financial firms repeatedly underestimated.
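A rough sketch of that payoff structure – with dollar figures that are purely my own assumptions, not Taleb’s actual positions – shows why a strategy that loses almost every day can still be wildly profitable if rare, violent market moves are underpriced:

```python
# Stylized illustration, not Taleb's real trading book: pay a small option
# premium every trading day, and collect a large payoff only on the rare day
# the market moves far more than the sellers of those options expected.
DAILY_PREMIUM = 1_000    # assumed daily cost of out-of-the-money options
TAIL_PAYOFF = 500_000    # assumed payoff on a single extreme-move day
TRADING_DAYS = 365

def yearly_pnl(extreme_days: int) -> int:
    """Net profit for a year containing `extreme_days` days of violent moves."""
    return extreme_days * TAIL_PAYOFF - TRADING_DAYS * DAILY_PREMIUM

print(yearly_pnl(0))  # -365,000: a slow, steady bleed in a calm year
print(yearly_pnl(1))  #  135,000: one extreme day more than covers the year
```

Whether such a bet pays off depends entirely on whether extreme days arrive more often, or hit harder, than the market assumes – which is exactly the misjudgment Taleb accused the financial firms of making.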

For years, Taleb castigated Wall Street barons as they hubristically bet ever greater sums of money – making leveraged bets that the market would continue to rise.

Taleb’s key insight is that we know very little of the world itself – and will be more often fundamentally wrong than right. The example he uses is the Black Swan as described by David Hume:

No amount of observations of white swans can allow the inference that all swans are white, but the observation of a single black swan is sufficient to refute that conclusion.

This fundamental unknowability of the world must inform our actions, and perhaps points to some solutions. Taleb himself recently wrote a list of steps we should take to create a world more resistant to Black Swans. But his overall philosophy insists that we must attempt to resolve this crisis by tinkering with different solutions and seeing what works, while being mindful that our actions will inevitably have consequences we do not imagine. And remember – at any point a Black Swan could come around and reshape our world suddenly, as 9/11 did, as the assassination of Archduke Ferdinand did in starting World War I, as the invention of the personal computer did, and as this financial crisis has. The solution will not come from the determined application of fixed ideas, but from our openness to the possibility that we may be wrong, even as we are determined to act. We must see the shades of gray and acknowledge that we do not fully understand the world, yet still act – tinker, if you will.

In this, Taleb seems to have reached a philosophical end point similar to that of the famous libertarian economist Friedrich Hayek, who in his Nobel Prize speech explained that “we needed to think of the world more as gardeners tending a garden and less as architects trying to build some system.”

To tinker, to garden, to nudge – all of this points to a more modest liberalism, a market-state liberalism.

[Image courtesy of robokow licensed under Creative Commons.]